45 results for code generation
Abstract:
Formal methods and software testing are tools to obtain and control software quality. When used together, they provide mechanisms for software specification, verification, and error detection. Even though formal methods allow software to be mathematically verified, they are not enough to assure that a system is free of faults; thus, software testing techniques are necessary to complement the verification and validation of a system. Model-Based Testing techniques allow tests to be generated from other software artifacts such as specifications and abstract models. Using formal specifications as the basis for test creation, we can generate better-quality tests, because these specifications are usually precise and free of ambiguity. Fernanda Souza (2009) proposed a method to define test cases from B Method specifications. This method used information from the machine's invariant and the operation's precondition to define positive and negative test cases for an operation, using techniques based on equivalence class partitioning and boundary value analysis. However, the method proposed in 2009 was not automated and had conceptual deficiencies; for instance, it did not fit into a well-defined coverage criteria classification. We started our work with a case study that applied the method to an industrial B specification. This case study gave us the insights needed to improve the method. In our work we evolved the proposed method, rewriting it and adding characteristics to make it compatible with a test classification used by the community. We also improved the method to support specifications structured in different components, to use information from the operation's behavior in the test case generation process, and to use new coverage criteria. In addition, we implemented a tool to automate the method and applied it to more complex case studies.
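For illustration only (hypothetical code, not the thesis tool), a precondition that restricts an input to an interval such as `xx : 1..10` yields boundary-value and equivalence-class test data along these lines:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of boundary value analysis over an integer interval,
// as used when a B operation's precondition constrains an input to a range.
public class BoundaryValues {

    // For a precondition like "xx : lo..hi", classic boundary value analysis
    // picks values at and around both boundaries, plus an interior value.
    static List<Integer> positiveCases(int lo, int hi) {
        List<Integer> cases = new ArrayList<>();
        cases.add(lo);            // lower boundary
        cases.add(lo + 1);        // just above lower boundary
        cases.add((lo + hi) / 2); // a nominal interior value
        cases.add(hi - 1);        // just below upper boundary
        cases.add(hi);            // upper boundary
        return cases;
    }

    // Negative cases fall just outside the interval, exercising the
    // operation when its precondition is violated.
    static List<Integer> negativeCases(int lo, int hi) {
        return List.of(lo - 1, hi + 1);
    }

    public static void main(String[] args) {
        System.out.println("positive: " + positiveCases(1, 10)); // [1, 2, 5, 9, 10]
        System.out.println("negative: " + negativeCases(1, 10)); // [0, 11]
    }
}
```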
Abstract:
The game industry has lately been experiencing a consistent increase in the production costs of games. Part of this increase stems from the current trend of having bigger, more interactive, and more replayable environments. This trend translates into an increase in both team size and development time, which makes game development an even riskier investment and may reduce innovation in the area. As a possible solution to this problem, the scientific community is focusing on procedural content generation and, more specifically, on procedurally generated levels. Given the great diversity and complexity of games, most works choose to deal with a specific genre, platform games being one of the most studied. This work proposes a procedural level generation method for platform/adventure games, a considerably more complex genre than most classic platformers, which so far has not been the subject of study in other works. The level generation process was divided into two steps, planning and visual generation, respectively responsible for generating a compact representation of the level and determining its view. The planning stage was divided into game design and level design, and uses a goal-oriented process to output a set of rooms. The visual generation step receives the set of rooms and fills their interiors with the appropriate parts of previously authored geometry.
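As a rough sketch of the goal-oriented planning idea (hypothetical names, far simpler than the proposed method), a planner can work backwards from the goal room, prepending rooms that grant whatever the remaining path requires:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Minimal sketch of goal-oriented room planning for a platform/adventure
// level: start from the goal room and prepend rooms that satisfy each
// pending requirement (e.g., a key needed by a locked door).
public class LevelPlanner {

    record Room(String name, String grants, String requires) {}

    static List<Room> plan() {
        Deque<Room> level = new ArrayDeque<>();
        // The goal room requires a key, so planning must place a key room earlier.
        level.addFirst(new Room("goal", null, "key"));
        // Walk the pending requirements backwards, inserting provider rooms.
        String pending = "key";
        while (pending != null) {
            Room provider = new Room(pending + "-room", pending, null);
            level.addFirst(provider);
            pending = provider.requires(); // null here: the chain ends
        }
        level.addFirst(new Room("entrance", null, null));
        return List.copyOf(level);
    }

    public static void main(String[] args) {
        plan().forEach(r -> System.out.println(r.name()));
        // entrance -> key-room -> goal
    }
}
```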
Abstract:
Automation is an important activity of the testing process and can significantly reduce development time and cost. Some tools have been proposed to automate acceptance testing of Web applications. However, most of them have important limitations, such as the need for manual valuation of test cases, refactoring of the generated code, and strong dependency on the structure of the HTML pages. In this work, we present a test specification language and a tool designed to minimize the impact of these limitations. The proposed language supports equivalence class criteria, and the tool, developed as a plug-in for the Eclipse platform, allows the generation of test cases through different combination strategies. To evaluate the approach, we used one of the modules of the Sistema Unificado de Administração Pública (SUAP) of the Instituto Federal do Rio Grande do Norte (IFRN). The evaluation involved systems analysts and an IT technician who work as developers of the system.
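For example, one common combination strategy over equivalence classes is all-combinations (the cartesian product of class representatives); a minimal sketch, with hypothetical field names and not the plug-in's actual code, follows:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the all-combinations strategy: given one list of
// equivalence-class representatives per input field, emit every tuple.
public class AllCombinations {

    static List<List<String>> combine(List<List<String>> classesPerField) {
        List<List<String>> cases = new ArrayList<>();
        cases.add(new ArrayList<>());
        for (List<String> classes : classesPerField) {
            List<List<String>> next = new ArrayList<>();
            for (List<String> partial : cases) {
                for (String value : classes) {
                    List<String> extended = new ArrayList<>(partial);
                    extended.add(value);
                    next.add(extended);
                }
            }
            cases = next;
        }
        return cases;
    }

    public static void main(String[] args) {
        // Two fields: a valid/invalid login and a valid/invalid password.
        List<List<String>> fields = List.of(
                List.of("validUser", "unknownUser"),
                List.of("rightPass", "wrongPass"));
        combine(fields).forEach(System.out::println); // 4 test cases
    }
}
```

Other strategies (such as each-choice or pairwise) trade this exhaustive coverage for fewer generated cases.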
Abstract:
Exception Handling (EH) is a widely used mechanism for building robust systems, and the Software Product Line (SPL) context is no different. As EH mechanisms are embedded in most mainstream programming languages (such as Java, C#, and C++), we can find exception signalers and handlers spread over code assets associated with common and variable SPL features. When exception signalers and handlers are added to an SPL in an unplanned way, one possible consequence is the generation of faulty family instances (i.e., instances in which common or variable features signal exceptions that are mistakenly caught inside the system). In this context, some questions arise: how do exceptions flow between the optional and alternative features of an SPL? Aiming to answer these questions, this master's thesis conducted an exploratory study, based on code inspection and static code analysis, whose goal was to categorize the main ways in which exceptions flow in SPLs. To support the study, we developed a static analysis tool called PLEA (Product Line Exception Analyzer) that computes the exceptional flows of SPLs and categorizes these flows according to the features associated with handlers and signalers. Preliminary results showed that some types of exceptional flows have more potential to cause failures in the exceptional behavior of SPLs.
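A small Java illustration of the kind of flow PLEA categorizes (hypothetical code, not taken from the study): an exception signaled by an optional feature is swallowed by a broad handler in a common feature:

```java
// Minimal sketch of a cross-feature exceptional flow in an SPL:
// an optional feature signals an exception that a handler in the
// common (core) feature catches by accident via a broad catch block.
public class ExceptionFlow {

    // Code asset of an optional feature: signals a specific exception.
    static void optionalFeature() {
        throw new IllegalStateException("optional feature failed");
    }

    // Code asset of the common feature: a catch-all handler that
    // unintentionally swallows exceptions from variable features.
    static void commonFeature() {
        try {
            optionalFeature();
        } catch (RuntimeException e) { // broad handler: masks the real fault
            System.out.println("ignored: " + e.getMessage());
        }
    }

    public static void main(String[] args) {
        commonFeature(); // the product instance silently misbehaves
    }
}
```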
Abstract:
Typically, Web services contain only syntactic information that describes their interfaces. Due to the lack of semantic descriptions of Web services, service composition becomes a difficult task. To solve this problem, Web services can exploit ontologies for the semantic definition of the service's interface, thus facilitating the automation of discovery, publication, mediation, invocation, and composition of services. However, ontology languages, such as OWL-S, have constructs that are not easy to understand, even for Web developers, and the existing tools that support their use contain many details that make them difficult to manipulate. This work presents an MDD tool called AutoWebS (Automatic Generation of Semantic Web Services) to develop OWL-S semantic Web services. AutoWebS uses an approach based on UML profiles and model transformations for the automatic generation of Web services and their semantic descriptions. AutoWebS offers an environment that provides many features required to model, implement, compile, and deploy semantic Web services.
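As a toy illustration of model-to-text transformation (hypothetical and far simpler than AutoWebS; element names are placeholders in the spirit of the OWL-S process ontology), a modeled operation can be rendered into a semantic description by a template:

```java
// Toy sketch of a model-to-text transformation: a service operation
// modeled as plain data is rendered into an OWL-S-like RDF/XML fragment.
// The element names below are illustrative placeholders.
public class OwlSGenerator {

    record Operation(String name, String input, String output) {}

    static String toOwlS(Operation op) {
        return """
               <process:AtomicProcess rdf:ID="%s">
                 <process:hasInput rdf:resource="#%s"/>
                 <process:hasOutput rdf:resource="#%s"/>
               </process:AtomicProcess>
               """.formatted(op.name(), op.input(), op.output());
    }

    public static void main(String[] args) {
        System.out.print(toOwlS(new Operation("GetWeather", "City", "Forecast")));
    }
}
```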
Abstract:
One way to deal with the high complexity of current software systems is through self-adaptive systems. A self-adaptive system must be able to monitor itself and its environment, analyze the monitored data to determine the need for adaptation, decide how the adaptation will be performed, and, finally, make the necessary adjustments. One way to perform the adaptation of a system is to generate, at runtime, the process that will carry out the adaptation. One advantage of this approach is the possibility of taking into account features that can only be evaluated at runtime, such as the emergence of new components that allow new architectural arrangements not foreseen at design time. In this work, our main objective is to use a framework for the dynamic generation of processes to produce architectural adaptation plans in an OSGi environment. Our main interest is to evaluate how this framework for the dynamic generation of processes behaves in new environments.
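For instance, a minimal sketch (not the framework itself; assumes an OSGi runtime on the classpath) of how an adaptation planner can probe the service registry at runtime for newly available components:

```java
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceReference;

// Minimal sketch: list the services currently registered in the OSGi
// environment, a first step before generating an adaptation plan that
// may exploit components unknown at design time.
public class AdaptationProbe implements BundleActivator {

    @Override
    public void start(BundleContext context) throws Exception {
        // null/null means "all services, no LDAP filter".
        ServiceReference<?>[] refs = context.getAllServiceReferences(null, null);
        if (refs != null) {
            for (ServiceReference<?> ref : refs) {
                System.out.println("available: " + ref.getProperty("objectClass"));
            }
        }
    }

    @Override
    public void stop(BundleContext context) {}
}
```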
Abstract:
The work proposed by Cleverton Hentz (2010) presented an approach to define tests from the formal description of a program's input. Since some programs, such as compilers, may have their inputs formalized through grammars, it is common to use context-free grammars to specify the set of valid inputs. In the original work, the author developed a tool, LGen, that automatically generates tests for compilers. In the present work, we identify types of problems in various areas that are described using grammars, for example, the specification of software configurations, which are potential scenarios for using LGen. In addition, we conducted case studies with grammars from different domains, and from these studies it was possible to evaluate the behavior and performance of LGen during sentence generation, assessing aspects such as execution time, number of generated sentences, and satisfaction of the coverage criteria available in LGen.
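A minimal illustration of grammar-based sentence generation (not LGen itself): a context-free grammar expanded by recursively replacing nonterminals, with a depth bound so generation terminates:

```java
import java.util.List;
import java.util.Map;
import java.util.Random;

// Minimal sketch of sentence generation from a context-free grammar:
// nonterminals are expanded by randomly chosen productions, with a depth
// bound that falls back to the last (non-recursive) production.
public class SentenceGen {

    // expr -> expr "+" term | term ;  term -> "x" | "1"
    static final Map<String, List<List<String>>> GRAMMAR = Map.of(
            "expr", List.of(List.of("expr", "+", "term"), List.of("term")),
            "term", List.of(List.of("x"), List.of("1")));

    static final Random RNG = new Random(42);

    static String generate(String symbol, int depth) {
        List<List<String>> productions = GRAMMAR.get(symbol);
        if (productions == null) return symbol; // terminal symbol
        List<String> rhs = depth <= 0
                ? productions.get(productions.size() - 1) // force termination
                : productions.get(RNG.nextInt(productions.size()));
        StringBuilder out = new StringBuilder();
        for (String s : rhs) out.append(generate(s, depth - 1));
        return out.toString();
    }

    public static void main(String[] args) {
        for (int i = 0; i < 3; i++) {
            System.out.println(generate("expr", 4)); // e.g. "x+1", "1"
        }
    }
}
```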
Abstract:
Web services are computational solutions designed according to the principles of Service-Oriented Computing. Web services can be built upon pre-existing services available on the Internet by using composition languages. We propose a method to generate WS-BPEL processes from abstract specifications provided with high-level control-flow information. The proposed method allows the composition designer to concentrate on high-level specifications in order to increase productivity and generate specifications that are independent of specific Web services. We consider service orchestrations, that is, compositions in which a central process coordinates all the operations of the application. The process of generating compositions is based on a rule rewriting algorithm, which has been extended to support basic control-flow information. We created a prototype of the extended refinement method and performed experiments on simple case studies.
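As a toy sketch of rule rewriting in this spirit (hypothetical rules and names, far simpler than the actual refinement method), abstract activities can be rewritten into concrete invocations until a fixpoint is reached:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Toy sketch of a rewriting step: abstract activity names are repeatedly
// replaced by more concrete control-flow expressions until no rule applies,
// mimicking the refinement of an abstract composition into invocations.
public class Rewriter {

    // Each rule maps an abstract activity to a concrete expression.
    static final Map<String, String> RULES = new LinkedHashMap<>(Map.of(
            "ProcessOrder", "sequence(CheckStock, Charge)",
            "CheckStock", "invoke(stockService)",
            "Charge", "invoke(paymentService)"));

    static String rewrite(String expr) {
        boolean changed = true;
        while (changed) {
            changed = false;
            for (Map.Entry<String, String> rule : RULES.entrySet()) {
                String next = expr.replace(rule.getKey(), rule.getValue());
                if (!next.equals(expr)) {
                    expr = next;
                    changed = true;
                }
            }
        }
        return expr;
    }

    public static void main(String[] args) {
        System.out.println(rewrite("ProcessOrder"));
        // sequence(invoke(stockService), invoke(paymentService))
    }
}
```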
Abstract:
The aim of this work was to describe the methodological procedures required to perform 3D digital imaging of the external and internal geometry of reservoir analogue outcrops and to build a Virtual Outcrop Model (VOM). The external geometry was imaged using laser scanner, geodesic GPS, and total station procedures, while the internal geometry was imaged by GPR (Ground Penetrating Radar). The produced VOMs were then enriched with more detailed data through the addition of geological data and gamma-ray and permeability profiles. To exemplify the methodological procedures used in this work, two outcrops located in the eastern part of the Parnaíba Basin were selected for adapted VOMs. The first contains rocks from the aeolian deposits of the Piauí Formation (Neo-Carboniferous) and tidal-flat deposits of the Pedra de Fogo Formation (Permian), which appear in large outcrops located between Floriano and Teresina (Piauí). The second area, located in the Sete Cidades National Park, also in Piauí, presents rocks from the Cabeças Formation, deposited in fluvial-deltaic systems during the Late Devonian. From the data of the adapted VOMs it was possible to identify lines, surfaces, and 3D geometries and, therefore, to quantify the geometries of interest. Among the parameterization values obtained, the most relevant was a table containing the thickness and width of channel and lobe deposits measured at the Paredão and Biblioteca outcrops; this table can be used as input for stochastic reservoir simulation. An example of the direct use of such a table and the corresponding radargrams was the identification of bounding surfaces in the aeolian deposits of the Piauí Formation. Although radargrams supply only two-dimensional data, lines acquired along a mesh of profiles were used to add a third dimension to the imaging of the internal geometry; this held for all studied outcrops. In conclusion, the approach presented here can become a new methodology in which the advantages of digital imaging acquired with the laser scanner (precision, accuracy, and speed of acquisition) are combined with the total station procedure (precision) and the classical digital photomosaic technique.
Abstract:
The study of solar neutrinos is very important for a better comprehension of the set of nuclear reactions that occur inside the Sun and in solar-type stars. The flux of neutrinos provides a better comprehension of the stellar structure as a whole. In this dissertation we study the flux of neutrinos in a solar model, addressing neutrino oscillation, with the intention of determining and verifying its distribution from a statistical point of view, since this flux depends on the intrinsic velocity distributions of the particles in the stellar plasma. The main tool for this analysis was the Toulouse-Geneva Stellar Evolution Code (TGEC), which allows us to obtain the neutrino flux values per reaction and per layer inside the Sun, and to compare them with the observational results for the neutrino flux detected in experiments based on Cl37 (Homestake), Ga71 (SAGE, Gallex/GNO), and water (SNO). Our results show the final distribution of the neutrino flux as a function of depth using mass and radius coordinates. The dissertation also shows that the equations for this flux are present in TGEC.
Abstract:
Nowadays, as in past decades, the dumping of biodegradable organic waste in landfills is common practice in Brazil, as in most parts of the world. Nevertheless, due to its rapid decomposition and release of odors, this practice hampers the operation and implementation of a recycling system. These facts encouraged our research to find an efficient system for the management of organic waste, not only for the official workers responsible for managing these wastes, but also for non-governmental institutions. The Recycling for Life Community Association (ACREVI, Associação Comunitária Reciclando para a Vida), together with the municipal authorities of Mossoró-RN, Brazil, has assumed the social role of collecting and recycling solid waste produced by most of the local population. However, it was observed that the organic waste collected was not receiving any treatment. The present work aims to produce compost from mixed waste (green waste and organic household waste) and then perform chemical analysis of the material with a view to using the waste as organic fertilizer, with the objective of sharing the acquired knowledge in very simple language accessible to people with little formal education. The experiment was conducted at ACREVI, Mossoró (RN), and the compost was obtained following the windrow method, forming three conical cells (I, II, III), 1.6 m high and 2.0 m in diameter for cells I and II, and 1.0 m high and 2.0 m in diameter for cell III. The process was monitored through analyses of elemental CHN, cell temperature variation, humidity, pH, TKN, bulk density, nutrients, and heavy metals. The stabilized organic compost reached a C/N ratio of 10.4/1 in cell I and 10.4/1 in cell II, indicating a good soil conditioner with the potential to improve the physical properties of any soil and the pH of acid soils; cell III presented a C/N ratio of 26/1 at the end of the process, a high ratio that may be associated with the size of pile III, which altered the optimal conditions for the process to occur. The levels of heavy metals in the analyzed compost were lower than those established by SDA normative instruction No. 27, of 5 June 2006. The use of tree prunings and grass in small-scale composting generated a quality compost at the end of the process and also provided an important basis for the correct sizing of composting piles. Under the studied conditions, it is not advisable to use cells 1.00 m in height and 2.00 m in diameter, as these do not prevent the rapid dissipation of heat and thus cannot yield a good product at the end of composting. The composting process in the association's shed and the preparation of the instructional primer enabled the development of an alternative technology to generate income for the members of ACREVI.
Abstract:
This project was developed as a partnership between the Laboratory of Stratigraphical Analyses of the Geology Department of UFRN and the company Millennium Inorganic Chemicals Mineração Ltda. This company is located at the northern end of the Paraíba coast, in the municipal district of Mataraca. Millennium's main prospected products are heavy minerals, such as ilmenite, rutile, and zircon, present in the dune sands. These dunes are predominantly inactive and overlie the upper portion of the Barreiras Formation rocks. Mining is carried out with a dredge floating on an artificial lake over the dunes. The dredge removes dune sand from the lake bottom (after breaking down the lake borders with water jets) and directs it through piping to the concentration plant, where the minerals are then separated. The present work consisted of acquiring the external geometries of the dunes so that, in the end, a 3D static model of these sedimentary deposits could be built, with emphasis on the behavior of the structural top of the Barreiras Formation rocks (the lower limit of the deposit). Knowledge of this surface is important in the company's mining planning phase, because a calculation mistake could cause the dredge to work too close to this limit, at the risk of rock fragments obstructing the dredge and generating financial losses both in equipment repair and in days of halted production. During the field stages (carried out in 2006 and 2007), topographic surveys were performed with a total station and geodesic GPS, as well as shallow geophysical acquisitions with GPR (Ground Penetrating Radar). Almost 10.4 km of topography and 10 km of GPR profiles were acquired. The geodesic GPS was used for data geopositioning and for the topographic survey of a 630 m traverse line in the 2007 stage. GPR proved to be a reliable method: ecologically clean, fast in acquisition, and low-cost compared with traditional survey methods. The main advantage of this equipment is that it obtains continuous information on the upper surface of the Barreiras Formation rocks. The 3D static models were built from the acquired data using two specific 3D visualization software packages: GoCAD 2.0.8 and Datamine. 3D visualization allows a better understanding of the behavior of the Barreiras surface, makes it possible to perform several types of measurements, facilitates calculations, and allows the procedures used for mineral extraction to be carried out with greater safety.
Abstract:
The study area is located on the northern coast of Rio Grande do Norte State, comprising the mouth of the Açu-Piranhas River and including the cities of Porto do Mangue and Areia Branca. The local geological setting comprises Cretaceous, Tertiary, and Quaternary geological units of the Potiguar Basin. This is a region of high morphological instability due to the action of rigorous coastal dynamic processes, in addition to intense human activities, mainly the petroleum industry, salt farms, and shrimp-farming tanks. For this work, Landsat 5 TM and Landsat 7 ETM+ images from four distinct dates were used as the cartographic base, to which digital processing techniques were applied to produce thematic maps of the existing natural resources, supporting the geological and geomorphological characterization and the soil and land-use maps. The strategy applied was the interpretation of multitemporal images from aerial and orbital remote sensors, allied to ground-truth recognition and integrated through a Geographic Information System. These activities allowed the production of oil-spill sensitivity maps of the coast for the area, based on the Coastal Sensitivity Index. Taking the seasons into account, maps were created for distinct dates: July 2003 represents the winter months, which presented a lower sensitivity compared with December 2003; the greater sensitivity of the summer months follows from hydrodynamic data suggesting a lower capacity for natural cleaning of oil and its derivatives in case of a spill. These outcomes are an important and useful database to support risk assessment and decision-making in the face of an environmental disaster involving an oil spill in a coastal area, allowing a complete visualization of the area and identifying all of its portions with their environmental units and respective Coastal Sensitivity Index.
Abstract:
The northern portion of Rio Grande do Norte State is characterized by intense coastal dynamics affecting areas with ecosystems of moderate to high environmental sensitivity. The main socioeconomic activities of the state are installed in this region: the salt industry, shrimp farming, the fruit industry, and the oil industry. The oil industry suffers the effects of coastal dynamics, which cause problems such as erosion and the exposure of wells and pipelines along the shore. Hence the need to monitor such modifications, seeking to understand the changes that cause environmental impacts, in order to detect and assess the areas most vulnerable to these variations. Coastal areas under the influence of the oil industry are highly vulnerable and sensitive in case of accidents involving oil spills in their vicinity. Therefore, geoenvironmental monitoring of the region was established with the aim of evaluating the evolution of the entire coastal area and checking the sensitivity of the site to the presence of oil. The goal of this work was to implement a computer system that combines the insertion and visualization of thematic maps for the generation of environmental vulnerability maps, using Business Intelligence (BI) techniques on vector information previously stored in the database. The fundamental design interest was to implement a scalable system that serves diverse fields of study and is suitable for generating vulnerability maps online, automating the methodology so as to facilitate data manipulation and provide fast results for real-time operational decision-making. To develop the geographic database, it was necessary to generate the conceptual model of the selected data; the Web system was then developed using the PostgreSQL database system, its spatial extension PostGIS, the Glassfish web server, and GeoServer to display maps on the Web.
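As a minimal sketch of the kind of query such a stack relies on (table, columns, and credentials are illustrative placeholders), vector features stored in PostGIS can be fetched as GeoJSON for a web map viewer:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal sketch: read vulnerability polygons from PostGIS as GeoJSON.
// The table name, column names, and credentials are illustrative placeholders.
public class VulnerabilityQuery {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/geodb";
        try (Connection conn = DriverManager.getConnection(url, "user", "secret");
             Statement st = conn.createStatement();
             // ST_AsGeoJSON is a standard PostGIS function.
             ResultSet rs = st.executeQuery(
                     "SELECT vulnerability_class, ST_AsGeoJSON(geom) "
                     + "FROM vulnerability_map")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + ": " + rs.getString(2));
            }
        }
    }
}
```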
Abstract:
The 3D gravity modeling of the Potiguar rift basin consisted of the digital processing of gravity and aeromagnetic data, supported by the results of Euler deconvolution of gravity and magnetic data and by the interpretation of seismic lines and well descriptions. The gravity database is a compilation of independent geophysical surveys conducted by several universities, research institutions, and governmental agencies. The aeromagnetic data are from the Bacia Potiguar and Plataforma Continental do Nordeste projects, obtained from the Brazilian Petroleum Agency (ANP). The solutions of the Euler deconvolution allowed the analysis of the behavior of the rift's main limits, while the integrated interpretation of seismic lines provided the horizons delimiting the sedimentary formations and the basement top. The integration of these data enabled a 3D gravity model of the basement topography, allowing the identification of a series of internal structures of the Potiguar rift, as well as intra-basement structures without the gravity effect of the rift. The proposed inversion procedure for the gravity data made it possible to identify the main structural features of the Potiguar rift, elongated in the NE-SW direction, and its southern and eastern faulted edges, where the sedimentary infill reaches thicknesses of up to 5500 m. The southern boundary is marked by the Apodi and Baixa Grande faults; these faults seem to be a single NW-SE-oriented fault with a strong bend toward the NE-SW direction. In addition, the eastern boundary of the rift is conditioned by the NE-SW-trending Carnaubais fault system. NW-SE-oriented faults were also observed, which acted as transfer faults for the extensional stresses during basin formation. In the central part of the residual anomaly map without the gravity effect of the rift, a NW-SE-trending gravity high stands out, corresponding to the Orós-Jaguaribe belt lithotypes. We also observe a gravity maximum parallel to the Carnaubais fault system; this anomaly is aligned with the eastern limit of the rift and reflects the contact of different crustal blocks, bounded by the eastward continuation of the Portalegre Shear Zone.