84 results for dynamic software
Abstract:
Mainstream programming languages provide built-in exception handling mechanisms to support the robust and maintainable implementation of exception handling in software systems. Many modern languages, such as C#, Ruby, and Python, are often claimed to have more appropriate exception handling mechanisms: they reduce programming constraints on exception handling in order to favor agile changes in the source code. We call these maintenance-driven exception handling mechanisms. Their adoption is expected to improve software maintainability without hindering software robustness. However, there is still little empirical knowledge about the impact that adopting these mechanisms has on robustness. This work addresses that gap with an empirical study of the relationship between changes in C# programs and their robustness. In particular, we evaluated how changes in normal and exceptional code were related to exception handling faults, applying change impact analysis and control flow analysis to 100 versions of 16 C# programs. The results showed that: (i) most of the problems hindering robustness in those programs were caused by changes in the normal code; (ii) many potential faults were introduced even when the exception handling of the C# code was being improved; and (iii) faults were often facilitated by the maintenance-driven flexibility of the exception handling mechanism. Moreover, we present a series of change scenarios that decrease program robustness.
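The kind of fault the study describes — a change in the normal code silently breaking an existing handler — can be sketched in a few lines. This is an illustrative example only (the function names and scenario are invented, not taken from the study), written in Python, one of the maintenance-driven languages the abstract names:

```python
# Illustrative sketch (not from the study): a change in the "normal" code
# introduces an exception handling fault without touching any handler.

def parse_port(value):           # original normal code
    return int(value)            # raises only ValueError on bad input

def parse_port_v2(value):        # changed normal code: now reads a mapping,
    return int(value["port"])    # so KeyError and TypeError can also escape

def start_server(config):
    try:
        port = parse_port_v2(config)   # handler below was written for parse_port
    except ValueError:                 # KeyError/TypeError now bypass this clause:
        port = 8080                    # a fault caused by a normal-code change
    return port

print(start_server({"port": "90"}))    # the happy path still works
try:
    start_server({})                   # KeyError escapes the original handler
except KeyError:
    print("uncaught by the original handler")
```

Because the language imposes no checked-exception constraint, the compiler gives no warning that `start_server`'s handler no longer covers the exceptions its callee can raise — exactly the flexibility the study links to fault introduction.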
Abstract:
This work presents a process for generating User Interface (UI) prototypes for software whose platform is a Web browser. The process uses UI components that are more complex than plain HTML elements. To describe these components, the work proposes XICL (eXtensible User Interface Components Language), a language based on XML syntax for describing UI components and UIs. XICL promotes extensibility and reusability in the user interface development process. We have developed two compilers: the first compiles IMML (Interactive Message Modeling Language) code and generates XICL code; the second compiles XICL code and generates DHTML code.
Abstract:
This study analyzes the tourist information provided by the official websites of the 2014 FIFA World Cup host cities. The framework developed by Díaz (2005) was applied to analyze aspects such as local tourist information, distribution of tourist services, communication and interaction between website and users, and foreign-language versions of the websites. The dissertation describes how society and tourism are related by analyzing the consequences of technological evolution in the travel and tourism sector, showing the importance of information and communication technology in providing accurate, up-to-date, and low-cost information about tourist destinations. Given the nature of the study, the research subjects are the 12 Brazilian host cities, represented by their respective official webpages (cities, states, and convention bureaus), together with Brazil's official website, totaling 36 elements to be analyzed. The methodology is descriptive and exploratory with quantitative analysis, also drawing on desk research and a literature review. To analyze the collected data, parametric and nonparametric statistical tests were used: analysis of variance (ANOVA and Kruskal-Wallis) to compare means between groups, combined with multiple comparison tests (Tukey and Games-Howell); nonparametric correlation tests (Kendall's tau-b); and cluster analyses. Microsoft Excel was used to collect the data and SPSS to manage it through the quantitative tests. Overall, the websites of the southern region showed better results than those of the other Brazilian regions. Even so, the analysis demonstrated that the available tourist information is incomplete: the host cities' websites are unable to provide all the information web visitors need to organize and plan their journey, so visitors have to look for additional information in other sources.
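The between-group comparison at the core of this methodology — ANOVA across regions — reduces to a simple ratio of variances. A minimal pure-Python sketch, with invented website-quality scores (the dissertation's actual data and scales are not reproduced here):

```python
# Minimal one-way ANOVA F statistic in pure Python. Illustrative only:
# the scores below are invented, not data from the dissertation.

def anova_f(groups):
    """Return the one-way ANOVA F statistic for a list of samples."""
    k = len(groups)                           # number of groups (e.g. regions)
    n = sum(len(g) for g in groups)           # total observations
    grand = sum(sum(g) for g in groups) / n   # grand mean
    # between-group sum of squares (weighted by group size)
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # within-group sum of squares
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical website-quality scores for two regions:
south = [8.1, 7.9, 8.4]
northeast = [6.2, 6.8, 6.0]
print(round(anova_f([south, northeast]), 2))
```

A large F indicates that between-region variation dominates within-region variation; in practice one would use SPSS (as the study did) or a statistics library, which also supply the p-value and the follow-up multiple comparison tests.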
Abstract:
The main objective of this study is to assess the proportion of the costs of producing didactic material relative to the costs of tutoring and instruction incurred by a higher education institution, located in Brazil, that offers an undergraduate degree in education in the online distance learning (EaD) modality. The purpose of measuring this proportion was to determine whether these are, among others, the relevant costs incurred by the institution under study. This is exploratory research whose methodological procedures, regarding data collection, data analysis, and means of investigation, comprise a case study together with documentary and bibliographic research. The results indicate that the costs of producing didactic material and the costs of tutoring and instruction are relevant, but they are not the only ones considered as such: the costs of acquiring and operating specific software also constitute a relevant cost. The cost structure can change according to the period analyzed and to the synchronicity characteristics of the course offered. It was also found that the technology used in this educational modality ends up generating additional costs for professionals with specific technological expertise.
Abstract:
The mangrove is a coastal ecosystem of great ecological importance and high fragility in the face of natural processes and human interventions in the coastal zone. This research analyzes the relationship between the distribution of mangrove species and the variation of geochemical parameters of water and soil in the Apodi/Mossoró estuary, on the northern coastline of Rio Grande do Norte state. The results were obtained from floristic and structural analysis of the vegetation and from the interpretation of QuickBird satellite images (collected in 2006), processed with the ENVI 4.3 and ArcGIS 9.2 software. The estuary is characterized by a salinity gradient along roughly 40 km, with values between 50 and 90 g/L. A mixed vegetation formation was identified at the estuary mouth, where water salinity does not differ widely from that of seawater (36 g/L), comprising the species Rhizophora mangle L., Laguncularia racemosa (L.) C. F. Gaertn, Avicennia schaueriana Stapf & Leechman, and Avicennia germinans L. Along the estuary there is a fringe formation of vegetation composed of Avicennia spp. and L. racemosa. In the upper estuary, where salinity stays above 60 g/L, only A. germinans predominates, in dwarf form. Salinity thus acts as a limiting stress factor on the mangrove vegetation along the estuary, and this parameter should be taken into account when drawing up management and environmental restoration plans for the estuary in question.
Abstract:
At the beginning of the 21st century, Geology is moving along new paths that demand the ability to work with diverse information and new tools. Within this context, analog characterization is important for predicting and understanding lateral changes in geometry and facies distribution. In the present work, a methodology was developed for integrating geological and geophysical data from recent transitional deposits into the modeling of petroleum reservoirs, the calculation of volumes, and the assessment of the uncertainties associated with those volumes. For this purpose, planialtimetric and geophysical (Ground Penetrating Radar) surveys were carried out in three areas of the Parnaíba River. With this information, it was possible to visualize the overlap of different estuarine channels and to delimit the channel geometry (width and thickness). For three-dimensional visualization and modeling, two of the main reservoir modeling software packages were used. The studies were performed with the collected parameters and with data from two reservoirs. The first was built with Potiguar Basin well data available in the literature, corresponding to the Açu IV unit; in the second case, a real database from the North Sea was used. In the reservoir modeling procedures, different workflows were created, generating five study cases with their volume calculations. An analysis was then carried out to quantify the uncertainties in the geological modeling and their influence on the volume. This analysis was oriented toward testing the generating seed and the use of analog data in the construction of the models.
Abstract:
Diffusive processes are extremely common in Nature. Many complex systems, such as microbial colonies, colloidal aggregates, diffusing fluids, and migrating populations, involve a large number of similar units that form fractal structures. A new model of diffusive aggregation was recently proposed by Filoche and Sapoval [68]. Based on their work, we develop a model called Diffusion with Aggregation and Spontaneous Reorganization. The model consists of a set of particles with excluded-volume interactions that perform random walks on a square lattice. Initially, the lattice is occupied at density p = N/L² by particles placed at distinct, randomly chosen positions. One particle is selected at random as the active particle and executes a random walk until it visits a site occupied by another particle, j. When this happens, the active particle is rejected back to its previous position (neighboring particle j), and a new active particle is selected at random from the set of N particles. Following an initial transient, the system attains a stationary regime. In this work we study that regime, focusing on the scaling properties of the particle distribution as characterized by the pair correlation function ø(r), calculated by averaging over a long sequence of configurations generated in the stationary regime, using systems of size 50, 75, 100, 150, . . . , 700. The pair correlation function exhibits distinct behaviors in three different density ranges, which we term subcritical, critical, and supercritical. We show that in the subcritical regime the particle distribution is characterized by a fractal dimension. We also analyze the decay of temporal correlations.
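The dynamics described above can be sketched in a few lines of Python. This is a minimal illustration of the stated rules only; the lattice size, density, step count, and the use of periodic boundaries are assumptions, not parameters from the thesis:

```python
import random

# Sketch of the "Diffusion with Aggregation and Spontaneous Reorganization"
# dynamics: N particles on an L x L lattice, one active walker at a time.
# Periodic boundaries and all parameter values are illustrative assumptions.

def simulate(L=20, p=0.2, steps=5000, seed=1):
    rng = random.Random(seed)
    N = int(p * L * L)
    pos = rng.sample(range(L * L), N)            # distinct random positions
    occupied = set(pos)
    active = rng.randrange(N)                    # index of the active particle
    for _ in range(steps):
        i = pos[active]
        x, y = i % L, i // L
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        j = (x + dx) % L + ((y + dy) % L) * L    # neighbor site (periodic)
        if j in occupied:
            # step rejected: the walker stays at its previous position,
            # adjacent to particle j, and a new active particle is drawn
            active = rng.randrange(N)
        else:
            occupied.discard(i)                  # ordinary diffusion step
            occupied.add(j)
            pos[active] = j
    return pos

positions = simulate()
```

From long runs of such a simulation one would accumulate configurations in the stationary regime and histogram inter-particle distances to estimate the pair correlation function ø(r).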
Abstract:
This project was developed as a partnership between the Laboratory of Stratigraphic Analysis of the Geology Department of UFRN and the company Millennium Inorganic Chemicals Mineração Ltda. The company is located at the northern end of the Paraíba coast, in the municipality of Mataraca. Millennium's main prospected products are heavy minerals, such as ilmenite, rutile, and zircon, present in the dune sands. These dunes are predominantly inactive and overlie the upper portion of the Barreiras Formation rocks. Mining is carried out with a dredge floating on an artificial lake over the dunes. The dredge removes dune sand from the lake bottom (after dismantling the lake borders with water jets) and sends it through pipes to the concentration plant, where the minerals are separated. The present work consisted of acquiring the external geometries of the dunes so that, in the end, a 3D static model of these sedimentary deposits could be built, with emphasis on the behavior of the structural top of the Barreiras Formation rocks (the lower limit of the deposit). Knowledge of this surface is important for the company's mine planning, because a calculation mistake could bring the dredge too close to this limit, with the risk that rock fragments obstruct the dredge, causing financial losses both in equipment repair and in days of halted production. During the field campaigns (carried out in 2006 and 2007), topographic surveys were performed with a Total Station and geodetic GPS, as well as shallow geophysical acquisition with GPR (Ground Penetrating Radar). Almost 10.4 km of topography and 10 km of GPR profiles were acquired. The geodetic GPS was used for geopositioning the data and for the topographic survey of a 630 m traverse line in the 2007 campaign. GPR proved to be a reliable, ecologically clean method, with fast acquisition and low cost compared with traditional methods such as drilling. The main advantage of this equipment is that it yields continuous information about the upper surface of the Barreiras Formation rocks. The 3D static models were built from the acquired data using two specific 3D visualization software packages: GoCAD 2.0.8 and Datamine. The 3D visualization allows a better understanding of the behavior of the Barreiras surface and makes possible several types of measurements, supporting calculations and allowing the procedures used for mineral extraction to be carried out with greater safety.
Abstract:
The history matching procedure for an oil reservoir is of paramount importance for obtaining a characterization of the reservoir parameters (static and dynamic) that leads to more accurate production forecasts. Throughout this process, one seeks reservoir model parameters that are able to reproduce the behavior of the real reservoir; the resulting model can then be used to predict production and to aid oil field management. During history matching, the reservoir model parameters are modified and, for every new set of parameters found, a fluid flow simulation is run to evaluate whether or not the new set reproduces the observations from the actual reservoir. The reservoir is said to be matched when the discrepancies between the model predictions and the observations of the real reservoir fall below a certain tolerance. Determining the model parameters via history matching requires minimizing an objective function (the difference between observed and simulated production according to a chosen norm) in a parameter space populated by many local minima; in other words, more than one set of reservoir model parameters fits the observations. Because of this non-uniqueness of the solution, the inverse problem associated with history matching is ill-posed. To reduce the ambiguity, it is necessary to incorporate a priori information and constraints on the reservoir model parameters to be determined. In this dissertation, the inverse problem associated with history matching was regularized by introducing a smoothness constraint on two parameters, permeability and porosity. This constraint carries the geological bias that these two properties vary smoothly in space. It is therefore necessary to find the relative weight of this constraint in the objective function that stabilizes the inversion while introducing minimal bias. A sequential search method called COMPLEX was used to find the reservoir model parameters that best reproduce the observations of a semi-synthetic model; this method does not require derivatives when searching for the minimum of the objective function. We show that the judicious introduction of the smoothness constraint into the objective function reduces the associated ambiguity and introduces minimal bias in the estimates of permeability and porosity of the semi-synthetic reservoir model.
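The regularized objective described above has a simple general form: data misfit plus a weighted roughness penalty. A minimal Python sketch, where the toy "simulator", the 1D permeability profiles, and the weight are all invented for illustration (the thesis uses a full fluid flow simulator and 2D/3D property fields):

```python
# Illustrative sketch (not the thesis code): a history matching objective
# regularized by a first-difference smoothness penalty on permeability.

def objective(perm, observed, simulate, weight):
    """Data misfit ||simulate(perm) - observed||^2 plus weight * roughness."""
    predicted = simulate(perm)
    misfit = sum((p - o) ** 2 for p, o in zip(predicted, observed))
    # roughness: squared first differences penalize abrupt spatial variation
    rough = sum((perm[i + 1] - perm[i]) ** 2 for i in range(len(perm) - 1))
    return misfit + weight * rough

# Toy "simulator": production from each block proportional to its permeability.
simulate = lambda perm: [2.0 * k for k in perm]
observed = [200.0, 210.0, 190.0]

smooth = [100.0, 105.0, 95.0]     # geologically plausible, gentle variation
jagged = [100.0, 10.0, 190.0]     # abrupt variation, poor data fit as well
print(objective(smooth, observed, simulate, weight=1.0) <
      objective(jagged, observed, simulate, weight=1.0))
```

Tuning `weight` is the delicate step the dissertation addresses: too small and the ambiguity of the ill-posed misfit remains; too large and the estimates are biased toward artificially smooth fields. A derivative-free search such as COMPLEX can then minimize this objective directly.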