1000 results for Exact and Earth Sciences


Relevance:

80.00%

Publisher:

Abstract:

Component-Based Software Engineering (CBSE) and Service-Oriented Architecture (SOA) have become popular ways to develop software in recent years. During the life cycle of a software system, several components and services can be developed, evolved and replaced. In production environments, the replacement of core components, such as databases, is often a risky and delicate operation, in which several factors and stakeholders must be considered. A Service Level Agreement (SLA), according to ITIL v3's official glossary, is "an agreement between an IT service provider and a customer. The agreement consists of a set of measurable constraints that a service provider must guarantee to its customers." In practical terms, an SLA is a document that a service provider delivers to its consumers stating minimum quality of service (QoS) metrics. This work assesses and improves the use of SLAs to guide the transitioning of databases in production environments. In particular, we propose SLA-based guidelines and a process to support migrations from a relational database management system (RDBMS) to a NoSQL one. Our study is validated by case studies.
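
The minimum QoS metrics stated in an SLA can be checked mechanically during such a migration. A minimal sketch in Python, with hypothetical metric names and thresholds (none of them come from the study):

```python
# Illustrative SLA check: each metric maps to a comparator and a limit.
# Metric names and limits are invented for this sketch.
SLA = {
    "max_query_latency_ms": ("<=", 200.0),
    "min_availability_pct": (">=", 99.9),
    "max_error_rate_pct":   ("<=", 0.1),
}

def check_sla(observed: dict) -> list:
    """Return the list of (metric, observed, limit) tuples that violate the SLA."""
    violations = []
    for metric, (op, limit) in SLA.items():
        value = observed[metric]
        ok = value <= limit if op == "<=" else value >= limit
        if not ok:
            violations.append((metric, value, limit))
    return violations

# Hypothetical metrics observed on the candidate NoSQL system after migration.
after = {"max_query_latency_ms": 250.0,
         "min_availability_pct": 99.95,
         "max_error_rate_pct": 0.05}
print(check_sla(after))  # latency violates the 200 ms limit
```

Running such a check against metrics observed on the target system indicates whether the agreed service levels still hold after the transition.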

Relevance:

80.00%

Publisher:

Abstract:

The spread of wireless networks and the growing proliferation of mobile devices require the development of mobility control mechanisms to support different traffic demands under different network conditions. A major obstacle to developing this kind of technology is the complexity involved in handling all the information about the large number of Moving Objects (MO), as well as the signaling overhead required to manage these procedures in the network. Although several initiatives have been proposed by the scientific community to address this issue, they have not proved effective, since they depend on a specific request from the MO, which is responsible for triggering the mobility process. Moreover, they are often guided only by wireless-medium statistics, such as the Received Signal Strength Indicator (RSSI) of the candidate Point of Attachment (PoA). Thus, this work develops, evaluates and validates a sophisticated communication infrastructure for Wireless Networking for Moving Objects (WiNeMO) systems by exploiting the flexibility provided by the Software-Defined Networking (SDN) paradigm, in which network functions are easily and efficiently deployed by integrating the OpenFlow and IEEE 802.21 standards. For benchmarking purposes, the analysis covered both the control and data planes and demonstrates that the proposal significantly outperforms a typical IP-based SDN with QoS-enabled capabilities, allowing the network to handle multimedia traffic with optimal Quality of Service (QoS) transport and acceptable Quality of Experience (QoE) over time.
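
The RSSI-only limitation criticized above can be contrasted with a composite handover score that also weighs PoA load and packet loss. A toy sketch, where the weights, metrics and candidate data are invented for illustration and are not the work's actual decision logic:

```python
# Rank candidate Points of Attachment (PoA) by a composite score instead of
# RSSI alone. All weights, metric names and values are assumptions.
def score(poa: dict, w_rssi=0.4, w_load=0.3, w_loss=0.3) -> float:
    # Normalise RSSI from a typical [-90, -30] dBm range into [0, 1].
    rssi_norm = (poa["rssi_dbm"] + 90) / 60
    return (w_rssi * rssi_norm
            + w_load * (1 - poa["load"])        # prefer lightly loaded PoAs
            + w_loss * (1 - poa["loss_rate"]))  # prefer low packet loss

candidates = [
    {"id": "ap1", "rssi_dbm": -40, "load": 0.9, "loss_rate": 0.05},
    {"id": "ap2", "rssi_dbm": -60, "load": 0.2, "loss_rate": 0.01},
]
best = max(candidates, key=score)
print(best["id"])  # ap2 wins despite the weaker signal
```

The point of the sketch is that a strong-signal but congested PoA can lose to a lightly loaded one, which a pure-RSSI policy would never choose.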

Relevance:

80.00%

Publisher:

Abstract:

Multi-objective problems may have many optimal solutions, which together form the Pareto optimal set. A class of heuristic algorithms for such problems, in this work called optimizers, produces approximations of this optimal set. The approximation set kept by the optimizer may be limited or unlimited. The benefit of an unlimited archive is the guarantee that all nondominated solutions generated during the process will be saved. However, due to the large number of solutions that can be generated, keeping an archive and frequently comparing new solutions to the stored ones may demand a high computational cost. The alternative is a limited archive, which raises the need to discard nondominated solutions when the archive is full. Some techniques have been proposed to handle this problem, but investigations show that none of them can reliably prevent the deterioration of the archives. This work investigates a technique to be used together with ideas previously proposed in the literature to deal with limited archives. The technique consists of keeping discarded solutions in a secondary archive and periodically recycling them, bringing them back into the optimization. Three recycling methods are presented. To verify whether these ideas can improve the archive content during the optimization, they were implemented together with other techniques from the literature. A computational experiment with the NSGA-II, SPEA2, PAES, MOEA/D and NSGA-III algorithms, applied to many classes of problems, is presented. The potential and the difficulties of the proposed techniques are evaluated through statistical tests.
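
The secondary-archive recycling idea can be sketched as follows. The discard policy (random removal when full) and the FIFO recycling order are simplifications invented for this sketch, not the three methods the work actually presents:

```python
import random

def dominates(a, b):
    """Minimisation: a dominates b if no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

class RecyclingArchive:
    """Bounded nondominated archive; discarded solutions go to a secondary
    archive and can later be reinjected into the optimization."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.primary, self.secondary = [], []

    def add(self, sol):
        if any(dominates(p, sol) for p in self.primary):
            self.secondary.append(sol)          # rejected, but kept for recycling
            return
        dominated = [p for p in self.primary if dominates(sol, p)]
        self.primary = [p for p in self.primary if p not in dominated]
        self.secondary.extend(dominated)
        self.primary.append(sol)
        if len(self.primary) > self.capacity:   # archive full: discard one at random
            victim = self.primary.pop(random.randrange(len(self.primary)))
            self.secondary.append(victim)

    def recycle(self, k=1):
        """Bring up to k stored solutions back into the optimization."""
        back, self.secondary = self.secondary[:k], self.secondary[k:]
        return back

arc = RecyclingArchive(capacity=10)
for s in [(3, 3), (2, 2), (5, 1)]:
    arc.add(s)
# (3, 3) was dominated by (2, 2), moved to the secondary archive,
# and can now be brought back via recycle().
print(arc.primary, arc.recycle())
```

The design choice is that nothing is ever thrown away outright: every solution that leaves the primary archive remains available, so later recycling can repair deterioration caused by earlier discards.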

Relevance:

80.00%

Publisher:

Abstract:

Given the growing demand for mobile application development, driven by the increasingly common use of smartphones and tablets, there is a growing need for full remote data access in mobile applications, even in environments where network access is not available at all times. In this context, this work proposes a framework whose main functions are persistence, replication and data synchronization, covering the creation, deletion, update and display of persisted or requested data even when the mobile device has no network connectivity. From the point of view of architecture and programming practices, strategies were defined so that the main functions of the framework are fulfilled. A controlled study validated the proposed solution, showing gains in the number of lines of code and in the time required to develop an application, without a significant overhead for the operations.
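
The offline-first behaviour described above — operations applied to a local store and replayed when connectivity returns — can be sketched like this. Class and method names are illustrative, not the framework's actual API:

```python
# Offline-first sketch: every mutation hits the local store immediately and
# is also queued, so it can be replayed against the remote side later.
class LocalStore:
    def __init__(self):
        self.data, self.pending = {}, []

    def _log(self, op, key, value=None):
        self.pending.append({"op": op, "key": key, "value": value})

    def create(self, key, value):
        self.data[key] = value
        self._log("create", key, value)

    def update(self, key, value):
        self.data[key] = value
        self._log("update", key, value)

    def delete(self, key):
        self.data.pop(key, None)
        self._log("delete", key)

    def sync(self, send):
        """Replay queued operations in order once connectivity returns."""
        while self.pending:
            send(self.pending.pop(0))

store = LocalStore()
store.create("note:1", "draft")     # works with no connectivity
store.update("note:1", "final")
sent = []
store.sync(sent.append)             # later, replay to the remote side
print(len(sent))  # 2
```

A framework automating this pattern is what saves the application code the lines and development time the controlled study measured.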

Relevance:

80.00%

Publisher:

Abstract:

The Quadratic Minimum Spanning Tree (QMST) problem is a generalization of the Minimum Spanning Tree problem in which, in addition to linear costs associated with each edge, quadratic costs associated with each pair of edges must be considered. The quadratic costs arise from interaction costs between edges. When interactions occur between adjacent edges only, the problem is called the Adjacent Only Quadratic Minimum Spanning Tree (AQMST). Both QMST and AQMST are NP-hard and model a number of real-world applications involving infrastructure network design. Linear and quadratic costs are summed in the mono-objective versions of the problems. However, real-world applications often deal with conflicting objectives. In those cases, considering linear and quadratic costs separately is more appropriate, and multi-objective optimization provides a more realistic modelling. Exact and heuristic algorithms are investigated in this work for the Bi-objective Adjacent Only Quadratic Spanning Tree Problem. The following techniques are proposed: backtracking, branch-and-bound, Pareto Local Search, Greedy Randomized Adaptive Search Procedure, Simulated Annealing, NSGA-II, Transgenetic Algorithm, Particle Swarm Optimization and a hybridization of the Transgenetic Algorithm with the MOEA/D technique. Pareto-compliant quality indicators are used to compare the algorithms on a set of benchmark instances proposed in the literature.
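
The two objectives of the bi-objective AQMST can be evaluated for a candidate spanning tree as below; the tiny instance and its costs are made up for illustration:

```python
# Evaluate a spanning tree on the two AQMST objectives:
# (1) total linear edge cost, (2) total interaction cost between
# adjacent tree edges (edges sharing a vertex).
def tree_objectives(tree_edges, linear_cost, quad_cost):
    linear = sum(linear_cost[e] for e in tree_edges)
    quad = 0
    edges = list(tree_edges)
    for i in range(len(edges)):
        for j in range(i + 1, len(edges)):
            a, b = edges[i], edges[j]
            if set(a) & set(b):                 # adjacent-only interaction
                quad += quad_cost.get(frozenset((a, b)), 0)
    return linear, quad

# Toy 3-vertex instance: linear costs per edge, quadratic costs per edge pair.
lc = {(0, 1): 2, (1, 2): 3, (0, 2): 4}
qc = {frozenset({(0, 1), (1, 2)}): 5, frozenset({(0, 1), (0, 2)}): 1}
print(tree_objectives([(0, 1), (1, 2)], lc, qc))  # (5, 5)
```

Note that the two trees of this instance trade off against each other — cheap linear cost with expensive interactions, or the reverse — which is exactly why the bi-objective formulation keeps the costs separate.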

Relevance:

80.00%

Publisher:

Abstract:

Until recently, the use of biometrics was restricted to high-security environments and criminal-identification applications, for economic and technological reasons. In recent years, however, biometric authentication has become part of people's daily lives. The large-scale use of biometrics has shown that users within a system may have different degrees of accuracy: some people may have trouble authenticating, while others may be particularly vulnerable to imitation. Recent studies have investigated and identified these types of users, giving them animal names: Sheep, Goats, Lambs, Wolves, Doves, Chameleons, Worms and Phantoms. The aim of this study is to evaluate the existence of these user types in a fingerprint database and to propose a new way of investigating them, based on the verification performance between subjects' samples. After introducing some basic concepts of biometrics and fingerprints, we present the biometric menagerie and how to evaluate it.
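
A toy illustration of how per-user match-score averages can reveal menagerie types — here just Goats (users whose own samples match poorly) and Lambs (users easy to imitate). The scores and the 0.5 threshold are invented, not taken from the study's database:

```python
# Classify users from average match scores (higher score = stronger match).
# "genuine": user's samples compared against themselves.
# "impostor_as_target": other users' samples compared against this user.
from statistics import mean

genuine = {"u1": [0.9, 0.85], "u2": [0.4, 0.35], "u3": [0.88, 0.9]}
impostor_as_target = {"u1": [0.1, 0.2], "u2": [0.15, 0.1], "u3": [0.6, 0.7]}

goats = [u for u, s in genuine.items() if mean(s) < 0.5]          # hard to verify
lambs = [u for u, s in impostor_as_target.items() if mean(s) > 0.5]  # easy to imitate
print(goats, lambs)  # ['u2'] ['u3']
```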

Relevance:

80.00%

Publisher:

Abstract:

Cement pastes used in oil-well cementing operations are prepared according to the specific characteristics of each well. The physical properties required for each paste formulation depend on the temperature and pressure of the well to be cemented. The rheological properties of the paste are important control parameters for efficient transportation and positioning of the paste during the cementing operation. One of the main types of additive used to adjust the rheological properties of cement pastes is the dispersant. This work studies the influence of varying the time of addition of polycarboxylate (0, 5, 10 and 15 minutes) in cement pastes, considering the initial periods of hydration of the cement particles as the key point for better dispersant performance. Pastes were prepared with a density set at 15.6 lb/gal (1.87 g/cm³) and polycarboxylate concentrations ranging from 0.01 gpc to 0.05 gpc, at a bottomhole circulating temperature (BHCT) of 51 °C and a bottomhole static temperature (BHST) of 76 °C. The pastes were characterized through rheological measurements, filtered volume, thickening time and compressive strength of the formulations. X-ray Diffraction (XRD) and Scanning Electron Microscopy (SEM) tests were also carried out. The results showed that adding the polycarboxylate after 15 minutes decreased the values of the rheological parameters by 70%. According to the XRD and SEM results, adding the dispersant after 15 minutes did not affect the chemical reactions and the subsequent formation of cement hydration products. An economic feasibility study was also carried out to quantify the financial benefits of the technique: using it to reduce the production cost of the cement paste can save up to US$ 1,015.00 for every 100 barrels of paste produced with these formulations.

Relevance:

80.00%

Publisher:

Abstract:

The advance of drilling into deeper wells has required more thermostable materials. The use of synthetic fluids, which usually have good chemical stability, faces environmental constraints; besides, they usually generate more discharge and require a costly treatment for the disposal of drilled cuttings, which is often inefficient and requires mechanical components that hinder the operation. The adoption of aqueous fluids generally involves the use of chrome lignosulfonate as a dispersant, which provides stability of the rheological properties and fluid loss under high temperatures and pressures (HTHP). However, due to the environmental impact associated with chrome compounds, the drilling industry needs alternatives that maintain the integrity of these properties and ensure the success of the operation, given the strong influence of temperature on the viscosity of aqueous fluids and of the polymers used in this type of fluid, often polysaccharides, which are susceptible to hydrolysis and biological degradation. Therefore, vinyl polymers were selected for this study because they have a predominantly carbon backbone: in particular, polyvinylpyrrolidone (PVP) for resisting higher temperatures, and partially hydrolyzed polyacrylamide (PHPA) and clay for increasing the system's viscosity. Moreover, the absence of acetal bonds reduces their sensitivity to bacterial attack. To develop an aqueous drilling fluid system for HTHP applications using PVP, PHPA and clay as main constituents, fluid formulations were prepared and their rheological properties determined using a Fann rotary viscometer, with the filtrate volume obtained by HTHP filtration following the API 13B-2 standard. The new fluid system using high-molar-weight polyvinylpyrrolidone (PVP) showed higher viscosities, gel strengths and yield points, due to the clay-flocculating effect.
On the other hand, the low-molecular-weight PVP contributed to the formation of disperse systems with lower values of the rheological properties and fluid loss. Both systems are characterized by a thermal stability gain up to around 120 °C, keeping the rheological parameters stable. The results were further corroborated by linear clay swelling tests.

Relevance:

80.00%

Publisher:

Abstract:

The discussion about rift evolution in the Brazilian Equatorial margin during the South America-Africa breakup in the Jurassic/Cretaceous has been the focus of much research, but rift evolution based on the development and growth of faults has not been well explored. In this sense, we investigated the Cretaceous Potiguar Basin in the Equatorial margin of Brazil to understand the geometry of major faults and the influence of crustal heterogeneity and preexisting structural fabric on the evolution of the basin's internal architecture. Previous studies pointed out that the rift is an asymmetrical half-graben elongated along the NE-SW direction. We used 2D seismic data, well logs and 3D gravity modeling to analyze four major border-fault segments and determine their maximum displacement (Dmax) to length (L) ratio in the Potiguar Rift. We constrained the 3D gravity modeling with well data and the interpretation of seismic sections. The fault displacement measured in the gravity model differs by about 10% from the seismic and well data. The fault-growth curves allowed us to divide the faulted rift border into four main fault segments, which show roughly similar Dmax/L ratios, suggesting that a uniform regional tectonic mechanism influenced the growth of the rift fault segments. The variation of the displacements along the fault segments indicates that the segments formed independently during rift initiation and were later linked by hard and soft linkages, the latter forming relay ramps. In the interconnection zones the Dmax/L ratios are highest, due to the interference of fault-segment motions. We divided the evolution of the Potiguar Rift into five stages based on these ratios and correlated them with the major tectonic stages of the breakup between South America and Africa in the Early Cretaceous.
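
The Dmax/L ratio used to compare fault segments is a simple dimensionless quantity; a minimal sketch of the computation, with hypothetical segment values rather than the Potiguar Rift measurements:

```python
# Displacement-length (Dmax/L) ratios for fault segments.
# Segment names and values are illustrative, not the paper's data.
segments = {
    "S1": {"dmax_m": 1200.0, "length_km": 24.0},
    "S2": {"dmax_m": 900.0,  "length_km": 18.0},
}

def dmax_over_l(seg):
    # Convert length to metres so the ratio is dimensionless.
    return seg["dmax_m"] / (seg["length_km"] * 1000.0)

ratios = {name: round(dmax_over_l(s), 3) for name, s in segments.items()}
print(ratios)  # {'S1': 0.05, 'S2': 0.05}
```

Roughly equal ratios across independently measured segments are the kind of evidence the study uses to argue for a uniform regional tectonic mechanism.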