823 results for Centro de Ciências Exatas e da Engenharia
Abstract:
This thesis used four different methods to diagnose precipitation extremes over Northeastern Brazil (NEB): generalized linear models via logistic and Poisson regression, extreme value theory via the generalized extreme value (GEV) and generalized Pareto (GPD) distributions, and vectorial generalized linear models via the GEV (MVLG GEV). The logistic and Poisson regression models were used to identify interactions between precipitation extremes and other variables based on odds ratios and relative risks. Outgoing longwave radiation was found to be the indicator variable for the occurrence of extreme precipitation over eastern, northern and semi-arid NEB, while relative humidity was the indicator over southern NEB. The GEV and GPD distributions (the latter based on the 95th percentile) showed that the location and scale parameters present their maxima along the eastern and northern coasts of NEB; the GEV also identified a maximum core over western Pernambuco, influenced by weather systems and topography. For the GEV and GPD shape parameter, the data in most regions were fitted by the negative Weibull and Beta distributions (ξ < 0), respectively. The GEV (GPD) return levels and periods indicate that northern Maranhão (central Bahia) may experience at least one extreme precipitation event exceeding 160.9 mm/day (192.3 mm/day) within the next 30 years. The MVLG GEV model found that the zonal and meridional wind components, evaporation, and Atlantic and Pacific sea surface temperatures boost the precipitation extremes. Its GEV parameters show the following results: a) location (μ): the highest value was 88.26 ± 6.42 mm, over northern Maranhão; b) scale (σ): most regions showed positive values, except southern Maranhão; and c) shape (ξ): most of the selected regions were fitted by the negative Weibull distribution (ξ < 0). Southern Maranhão and southern Bahia showed the greatest accuracy.
Regarding return levels and periods, it was estimated that central Bahia may experience at least one extreme precipitation event equal to or exceeding 571.2 mm/day within the next 30 years.
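The GEV return levels quoted above follow from a standard closed-form expression in the fitted parameters. A minimal sketch, using only the location value reported above (μ = 88.26 mm for northern Maranhão); the scale and shape values here are hypothetical, chosen solely to illustrate the calculation:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level z_T for a GEV distribution with location mu,
    scale sigma and shape xi != 0: the level exceeded on average once
    every T years."""
    y = -math.log(1.0 - 1.0 / T)  # reduced variate at probability 1 - 1/T
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# Location from the abstract (northern Maranhao); sigma and xi below are
# illustrative assumptions, not the thesis's fitted values.
mu, sigma, xi = 88.26, 30.0, -0.1   # xi < 0: negative Weibull (bounded) tail
z30 = gev_return_level(mu, sigma, xi, T=30)
print(round(z30, 1))  # 30-year return level in mm/day
```

With these illustrative parameters the 30-year level lands near the order of magnitude reported in the abstract, which is the point of the exercise: the return level grows with σ and is bounded above when ξ < 0.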
Abstract:
This thesis presents a certification method for semantic web service compositions which aims to statically ensure their functional correctness. The certification method encompasses two dimensions of verification, termed the base and functional dimensions. The base dimension concerns the correct application of each semantic web service in the composition, i.e., it ensures that each service invocation in the composition complies with the respective service definition. The certification of this dimension exploits the semantic compatibility between the invocation arguments and the formal parameters of the semantic web service. The functional dimension aims to ensure that the composition satisfies a given specification expressed in the form of preconditions and postconditions. This dimension is formalized by a calculus based on Hoare logic. Partial correctness specifications involving compositions of semantic web services can be derived from the proposed deductive system. Our work is also characterized by the use of a fragment of description logic, namely ALC, to express the partial correctness specifications. In order to operationalize the proposed certification method, we developed a supporting environment for defining semantic web service compositions as well as for conducting the certification process. The certification method was experimentally evaluated by applying it to three different proofs of concept, which enabled a broad evaluation of the method.
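The base-dimension check described above — each invocation argument must be semantically compatible with the formal parameter of the invoked service — amounts to a concept-subsumption test. A minimal sketch over a toy concept hierarchy; the concept names and service signature below are hypothetical illustrations, not the thesis's actual ontology or formalism:

```python
# Base-dimension compatibility sketch: an invocation argument is
# compatible with a formal parameter if the argument's concept is
# subsumed by (i.e., is a subconcept of) the parameter's concept.
# The hierarchy and signatures are hypothetical examples.

SUBCLASS_OF = {              # child concept -> parent concept
    "CreditCard": "PaymentMethod",
    "PaymentMethod": "Thing",
    "City": "Location",
    "Location": "Thing",
}

def subsumed_by(concept, ancestor):
    """True if `concept` equals `ancestor` or is one of its descendants."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = SUBCLASS_OF.get(concept)
    return False

def check_invocation(formal_params, args):
    """Every argument concept must be subsumed by the concept of the
    corresponding formal parameter of the service definition."""
    return all(subsumed_by(a, p) for p, a in zip(formal_params, args))

# A hypothetical booking service expecting (Location, PaymentMethod):
print(check_invocation(["Location", "PaymentMethod"], ["City", "CreditCard"]))  # True
print(check_invocation(["Location"], ["CreditCard"]))                           # False
```

In a full ALC setting, subsumption would be decided by a description-logic reasoner rather than by walking a tree of named concepts, but the shape of the check is the same.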
Abstract:
Correct distance perception is important for executing various interactive tasks such as navigation, selection and manipulation. It is known, however, that there is, in general, a significant compression of perceived distance in virtual environments, mainly when using Head-Mounted Displays (HMDs). This perceived distance compression may cause various problems for applications and even negatively affect the utility of those applications that depend on correct distance judgment. The scientific community has, so far, not been able to determine the causes of the distance perception compression in virtual environments. For this reason, the objective of this work was to investigate, through experiments with users, the influence of both the field of view (FoV) and the distance estimation method on the perceived compression. To that end, an experimental comparison between the my3D device and an HMD was carried out with 32 participants, seeking information on the causes of the compressed perception. The results showed that the my3D has inferior capabilities compared to the HMD, resulting in worse estimates, on average, with both of the tested estimation methods. The causes are believed to be the incorrect stimulation of the user's peripheral vision, the smaller FoV and the weaker sense of immersion, as described by the participants of the experiment.
Abstract:
Data visualization is widely used to facilitate the comprehension of information and to find relationships between data. One of the most widely used techniques for visualizing multivariate data (4 or more variables) is the 2D scatterplot. This technique associates each data item with a visual mark in the following way: two variables are mapped to Cartesian coordinates, so that the mark can be placed on the Cartesian plane; the other variables are mapped progressively to visual properties of the mark, such as size, color and shape. As the number of variables to be visualized increases, the number of visual properties associated with the mark increases as well, and so does the complexity of the final visualization. However, increasing the complexity of the visualization does not necessarily imply a better visualization; sometimes it does the opposite, producing a visually polluted and confusing result. This problem is called visual-property overload. This work investigates whether it is possible to work around the overload of the visual channel and improve insight into multivariate data through a modification of the 2D scatterplot technique. In this modification, we map the variables of the data items to multisensory marks, composed not only of visual properties but also of haptic properties, such as vibration, viscosity and elastic resistance. We believed this approach could ease the insight process by transferring properties from the visual channel to the haptic channel. The hypothesis was verified through experiments, in which we analyzed (a) the accuracy of the answers; (b) the response time; and (c) the degree of personal satisfaction with the proposed approach. However, the hypothesis was not validated.
The results suggest that there is an equivalence between the investigated visual and haptic properties in all analyzed aspects, though in strictly numeric terms the multisensory visualization achieved better results in response time and personal satisfaction.
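The variable-to-mark mapping that the modified scatterplot generalizes can be sketched concretely. The normalization and the property ranges below are illustrative assumptions, not the thesis's implementation; a multisensory variant would route the last two channels to haptic properties (vibration, viscosity) instead of size and gray level:

```python
def normalize(values):
    """Scale a sequence of numbers linearly to [0, 1]."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [(v - lo) / span for v in values]

def scatter_marks(items):
    """Map 4-variable data items to 2D scatterplot marks:
    vars 0-1 -> Cartesian x, y; var 2 -> mark size; var 3 -> gray level."""
    cols = list(zip(*items))
    x, y = cols[0], cols[1]
    size = [5 + 20 * s for s in normalize(cols[2])]    # illustrative size range
    gray = [round(g, 2) for g in normalize(cols[3])]   # 0 = black, 1 = white
    return list(zip(x, y, size, gray))

data = [(1.0, 2.0, 10.0, 0.0), (3.0, 1.0, 30.0, 5.0), (2.0, 4.0, 20.0, 10.0)]
for mark in scatter_marks(data):
    print(mark)
```

The visual-property-overload problem appears exactly here: each additional variable needs one more channel in `scatter_marks`, and the channels compete for the reader's attention.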
Abstract:
Through the numerous technological advances of recent years, along with the popularization of computing devices, society is moving towards an "always connected" paradigm. Computer networks are everywhere, and the advent of IPv6 paves the way for the explosion of the Internet of Things. This concept enables the sharing of data between computing machines and everyday objects. One of the areas placed under the Internet of Things is vehicular networks. However, the information generated by an individual vehicle is small in volume and does not contribute to improving traffic, since that information is isolated. This proposal presents the Infostructure, a system intended to reduce the effort and cost of developing context-aware applications with high-level semantics for the Internet of Things scenario, by allowing data to be managed, stored and combined in order to generate broader context. To this end, we present a reference architecture that shows the major components of the Infostructure. We then present a prototype used to validate whether our work reaches the desired high-level semantic contextualization, as well as a performance evaluation of the behavior of the subsystem responsible for managing contextual information over a large amount of data. A statistical analysis is then performed on the results obtained in the evaluation. Finally, we present the conclusions of the work, some open problems, such as the lack of guarantees regarding the integrity of the sensory data arriving at the Infostructure, and future work, which includes the implementation of other modules so that tests can be conducted in real environments.
Abstract:
Cloud computing is a paradigm that enables access, in a simple and pervasive way, through the network, to shared and configurable computing resources. Such resources can be offered on demand to users in a pay-per-use model. With the advance of this paradigm, a single service offered by a cloud platform might not be enough to meet all of a client's requirements; hence, it becomes necessary to compose services provided by different cloud platforms. However, current cloud platforms are not implemented using common standards; each one has its own APIs and development tools, which is a barrier to composing services. In this context, Cloud Integrator, a service-oriented middleware platform, provides an environment to facilitate the development and execution of multi-cloud applications, i.e., compositions of services from different cloud platforms represented by abstract workflows. However, Cloud Integrator has some limitations: (i) applications are executed locally; (ii) users cannot specify an application in terms of its inputs and outputs; and (iii) experienced users cannot directly determine the concrete Web services that will perform the workflow. In order to deal with these limitations, this work proposes Cloud Stratus, a middleware platform that extends Cloud Integrator and offers different ways to specify an application: as an abstract workflow or as a complete or partial execution flow. The platform enables application deployment in cloud virtual machines, so that several users can access it through the Internet. It also supports the access and management of virtual machines in different cloud platforms and provides mechanisms for service monitoring and for the assessment of QoS parameters. Cloud Stratus was validated through a case study consisting of an application that uses services provided by different cloud platforms, and it was also evaluated through computational experiments that analyze the performance of its processes.
Abstract:
Digital image segmentation is the process of assigning distinct labels to different objects in a digital image, and the fuzzy segmentation algorithm has been used successfully to segment images from several modalities. However, the traditional fuzzy segmentation algorithm fails on objects characterized by textures whose patterns cannot be described by simple statistics computed over a very restricted area. In this work we present an extension of the fuzzy segmentation algorithm that achieves the segmentation of textures by employing adaptive affinity functions, and we also extend the algorithm to three-dimensional images. The adaptive affinity functions change the size of the area over which they compute the texture descriptors according to the characteristics of the texture being processed, while three-dimensional images are treated as a finite set of two-dimensional images. The algorithm thus segments the volume image with an appropriate calculation area for each texture, making it possible to produce good estimates of the actual volumes of the target structures of the segmentation process. We perform experiments with synthetic and real data in applications such as the segmentation of medical images obtained from magnetic resonance.
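The key idea above — affinity functions that enlarge their calculation area until the local texture statistics stabilize — can be sketched as follows. The stopping rule (mean stabilization) and the descriptors (window mean and standard deviation) are illustrative assumptions; the thesis's actual descriptors and fuzzy-connectedness machinery are richer than this:

```python
import math

def window_stats(image, r, c, half):
    """Mean and std. dev. over a (2*half+1)^2 window, clipped to the image."""
    vals = []
    for i in range(max(0, r - half), min(len(image), r + half + 1)):
        for j in range(max(0, c - half), min(len(image[0]), c + half + 1)):
            vals.append(image[i][j])
    m = sum(vals) / len(vals)
    sd = math.sqrt(sum((v - m) ** 2 for v in vals) / len(vals))
    return m, sd

def adaptive_stats(image, r, c, max_half=4, tol=0.05):
    """Grow the window until the mean stabilizes: the adaptive calculation area."""
    prev, _ = window_stats(image, r, c, 1)
    for half in range(2, max_half + 1):
        m, sd = window_stats(image, r, c, half)
        if abs(m - prev) <= tol * (abs(prev) + 1e-9):
            return m, sd, half
        prev = m
    return m, sd, max_half

def affinity(image, p, q):
    """Fuzzy affinity between pixels p and q: high when their adaptive
    texture statistics (mean, std. dev.) are similar."""
    mp, sp, _ = adaptive_stats(image, *p)
    mq, sq, _ = adaptive_stats(image, *q)
    return math.exp(-(abs(mp - mq) + abs(sp - sq)))

# Two-texture toy image: flat region (value 1) vs. checkered region (0/8).
img = [[1 if j < 5 else (8 if (i + j) % 2 else 0) for j in range(10)]
       for i in range(8)]
print(affinity(img, (4, 1), (4, 2)))  # prints 1.0: identical flat texture
print(affinity(img, (4, 1), (4, 7)))  # near 0: different textures
```

Note that a plain fixed-size affinity would rate neighboring checkered pixels as dissimilar (their raw values alternate between 0 and 8); computing statistics over an adapted area is what lets the checkered region read as one coherent texture.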
Abstract:
Compatibility testing between a drilling fluid and a cement slurry is one of the steps performed before an oil well cementing operation. This test evaluates the main effects that contamination between these two fluids may cause to the technological properties of a cement paste. Because of their different chemical compositions, interactions between the cement paste and the drilling fluid may affect the cement hydration reactions, damaging the cementing operation. Thus, we studied the compatibility of a non-aqueous drilling fluid and an additivated cement slurry. The preparation procedures for the non-aqueous drilling fluid and the cement paste, and the compatibility testing itself, were performed as set out by oil-industry standards. The compatibility test evaluates the rheological properties, thickening time, stability and compressive strength of the cement pastes. We also conducted scanning electron microscopy and X-ray diffraction analyses of the mixtures obtained in the compatibility test to determine the microstructural changes in the cement pastes. The compatibility test showed no visual changes in the properties of the cement paste, such as phase separation. However, after the addition of the non-aqueous drilling fluid to the cement slurry, there was an increase in plastic viscosity, yield point and gel strength. The major causative factors include the chemical reaction of components present in the non-aqueous drilling fluid, such as the primary emulsifier, the wetting agent and the paraffin oil, with the chemical constituents of the cement. There was a reduction in the compressive strength of the cement paste after mixing with this drilling fluid. The thickening test showed that the oil-wetting agent and the high salinity of the non-aqueous fluid accelerate the setting time of the cement paste. The stability of the cement paste is impaired as the contamination of the cement slurry with the non-aqueous fluid increases.
X-ray diffraction identified the formation of portlandite and calcium silicate in the contaminated samples. Scanning electron microscopy confirmed the development of the structures identified by X-ray diffraction and also found the presence of pores in the cured cement paste. The latter, formed by the stability of the drilling-fluid emulsion within the cement paste, corroborate the reduction of mechanical strength. The oil-wetting agent of the non-aqueous drilling fluid modified the cement hydration processes, mainly affecting the setting time.
Abstract:
Produced water is a major problem associated with crude oil extraction. The monitoring of metal levels in this waste is constant and requires sensitive analytical techniques; however, the determination of trace elements often requires a pre-concentration step. The objective of this study was to develop a simple and rapid analytical method for extraction and pre-concentration, based on the cloud point extraction phenomenon, for the determination of Cd, Pb and Tl in produced water samples by high-resolution continuum source atomic absorption spectrometry with graphite furnace atomization. A Box-Behnken design was used to obtain the optimal extraction condition for the analytes. The factors evaluated were the concentration of the complexing agent (ammonium O,O-diethyldithiophosphate, DDTP), the concentration of hydrochloric acid and the concentration of surfactant (Triton X-114). The optimal extraction conditions were: 0.6% m v-1 DDTP, 0.3 mol L-1 HCl and 0.2% m v-1 Triton X-114 for Pb; and 0.7% m v-1 DDTP, 0.8 mol L-1 HCl and 0.2% m v-1 Triton X-114 for Cd. For Tl, the best extraction occurred without DDTP, the conditions being 1.0 mol L-1 HCl and 1.0% m v-1 Triton X-114. The limits of detection of the proposed method were 0.005 µg L-1, 0.03 µg L-1 and 0.09 µg L-1 for Cd, Pb and Tl, respectively. Enrichment factors were greater than 10. The method was applied to produced water from the Potiguar basin; addition and recovery tests gave values between 81% and 120%, and the precision, expressed as relative standard deviation (RSD), was less than 5%.
Abstract:
In this study we evaluated the PAH removal capacity, from an oily solution, of bentonite hydrophobized with linseed oil or with paraffin, compared with natural bentonite. The natural and hydrophobized bentonites were characterized by: (1) thermogravimetric analysis (TGA), to evaluate the thermal events due to mass loss, both those associated with the release of moisture and decomposition of the clay and those due to loss of the hydrophobizing agent; (2) X-ray diffraction (XRD), to determine the mineralogical phases that make up the structure of the clay; and (3) infrared spectrophotometry, to characterize the functional groups of both the mineral matrix (bentonite) and the hydrophobizing agents (linseed oil and paraffin). We used a 2⁴ factorial design with the following factors: hydrophobizing agent, percentage of hydrophobizing agent, adsorption time and volume of the oily solution. Analysis of the 2⁴ factorial design showed that none of the factors was apparently more important than the others and, as all responses showed significant values for oil removal capacity, it was not possible to establish a difference in efficiency between the two hydrophobizing agents. A new study therefore compared the efficiency of the modified clay, with each hydrophobizing agent separately, against its natural form. For this, four new 2³ factorial designs were built using natural bentonite as the differentiating factor. The factors used were bentonite (with and without hydrophobization), exposure time of the adsorbent material to the oily solution and volume of the oily solution, in an attempt to interpret how these factors could influence the purification of water contaminated with PAHs.
Fluorescence spectroscopy was employed to obtain the responses, since it is known from the literature that PAHs, owing to their condensed aromatic rings, fluoresce in a quite similar way when excited in the ultraviolet region. Gas chromatography/mass spectrometry (GC-MS) was used as an auxiliary technique for the analysis of the PAHs, complementing the fluorescence study, since the spectroscopic method only gives an idea of the total number of fluorescent species contained in the oily solution. The results show excellent adsorption of PAHs and other fluorescent species, assigned to the main effect of the first factor, hydrophobization: 93% in the sixth run (+-+) of the first 2³ design (BNTL 5%); 94.5% in the fourth run (++-) of the second 2³ design (BNTL 10%); 91% in the second run (+--) of the third 2³ design (BNTP 5%); and 88% in the last run (+++) of the fourth and final 2³ design (BNTP 10%), all compared with adsorption on bentonite in its natural form. This work also shows the maximum adsorption for each hydrophobizing agent.
Abstract:
The preparation of nanostructured materials using natural clays as supports has been studied in the literature because clays are found in nature and, consequently, have a low price. Generally, clays serve as supports for metal oxides, increasing the number of active sites present on the surface, and can be applied for various purposes such as adsorption, catalysis and photocatalysis. Among the materials currently in the spotlight are niobium compounds, in particular its oxides, owing to characteristics such as high acidity, rigidity, water insolubility, and oxidative and photocatalytic properties. In this scenario, this study aimed at preparing a composite of niobium oxyhydroxide (NbO2OH) and sodium vermiculite clay and at evaluating its effectiveness with respect to the natural clay (V0) and to NbO2OH. The composite was prepared by a precipitation-deposition method and then characterized by X-ray diffraction (XRD), infrared spectroscopy (FTIR), energy-dispersive X-ray spectroscopy (EDS), thermal analysis (TG/DTG), scanning electron microscopy (SEM), N2 adsorption-desorption and investigation of the surface charge distribution. The application of the NbO2OH/V0 material was divided into two steps: first, oxidation and adsorption assays; second, photocatalytic activity under solar irradiation. The adsorption, oxidation and photocatalytic oxidation studies monitored the percentage of color removal of the methylene blue dye (MB) by UV-Vis spectroscopy. XRD showed a decrease in the d(001) reflection of the clay after modification; FTIR indicated the presence of both the clay and the niobium oxyhydroxide, with bands at 1003 cm-1, related to Si-O stretching, and at 800 cm-1, related to Nb-O stretching. The presence of niobium was also confirmed by EDS, which indicated 17% of the metal by mass. Thermal analysis showed thermal stability of the composite up to 217 °C, and the micrographs showed a decrease in particle size.
The investigation of the surface charge of NbO2OH/V0 found that the material exhibits a heterogeneous surface with regions of low and high negative charge. Adsorption tests showed that the NbO2OH/V0 composite had the higher adsorption capacity, removing 56% of the MB, while V0 removed only 13% and NbO2OH showed no adsorptive capacity, due to the formation of H-aggregates. The percentages of dye color removal in the oxidation tests differed little from adsorption, at 18% and 66% for V0 and NbO2OH/V0, respectively. The NbO2OH/V0 material shows excellent photocatalytic activity, removing 95.5% of the MB color in 180 minutes, compared with 41.4% for V0 and 82.2% for NbO2OH, proving the formation of a new composite with properties distinct from those of its precursors.
Abstract:
During the drilling of oil and natural gas wells, solid, liquid and gaseous wastes are generated. The solid fragments, known as cuttings, are carried to the surface by the drilling fluid. This fluid also serves to cool the drill bit and maintain the internal pressure of the well, among other functions. The solid residue is highly polluting because, besides incorporating the drilling fluid, which contains several chemical additives harmful to the environment, it carries heavy metals such as lead. To minimize the residue generated, numerous techniques are currently being studied to mitigate the problems this waste can cause to the environment, such as the addition of cuttings to soil-cement bricks for masonry construction, the addition of cuttings to the clay matrix for the manufacture of solid masonry bricks and ceramic blocks, and the co-processing of cuttings in cement. The main objective of this work is therefore the incorporation of oil-well drilling cuttings into the cement slurry used in the well cementing operation. The cuttings used in this study, from the Pendências formation, were milled and passed through a 100 mesh sieve. After grinding, they had a mean particle size on the order of 86 μm and a crystal structure containing quartz and calcite phases, characteristic of Portland cement. Cement slurries with a density of 13 lb/gal, containing different concentrations of cuttings, were formulated and prepared, and characterization tests were carried out according to API SPEC 10A and RP 10B. Free water tests showed values lower than 5.9%, and the rheological model that best described the behavior of the mixtures was the power law. The results of compressive strength (10.3 MPa) and stability (Δρ < 0.5 lb/gal) were within the limits set by the operational procedures. Thus, the cuttings from the drilling operation may be used together with Portland cement as binder in oil wells, in order to reuse this waste and reduce the cost of the cement paste.
Abstract:
The determination and monitoring of metallic contaminants in water must be continuous, which makes important the development, modification and optimization of analytical methodologies capable of determining the various metal contaminants in natural environments, because, in many cases, the available instrumentation does not provide enough sensitivity for the determination of trace levels. In this study, a method of extraction and pre-concentration using a microemulsion system in the Winsor II equilibrium was tested and optimized for the determination of Co, Cd, Pb, Tl, Cu and Ni by high-resolution continuum source atomic absorption spectrometry (HR-CS AAS). The graphite furnace temperature program (HR-CS GF AAS) was optimized through pyrolysis and atomization curves for the analytes Cd, Pb, Co and Tl, with and without different chemical modifiers. Cu and Ni were analyzed by flame atomization (HR-CS F AAS) after pre-concentration, with the sample introduction system optimized for discrete sampling. Salinity and pH were also analyzed as factors influencing the extraction efficiency; final values of 6 g L-1 of Na (as NaCl) and 1% HNO3 (v/v) were defined. For the determination of the optimum extraction point, a simplex-centroid statistical design was applied, and the following proportions were chosen as the optimum extraction point for all analytes: 70% aqueous phase, 10% oil phase and 20% co-surfactant/surfactant (C/S = 4). After extraction, the metals were determined, and the figures of merit obtained for the proposed method were: LODs of 0.09, 0.01, 0.06, 0.05, 0.6 and 1.5 μg L-1 for Pb, Cd, Tl, Co, Cu and Ni, respectively. Linear ranges of 0.1-2.0 μg L-1 for Pb, 0.01-2.0 μg L-1 for Cd, 1.0-20 μg L-1 for Tl, 0.1-5.0 μg L-1 for Co, 2-200 μg L-1 for Cu and 5-200 μg L-1 for Ni were obtained.
The enrichment factors obtained ranged between 6 and 19. Recovery tests with the certified sample showed recovery values (n = 3, certified values) after extraction of 105, 101, 100 and 104% for Pb, Cd, Cu and Ni, respectively. Samples of fresh water from Jiqui lake, saline water from the Potengi river and produced water from the oil industry (PETROBRAS) were spiked, and the recoveries (n = 3) for the analytes were between 80% and 112%, confirming that the proposed method can be used for the extraction. The proposed method enabled the separation of the metals from complex matrices, with a good pre-concentration factor, consistent with the maximum permitted values of CONAMA Resolution No. 357/2005, which regulates the quality of fresh, brackish and saline surface waters in Brazil.
Abstract:
Heavy metals are present in industrial waste and can generate a large environmental impact, contaminating water, soil and plants. The chemical action of heavy metals has attracted environmental interest. In this context, this study aimed to test the performance of electrochemical technologies for removing and quantifying heavy metals. Firstly, the electroanalytical technique of stripping voltammetry with a glassy carbon (GC) electrode was standardized in order to use this method for the quantification of the metals during their removal by the electrocoagulation (EC) process. Analytical curves were evaluated to obtain reliable determination and quantification of Cd2+ and Pb2+, separately or in a mixture. Meanwhile, the EC process was developed using an electrochemical cell in continuous flow (EFC) for removing Pb2+ and Cd2+. These experiments were performed using parallel Al plates 10 cm in diameter (63.5 cm2). The conditions for removing Pb2+ and Cd2+, dissolved in 2 L of solution at 151 L h-1, were optimized by applying different current values for 30 min. Cd2+ and Pb2+ concentrations were monitored during electrolysis by stripping voltammetry. The results showed that the removal of Pb2+ was effective with the EC process, reaching 98% in 30 min. This behavior depends on the applied current, which implies an increase in power consumption. The results also verified that the stripping voltammetry technique is quite reliable for determining the Pb2+ concentration when compared with measurements obtained by atomic absorption (AA). In view of this, the second objective of this study was to evaluate the removal of Cd2+ and Pb2+ (in a mixed solution) by EC. The increase in removal efficiency with current was confirmed when 93% of Cd2+ and 100% of Pb2+ were removed after 30 min.
The increase in current promotes the oxidation of the sacrificial electrodes and, consequently, an increased amount of coagulant, which influences the removal of the heavy metals in solution. Adsorptive stripping voltammetry is a fast, reliable, economical and simple way to determine Cd2+ and Pb2+ during their removal; it is more economical than the methods normally used, which require toxic and expensive reagents. Our results demonstrate the potential of electroanalytical techniques for monitoring the course of environmental interventions. Thus, the two techniques applied together can be a reliable way to monitor environmental impacts due to the pollution of aquatic ecosystems by heavy metals.
Abstract:
Within the Borborema Province, Northwestern Ceará (NC) is one of the most seismically active regions; there are reports of an earthquake that occurred in 1810 in the town of Granja. In January 2008, the seismic activity in NC increased and a seismographic network with 11 digital stations was deployed. In 2009 another earthquake sequence began, and a second network with 6 stations was deployed in the town of Santana do Acaraú. This thesis presents the results obtained by analyzing the data recorded by these two networks. The epicentral areas are located near the northeastern part of the Transbrasiliano Lineament, an NE-SW-trending shear zone that cuts the study area. The hypocenters are located between 1 km and 8 km depth. Strike-slip focal mechanisms, the predominant type in the Borborema Province, were found. An integration of seismological, geological and geophysical data shows that the seismogenic faults are oriented in the same direction as the local brittle structures observed in the field and as the magnetic lineaments. The SHmax (maximum horizontal compressional stress) direction in NC was estimated by inverting seven focal mechanisms: the maximum horizontal compressive stress (σ1 = 300°) is oriented NW-SE, the extension (σ3 = 210°) NE-SW, and σ2 is vertical. These results are consistent with those of previous studies. The seismic activity recorded in NC is not, for now, related to a possible reactivation of the Transbrasiliano Lineament.