9 results for Hyperbolic smoothing

at the Universidade Federal do Rio Grande do Norte (UFRN)


Relevance: 10.00%

Publisher:

Abstract:

Originally aimed at operational objectives, the continuous measurement of bottomhole pressure and temperature recorded by permanent downhole gauges (PDGs) finds vast applicability in reservoir management. It contributes to the monitoring of well performance and makes it possible to estimate reservoir parameters over the long term. However, notwithstanding its unquestionable value, PDG data are characterized by a large noise content; moreover, the presence of outliers within valid signal measurements is a major problem as well. In this work, the initial treatment of PDG signals is addressed, based on curve smoothing, self-organizing maps and the discrete wavelet transform. Additionally, a system based on the coupling of fuzzy clustering with feed-forward neural networks is proposed for transient detection. The results obtained were considered quite satisfactory for offshore wells and met actual requirements for field use.
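The first stage of the treatment described above, outlier suppression followed by curve smoothing, can be sketched on a synthetic pressure signal. This is an illustrative Python sketch, not the thesis's actual pipeline; the median-based despiking rule, the window sizes and the test signal are all assumptions:

```python
import numpy as np

def despike(signal, window=5, k=3.0):
    """Replace points far from the local median (a simple outlier rule)."""
    out = signal.copy()
    half = window // 2
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        med = np.median(signal[lo:hi])
        mad = np.median(np.abs(signal[lo:hi] - med)) + 1e-12
        if abs(signal[i] - med) > k * 1.4826 * mad:
            out[i] = med
    return out

def moving_average(signal, window=11):
    """Centered moving-average smoothing with proper edge normalization."""
    kernel = np.ones(window)
    num = np.convolve(signal, kernel, mode="same")
    den = np.convolve(np.ones_like(signal), kernel, mode="same")
    return num / den

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 500)
clean = 200.0 + 5.0 * np.exp(-t)             # idealized bottomhole pressure trend
noisy = clean + rng.normal(0.0, 0.5, t.size)
noisy[100] += 30.0                           # one injected outlier
smoothed = moving_average(despike(noisy))
```

In the thesis the cleaned signal would then feed the self-organizing map, wavelet and transient-detection stages; here only the generic cleanup step is shown.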

Relevance: 10.00%

Publisher:

Abstract:

One of the greatest challenges of demography today is to obtain consistent estimates of mortality, especially for small areas. The lack of this information hinders public health actions and impairs the quality of death classification, a concern for demographers and epidemiologists seeking reliable mortality statistics for the country. In this context, the objective of this work is to obtain death adjustment factors for the correction of adult mortality, by state, meso-region and age group, in the Northeast region of Brazil in 2010. The proposal is based on two lines of observation, one demographic and one statistical, and considers two levels of coverage within the states of the Northeast: the meso-regions as larger areas and the counties as small areas. The methodological principle is to use the General Growth Balance demographic method to correct the observed deaths in the larger areas (meso-regions) of the states, since these are less prone to violations of the method's assumptions. Next, an empirical Bayesian estimator is applied, taking as the total deaths of each meso-region the value corrected by the demographic method, and as the small-area observations the deaths recorded in the counties. This combination yields a smoothing effect on the degree of death coverage, due to the empirical Bayesian estimator, and makes it possible to evaluate the degree of coverage of deaths by age group at the county, meso-region and state levels, with the advantage of estimating adjustment factors at the desired level of aggregation. The results grouped by state point to a significant improvement in the degree of death coverage, with values above 80%: Alagoas (0.88), Bahia (0.90), Ceará (0.90), Maranhão (0.84), Paraíba (0.88), Pernambuco (0.93), Piauí (0.85), Rio Grande do Norte (0.89) and Sergipe (0.92). Advances in the control of registry information in the health system, together with improvements in socioeconomic conditions and in the urbanization of the counties over the last decade, have provided better-quality death registry information in small areas.
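The combination described above, a demographically corrected regional total acting as the reference for small-area estimates, can be illustrated with a toy empirical Bayes shrinkage. All counts below are invented, and the size-based shrinkage weights are a generic textbook choice, not the thesis's exact estimator:

```python
import numpy as np

# Hypothetical counts for one meso-region: observed deaths in 4 counties,
# expected deaths from population structure, and the meso-region total
# after correction by the General Growth Balance (GGB) method.
observed = np.array([40.0, 12.0, 7.0, 3.0])
expected = np.array([50.0, 15.0, 10.0, 5.0])
ggb_total = 88.0

# Regional coverage implied by the demographic correction.
prior = observed.sum() / ggb_total

# Empirical Bayes idea: shrink each county's raw coverage toward the
# regional level; larger counties (more expected deaths) shrink less.
raw = observed / expected
weight = expected / (expected + expected.mean())
coverage = weight * raw + (1 - weight) * prior
```

Each county's estimate lands between its raw coverage and the regional value, which is the smoothing effect the abstract refers to.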

Relevance: 10.00%

Publisher:

Abstract:

The present work aimed to apply experimental design to improve the separation efficiency of a new type of mixer-settler used to treat wastewater contaminated with oil. A laboratory-scale unit was installed at the Graduate Program in Chemical Engineering of UFRN, built in partnership with Petrobras S.A. The device, called Misturador-Decantador à Inversão de Fases (MDIF), combines features of a conventional mixer-settler and of a spray column. The equipment is composed of three main parts: a mixing chamber, a decantation chamber and a separation chamber. Separation efficiency is evaluated by comparing the oil concentrations in the water at the feed and at the outlet of the device, using the gravimetric oil and grease analysis method (TOG). The system under study is formation water emulsified with oil; the extractant is a mixture of turpentine-spirit hydrocarbons supplied by Petrobras. To optimize the separation efficiency of the equipment, a central composite experimental design was applied, having as its factorial portion a 2^(5-2) fractional factorial design, augmented with star points and five replications at the central point. The following independent variables were studied: oil content in the feed of the device, volumetric ratio (O/A), total flow rate, agitation in the mixing chamber and height of the organic bed. Minimum and maximum limits for these variables were fixed according to previous work. The analysis of variance for the empirical model equation revealed statistically significant results, useful for prediction purposes. The variance analysis also showed the errors to be normally distributed, and since the dispersions do not depend on the levels of the factors, the independence assumption could be verified. The model explains 98.98% of the variation around the average, equal to the maximum explainable value, with a fit to the experimental points of 0.98981. The results show a strong interaction between the oil content in the feed and the agitation in the mixing chamber, with a large positive influence on separation efficiency. Another variable with a large positive influence was the height of the organic bed. The best separation efficiencies were obtained at high flow rates combined with high oil concentrations and high agitation. The results of the present work showed excellent agreement with previous work on the phase-inversion mixer-settler.
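The structure of such a central composite design can be sketched in Python: a 2^(5-2) fractional factorial base, axial ("star") points and replicated center points. The generators D = AB and E = AC and the axial distance are assumptions for illustration; the thesis does not state which generators were used:

```python
import itertools
import numpy as np

# Factorial portion: full 2^3 in A, B, C, with D and E aliased
# to interactions (assumed generators D = AB, E = AC) -> 8 runs.
base = np.array(list(itertools.product([-1, 1], repeat=3)))
A, B, C = base[:, 0], base[:, 1], base[:, 2]
factorial = np.column_stack([A, B, C, A * B, A * C])

# Axial (star) points: one factor at +/- alpha, the rest at 0 -> 10 runs.
alpha = 2.0  # axial distance (assumed)
star = np.vstack([v * row for v in (-alpha, alpha) for row in np.eye(5)])

# Five replications at the central point, as in the abstract.
center = np.zeros((5, 5))

design = np.vstack([factorial, star, center])  # 8 + 10 + 5 = 23 runs
```

The five coded columns correspond to the five studied variables (feed oil content, O/A ratio, total flow rate, agitation and organic-bed height); the factorial columns are mutually orthogonal.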

Relevância:

10.00% 10.00%

Publicador:

Resumo:

The optimization and control of a chemical process are strongly correlated with the amount of information that can be obtained from the system. In biotechnological processes, where the transforming agent is a cell, many variables can interfere with the process, leading to changes in the microorganism's metabolism and affecting the quantity and quality of the final product. Therefore, continuous monitoring of the variables that affect the bioprocess is crucial in order to act on the system and keep it under the desired operational conditions and control. In general, during a fermentation process, the analysis of important parameters such as substrate, product and cell concentrations is done off-line, requiring sampling, pretreatment and analytical procedures; these steps demand significant run time and the use of high-purity chemical reagents. In order to implement a real-time monitoring system for a benchtop bioreactor, this study was conducted in two steps: (i) the development of software providing a communication interface between the bioreactor and a computer, based on the acquisition and recording of process variables, namely pH, temperature, dissolved oxygen, level, foam level, agitation frequency and the setpoints of the operational parameters of the bioreactor control unit; and (ii) the development of an analytical method using near-infrared spectroscopy (NIRS) to monitor substrate, product and cell concentrations during a fermentation process for ethanol production with the yeast Saccharomyces cerevisiae. Three fermentation runs (F1, F2 and F3) were monitored by NIRS, with subsequent sampling for analytical characterization. The data obtained were used for calibration and validation, applying pretreatments, combined or not with smoothing filters, to the spectral data. The most satisfactory results were obtained when the calibration models were constructed from real samples of culture medium taken from the fermentation assays F1, F2 and F3, showing that the NIRS-based analytical method can be used as a fast and effective way to quantify cell, substrate and product concentrations, which enables in situ real-time monitoring of fermentation processes.
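Step (i), the acquisition and recording of process variables, can be sketched as a minimal logging loop. The variable names and the read_variables stub below are hypothetical stand-ins for the real bioreactor interface described in the thesis:

```python
import csv
import io

# Process variables recorded by the acquisition software (from the abstract).
FIELDS = ["timestamp", "pH", "temperature", "dissolved_oxygen",
          "level", "foam_level", "agitation_rpm"]

def read_variables():
    """Stand-in for the real acquisition call; returns simulated values."""
    return {"pH": 5.0, "temperature": 30.0, "dissolved_oxygen": 60.0,
            "level": 1.2, "foam_level": 0.1, "agitation_rpm": 300.0}

def log_samples(stream, n_samples, read=read_variables):
    """Append one CSV record per acquisition cycle."""
    writer = csv.DictWriter(stream, fieldnames=FIELDS)
    writer.writeheader()
    for i in range(n_samples):
        writer.writerow({"timestamp": i, **read()})

buffer = io.StringIO()  # a file object would be used in practice
log_samples(buffer, n_samples=3)
```

In a real deployment the loop would poll the bioreactor's communication interface at a fixed interval and write to disk rather than to an in-memory buffer.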

Relevance: 10.00%

Publisher:

Abstract:

The aim of this research is to present non-Euclidean geometry as an anomaly, indicating its pedagogical implications, and then to propose a sequence of activities, divided into three blocks, which show the relationship between Euclidean and non-Euclidean geometry, taking the Euclidean case as the reference for analyzing the anomaly in the non-Euclidean ones. The work is tied to the PPGECNM research line of History, Philosophy and Sociology of Science in the Teaching of Natural Sciences and Mathematics. We discuss Euclid of Alexandria and his most famous work, the Elements, and emphasize Euclid's Fifth Postulate, particularly the difficulties (lasting several centuries) that mathematicians had in understanding it. In the nineteenth century, three mathematicians, Lobachevsky (1793-1856), Bolyai (1775-1856) and Gauss (1777-1855), became convinced that the postulate was independent of the others and that there was another (anomalous) geometry as consistent as Euclid's, though not fitting its parameters. The emergence of non-Euclidean geometry is attributed to these three. As for methodology, we start from bibliographical definitions of anomaly, characterize them so that our own definition is better understood by the reader, and only then deal with the non-Euclidean geometries (hyperbolic geometry, spherical geometry and taxicab geometry), confronting them with the Euclidean one in order to analyze the anomalies existing in the non-Euclidean geometries and observe their importance for teaching. After this characterization follows the empirical part of the proposal, which consisted of the application of three blocks of activities in search of the pedagogical implications of anomaly: the first on parallel lines, the second on the study of triangles and the third on the shortest distance between two points. These blocks offer work with basic elements of geometry through a historical and investigative study of non-Euclidean geometries as anomaly, so that each concept is understood together with its properties without necessarily being tied to the image of the geometric elements, thus extending or adapting to other frames of reference. For example, the block applied on the second day of activities extends the result for the sum of the internal angles of a triangle, showing that it is not always 180° (that conclusion can be drawn only when Euclidean geometry is the reference).
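Two of the activity blocks can be illustrated numerically: the third block's comparison of shortest distances (taxicab vs. Euclidean) and the second block's observation that a spherical triangle's angles need not sum to 180°:

```python
import math

# (1) Taxicab vs. Euclidean distance between the same two points.
# The "shortest distance" depends on which geometry is the reference.
p, q = (0.0, 0.0), (3.0, 4.0)
euclidean = math.hypot(q[0] - p[0], q[1] - p[1])   # straight-line distance
taxicab = abs(q[0] - p[0]) + abs(q[1] - p[1])      # grid-path distance

# (2) Angle sum of a spherical triangle: the octant triangle on the unit
# sphere (vertices on the x, y and z axes) has three right angles, so its
# angles sum to 270 degrees, not 180.
angle_sum_octant = 3 * 90.0
# By Girard's theorem, the excess over 180 degrees (in radians)
# equals the triangle's area on the unit sphere.
spherical_excess = math.radians(angle_sum_octant - 180.0)
```

Both computations make the anomaly concrete: the Euclidean results (distance 5, angle sum 180°) hold only when Euclidean geometry is the reference.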

Relevance: 10.00%

Publisher:

Abstract:

This dissertation investigates the book Ariel (1965), by Sylvia Plath, as a kind of performative and ritual poetry that fragments and reconstructs personal experience, manipulating the memory of the autobiographical body as a way to rehearse and restore subjectivity. We propose that, in Ariel, the hyperbolic, transcendent and parodic transfiguration of real episodes, used as literary substance, corrupts and subverts the specular idea of a confessional truth usually attached to the writer's work. Our objective is to examine signs of confluence between Sylvia Plath's poetry and performance art, departing from the idea that the spectacularization of the self, the exhibition of private rituals, the theatricalization of autobiographical circumstances and the undressing of one's madness and vulnerability are procedures common to the poet and the performer. Simultaneously unfolding between the inside and the outside of the poem, Sylvia Plath's real suicide and the death-and-rebirth rituals performed in the literary text appear as symbolic elements that may reveal the performer's liminal space, where reality and representation coexist, and where the performative testimony frames not only the real subject's body but also his/her infinite possibilities of being restored through art.

Relevance: 10.00%

Publisher:

Abstract:

The recent observational advances of astronomy and a more consistent theoretical framework have turned cosmology into one of the most exciting frontiers of contemporary science. In this thesis, homogeneous and inhomogeneous Universe models containing dark matter and different kinds of dark energy are confronted with recent observational data. Initially, we analyze constraints from the existence of old high-redshift objects, type Ia supernovae and the gas mass fraction of galaxy clusters for two distinct classes of homogeneous and isotropic models: decaying vacuum and X(z)CDM cosmologies. By considering the quasar APM 08279+5255 at z = 3.91, with age between 2 and 3 Gyr, we obtain 0.2 < ΩM < 0.4, while the β parameter, which quantifies the contribution of Λ(t), is restricted to the interval 0.07 < β < 0.32, thereby implying that the minimal age of the Universe amounts to 13.4 Gyr. A lower limit on the quasar formation redshift (zf > 5.11) was also obtained. Our analyses, including flat, closed and hyperbolic models, show that there is no age crisis for this kind of decaying Λ(t) scenario. Tests against SNe Ia and gas mass fraction data were performed for flat X(z)CDM models. For an equation of state ω(z) = ω0 + ω1 z, the best fit is ω0 = -1.25, ω1 = 1.3 and ΩM = 0.26, whereas for models with ω(z) = ω0 + ω1 z/(1+z) we obtain ω0 = -1.4, ω1 = 2.57 and ΩM = 0.26. In another line of development, we discuss the influence of the observed inhomogeneities by considering the Zeldovich-Kantowski-Dyer-Roeder (ZKDR) angular diameter distance. Applying the statistical χ² method to a sample of angular diameters of compact radio sources, the best fit to the cosmological parameters for XCDM models is ΩM = 0.26, ω = -1.03 and α = 0.9, where ω and α are the equation-of-state and smoothness parameters, respectively. Such results are compatible with a phantom energy component (ω < -1). 
The possible two-dimensional spaces associated with the plane (α, ΩM) were constrained using data from SNe Ia and the gas mass fraction of galaxy clusters. For supernovae the parameters are restricted to the intervals 0.32 < ΩM < 0.5 (2σ) and 0.32 < α < 1.0 (2σ), while for the gas mass fraction we find 0.18 < ΩM < 0.32 (2σ) with all allowed values of α. A joint analysis involving supernovae and gas mass fraction data yields 0.18 < ΩM < 0.38 (2σ). On general grounds, the present study suggests that the influence of cosmological inhomogeneities in the matter distribution needs to be considered in more detail in the analysis of observational tests. Further, the analytical treatment based on the ZKDR distance may give non-negligible corrections to the so-called background tests of FRW-type cosmologies.
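The two dark-energy equation-of-state parameterizations quoted above can be written out and evaluated at their best-fit values as a small numerical illustration (no re-analysis of the data is attempted):

```python
# Best-fit values are those quoted in the abstract for flat X(z)CDM models.

def w_linear(z, w0=-1.25, w1=1.3):
    """Linear parameterization: w(z) = w0 + w1*z (grows without bound)."""
    return w0 + w1 * z

def w_cpl(z, w0=-1.4, w1=2.57):
    """CPL-type parameterization: w(z) = w0 + w1*z/(1+z), bounded at high z."""
    return w0 + w1 * z / (1.0 + z)

today_linear = w_linear(0.0)   # w(0) = -1.25: phantom-like (w < -1) today
today_cpl = w_cpl(0.0)         # w(0) = -1.40: also phantom-like today
limit_cpl = w_cpl(1e9)         # approaches w0 + w1 = 1.17 as z -> infinity
```

Both best fits give w < -1 at the present epoch, consistent with the phantom-energy indication discussed in the abstract; the second form stays finite at high redshift, which is why it is often preferred.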

Relevance: 10.00%

Publisher:

Abstract:

The aim of this work is to derive the Ward identity for the low-energy effective theory of a fermionic system in the presence of a hyperbolic Fermi surface coupled to a U(1) gauge field in 2+1 dimensions. These identities are important because they establish requirements for the theory to be gauge invariant. We will see that the Ward identity (WI) of the model is not preserved at one-loop order. This feature signals the presence of a quantum anomaly; in other words, a classical symmetry is broken dynamically by quantum fluctuations. Furthermore, we consider that the system is close to a quantum phase transition, and in the vicinity of a quantum critical point the fermionic excitations near the Fermi surface decay through a Landau damping mechanism. All these ingredients need to be taken explicitly into account, which leads us to calculate vertex corrections as well as self-energy effects, yielding single-particle propagators with a non-trivial frequency dependence.

Relevance: 10.00%

Publisher:

Abstract:

In this work, calibration models were constructed to determine the total lipid and moisture content of powdered milk samples, using near-infrared diffuse reflectance spectroscopy combined with multivariate calibration. Initially, the spectral data were submitted to multiplicative scatter correction (MSC) and Savitzky-Golay smoothing. The samples were then divided into subgroups by hierarchical cluster analysis (HCA) with the Ward linkage criterion. This made it possible to build partial least squares (PLS) regression models for the calibration and prediction of total lipid and moisture content, based on the values obtained by the reference methods of Soxhlet extraction and oven drying at 105 °C, respectively. We conclude that NIR performed well for the quantification of powdered milk samples, mainly by minimizing analysis time, not destroying the samples and generating no waste. The prediction model for total lipids showed a correlation (R) of 0.9955 and an RMSEP of 0.8952, with an average error between the Soxhlet and NIR results of ±0.70%, while the prediction model for moisture showed a correlation (R) of 0.9184, an RMSEP of 0.3778 and an error of ±0.76%.
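The MSC pretreatment mentioned above can be sketched in a few lines of Python: each spectrum is regressed on the mean spectrum of the set, then corrected with the fitted intercept and slope. The synthetic spectra below are illustrative, not the thesis's milk data:

```python
import numpy as np

def msc(spectra):
    """Multiplicative scatter correction against the mean spectrum."""
    reference = spectra.mean(axis=0)
    corrected = np.empty_like(spectra)
    for i, x in enumerate(spectra):
        b, a = np.polyfit(reference, x, 1)  # slope b, intercept a
        corrected[i] = (x - a) / b          # undo additive and multiplicative scatter
    return corrected

# Synthetic "spectra": one base shape distorted by per-sample scatter
# (multiplicative slope b and additive offset a), as MSC assumes.
base = np.sin(np.linspace(0.0, 3.0, 200)) + 2.0
spectra = np.array([b * base + a for b, a in [(1.1, 0.2), (0.9, -0.1), (1.0, 0.0)]])
corrected = msc(spectra)
```

After correction, the purely scatter-distorted spectra collapse onto the reference, which is what makes subsequent PLS calibration less sensitive to particle-size and path-length effects.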