19 results for Exponential integrators

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

10.00%

Publisher:

Abstract:

Cerium oxide has high potential for use in removing pollutants after combustion, removing organic matter from wastewater, and in fuel-cell technology. Nickel oxide is an attractive material due to its excellent chemical stability and its optical, electrical, and magnetic properties. In this work, CeO2-NiO systems with metal:citric acid molar ratios of 1:1 (I), 1:2 (II), and 1:3 (III) were synthesized using the Pechini method. TG/DTG and DTA techniques were used to monitor the degradation of the organic matter up to the formation of the oxide. By thermogravimetric analysis and application of the dynamic method proposed by Coats-Redfern, it was possible to study the thermal decomposition reactions in order to propose the likely reaction mechanism and to determine kinetic parameters such as the activation energy (Ea), the pre-exponential factor, and the activation parameters. It was observed that both variables exert a significant influence on the formation of the polymeric precursor complex. The model that best fitted the experimental data in dynamic mode was R3, a nucleation-growth model in which the formed nuclei grow toward a continuous reaction interface with spherical symmetry (order 2/3). The activation enthalpy values showed that the reaction in the transition state is exothermic. The composition variables, together with the calcination temperature, were studied by different techniques such as XRD, IR, and SEM. A microstructural study was also carried out by the Rietveld method; the calculation routine was developed to run with the FullProf Suite package, and the profiles were analyzed with a pseudo-Voigt function. It was found that the metal:citric acid molar ratio in the CeO2-NiO systems (I), (II), and (III) strongly influences the microstructural properties, crystallite size, and lattice microstrain, and can be used to control these properties.
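
Since the Coats-Redfern treatment is central to the kinetic analysis described above, the following is a minimal sketch of how it can be applied for the R3 model; the heating rate and the conversion data are hypothetical, not taken from this work.

```python
# Coats-Redfern fit for the R3 (contracting-sphere) model, g(a) = 1 - (1-a)^(1/3):
# ln(g(a)/T^2) is regressed against 1/T; the slope gives -Ea/R.
# All data below (conversion, temperatures, heating rate) are hypothetical.
import numpy as np

R = 8.314              # gas constant, J mol^-1 K^-1
beta = 10.0 / 60.0     # assumed heating rate of 10 K/min, in K s^-1

T = np.array([500.0, 520.0, 540.0, 560.0, 580.0, 600.0])   # temperature, K
alpha = np.array([0.05, 0.12, 0.25, 0.45, 0.68, 0.88])     # conversion

g = 1.0 - (1.0 - alpha) ** (1.0 / 3.0)      # R3 integral form g(alpha)
slope, intercept = np.polyfit(1.0 / T, np.log(g / T**2), 1)

Ea = -slope * R                             # activation energy, J/mol
A = beta * Ea / R * np.exp(intercept)       # approximate pre-exponential factor, s^-1
print(f"Ea = {Ea / 1000:.1f} kJ/mol, A = {A:.3e} s^-1")
```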

Relevance:

10.00%

Publisher:

Abstract:

This work presents the analysis of a retaining wall designed for the basement of a residential building located in Natal/RN, consisting of a spaced pile wall anchored by tiebacks in sand. The structure was instrumented in order to measure the wall's horizontal movements and the load distribution along the anchor fixed length. The horizontal movements were measured with an inclinometer, and the loads in the anchors were measured with strain gages installed at three points along the anchor fixed length. Displacement measurements were taken right after each construction stage and right after the conclusion of the building, while the anchor loads were measured during the performance test, at lock-off, and right after the conclusion of the building. Wall velocity and acceleration data were obtained from the displacement data. It was found that the time elapsed before the bracing was installed was decisive for the magnitude of the displacements. The maximum horizontal displacement of the wall ranged between 0.18% and 0.66% of the final excavation depth. The anchor loads decreased strongly up to approximately half of the anchor fixed length, following an exponential distribution. Furthermore, a loss of load in the anchors was observed over time, reaching 50% in one of them.
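
As an illustration of the exponential load distribution mentioned above, the sketch below fits a decaying exponential to load measurements along the anchor fixed length; the positions, loads, and parameter values are hypothetical, not the instrumented data.

```python
# Exponential fit of the load transferred along the anchor fixed length,
# P(x) = P0 * exp(-k * x). Positions and loads below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

x = np.array([0.0, 2.0, 4.0, 6.0])          # distance along fixed length, m
P = np.array([400.0, 210.0, 95.0, 40.0])    # measured anchor load, kN

def load(x, P0, k):
    return P0 * np.exp(-k * x)

(P0, k), _ = curve_fit(load, x, P, p0=(400.0, 0.3))
print(f"P0 = {P0:.0f} kN, decay constant k = {k:.2f} 1/m")
```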

Relevance:

10.00%

Publisher:

Abstract:

The exponential growth in radio frequency (RF) applications is accompanied by great challenges, such as more efficient use of the spectrum and the design of new architectures for multi-standard receivers or software-defined radio (SDR). The key challenge in designing an SDR architecture is the implementation of a wide-band receiver that is reconfigurable, low cost, low power, highly integrated, and flexible. As a new solution for SDR design, a direct demodulator architecture based on five-port technology, or multi-port demodulator, has been proposed. However, the use of the five-port as a direct-conversion receiver requires an I/Q calibration (or regeneration) procedure in order to generate the in-phase (I) and quadrature (Q) components of the transmitted baseband signal. In this work, we evaluate the performance of a blind calibration technique based on independent component analysis, requiring no knowledge of training or pilot sequences of the transmitted signal, for I/Q regeneration in five-port downconversion, by exploiting the statistical properties of the three output signals.
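
As an illustration of the blind I/Q regeneration idea, the sketch below applies FastICA to three synthetic mixtures standing in for the five-port outputs; the mixing matrix, signal model, and noise level are hypothetical and only illustrate the statistical separation principle, not the five-port calibration procedure itself.

```python
# Blind separation sketch: three synthetic "five-port" outputs modeled as
# unknown linear mixtures of I and Q are separated with FastICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 5000
I = np.sign(rng.standard_normal(n))         # hypothetical baseband I (BPSK-like)
Q = np.sign(rng.standard_normal(n))         # hypothetical baseband Q
S = np.c_[I, Q]

A = np.array([[1.0, 0.3],                   # hypothetical mixing matrix
              [0.6, 1.1],
              [0.2, 0.9]])
X = S @ A.T + 0.01 * rng.standard_normal((n, 3))

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)                # estimated I/Q, up to scale and permutation
print(S_est.shape)                          # (5000, 2)
```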

Relevance:

10.00%

Publisher:

Abstract:

This work proposes a new phasor estimation technique for microprocessor-based numerical relays used in distance protection of transmission lines, based on the recursive least squares method and called modified random-walking least squares. Phasor estimation methods have their performance compromised mainly by the exponentially decaying DC component present in fault currents. To reduce the influence of the DC component, a morphological filter (MF) was added to the least squares method and applied prior to the phasor estimation process. The method was implemented in MATLAB and its performance compared with the one-cycle Fourier technique and with conventional phasor estimation, also based on least squares algorithms. The least-squares-based methods used for comparison with the proposed method were: recursive with forgetting factor, covariance resetting, and random walking. The performance analysis was carried out using synthetic signals and signals obtained from simulations in the Alternative Transients Program (ATP). Compared with the other phasor estimation methods, the proposed method showed satisfactory results in terms of estimation speed, steady-state oscillation, and overshoot. The method's performance was then analyzed under variations of the fault parameters (resistance, distance, incidence angle, and fault type), and the results did not show significant variations in performance. In addition, the apparent impedance trajectory and the estimated fault distance were analyzed, and the proposed method showed better results than the one-cycle Fourier algorithm.
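
To make the least-squares phasor idea concrete, here is a minimal batch sketch that fits cosine, sine, and a constant term to one cycle of a synthetic fault current; the frequencies, sampling rate, and DC offset are assumed for illustration, and this is not the thesis algorithm, which is recursive and uses a morphological filter.

```python
# One-window least-squares phasor estimate: cosine, sine and a constant term
# are fitted to one cycle of a synthetic fault current with a decaying DC offset.
import numpy as np

f0, fs = 60.0, 1920.0                 # assumed fundamental and sampling frequencies, Hz
N = int(fs / f0)                      # samples per cycle
t = np.arange(N) / fs

# hypothetical fault current: 10 A fundamental plus exponentially decaying DC
x = 10.0 * np.cos(2 * np.pi * f0 * t + 0.5) + 4.0 * np.exp(-t / 0.02)

H = np.c_[np.cos(2 * np.pi * f0 * t),
          np.sin(2 * np.pi * f0 * t),
          np.ones(N)]                 # design matrix: cosine, sine, DC columns
a, b, dc = np.linalg.lstsq(H, x, rcond=None)[0]

mag, phase = np.hypot(a, b), np.arctan2(-b, a)
print(f"|X| = {mag:.2f} A, angle = {np.degrees(phase):.1f} deg")
```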

Relevance:

10.00%

Publisher:

Abstract:

Static and cyclic tests are commonly used to characterize materials for structures. Cyclic tests assess the fatigue behavior of the material, yielding the S-N curves that are used to construct constant-life diagrams. However, when these diagrams are constructed from a small number of S-N curves, they underestimate or overestimate the actual behavior of the composite, so more tests are needed to obtain accurate results. In this context, a way of reducing costs is the statistical analysis of the fatigue behavior. The aim of this research was to evaluate the probabilistic fatigue behavior of composite materials. The research was conducted in three parts. The first part consists of associating the Weibull probability equation with the equations commonly used to model the S-N curves of composite materials, namely the exponential equation, the power law, and their generalizations. In the second part, the results obtained with the equation that best represents the probabilistic S-N curves were used to train a modular network at the 5% failure level. In the third part, a comparative study was carried out between the results obtained with the piecewise nonlinear model (PNL) and those of a modular network architecture (MN) in the analysis of fatigue behavior. For this, a database of ten materials taken from the literature was used to assess the generalization ability of the modular network as well as its robustness. The results showed that the generalized probabilistic power law better represents the probabilistic fatigue behavior of the composites and that, although the MN did not show robust generalization when trained at the 5% failure level, for mean values the MN showed more accurate results than the PNL model.
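
Since the power-law S-N model is central here, below is a minimal sketch of fitting a Basquin-type power law to fatigue data by log-log regression; the stress/life values are hypothetical, and the Weibull/probabilistic layer of the thesis is not reproduced.

```python
# Basquin-type power-law S-N fit, S = A * N**b, by linear regression in
# log-log space. The stress/life pairs below are hypothetical.
import numpy as np

N = np.array([1e3, 1e4, 1e5, 1e6, 1e7])            # cycles to failure
S = np.array([420.0, 340.0, 270.0, 215.0, 170.0])  # stress amplitude, MPa

b, logA = np.polyfit(np.log10(N), np.log10(S), 1)
A = 10.0 ** logA
print(f"S = {A:.1f} * N^{b:.3f}")
print(f"predicted S at 5e5 cycles ~ {A * 5e5 ** b:.0f} MPa")
```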

Relevance:

10.00%

Publisher:

Abstract:

Global warming due to greenhouse gas (GHG) emissions, especially CO2, has been identified as one of the major problems of the twenty-first century, considering the consequences it may have for the planet. Biological processes have been pointed out as a possible solution, especially CO2 biofixation associated with microalgal growth. This strategy has been emphasized because, in addition to CO2 mitigation, it yields biomass rich in compounds of high added value. Microalgae show high photosynthetic capacity and growth rates higher than those of higher plants, doubling their biomass in one day. Their cultivation is not seasonal; they grow in salt water and do not require irrigation, herbicides, or pesticides. The lipid content of these microorganisms, depending on the species, may range from 10 to 70% of their dry weight, reaching 90% under certain culture conditions. Studies indicate that the most effective method to increase lipid production in microalgae is to induce stress by limiting the nitrogen content of the culture medium. This evidence justifies continued research into the production of biofuels from microalgae. In this work, the strategy of increasing lipid production in the microalga I. galbana through programmed nutritional stress, by nitrogen limitation, was studied. The physiological responses of the microalgae, grown in f/2 medium with different nitrogen concentrations (N:P 15.0, control; N:P 5.0; and N:P 2.5), were monitored. During the exponential phase, the results showed no differences among the studied conditions. However, the cultures subjected to stress showed lower biomass yields in the stationary phase. There was an increase of 32.5% in carbohydrate content and 87.68% in lipid content at the N:P ratio of 5.0, and an average decrease of 65% in protein content at N:P ratios of 5.0 and 2.5. There were no significant variations in ash content, regardless of culture condition and growth phase. Despite the limited biomass production in the cultures with smaller N:P ratios, higher lipid accumulation and lipid yields were observed compared with the control culture. Given the increased lipid concentration associated with stress, this study suggests the use of the microalga Isochrysis galbana as an alternative raw material for biofuel production.

Relevance:

10.00%

Publisher:

Abstract:

This study aimed to evaluate the potential use of smectite clays for color removal from textile effluents. The experiments were performed using exploratory tests and full and fractional factorial designs, in which the factors and levels are predetermined. The smectite clays used came from the gypsum hub of the Araripe region (PE), and the dye used was Reactive Yellow BF-4G 200%. The smectite clay was collected and transported to the Soil Physics Laboratory of UFRPE, where it was prepared by air drying, lump breaking, and sieving before being submitted to the adsorption process. From the 2² full factorial design, color removal percentages of 96%, 96.5%, and 95.8% were obtained for the natural, chemically activated, and thermally activated clays, respectively, with adsorbed amounts of 4.80, 4.61, and 4.74 mg/g for the three clays. This showed that the activation processes used did not increase the adsorption capacity of the smectite clay. The adsorption data were best fitted by the Freundlich isotherm, which assumes an exponential distribution of active sites and stands out over the Langmuir equation for the adsorption of cations and anions by clays. The kinetic model that best fitted the results was the pseudo-second-order model. In the 2⁴⁻¹ fractional factorial study, at concentrations up to 500 mg/L, high color removal percentages (92.37%, 90.92%, and 93.40%) and adsorbed amounts (230.94, 227.31, and 233.50 mg/g) were obtained for the three clays. These data fitted well to both the Langmuir and Freundlich isotherms, and the kinetic model that best fitted the results was again the pseudo-second-order model.
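
To illustrate the isotherm fitting mentioned above, the sketch below fits the Freundlich model by linear regression in log-log space; the concentration/uptake values are hypothetical, not the measured data.

```python
# Freundlich isotherm fit, q = Kf * Ce**(1/n), by linear regression on
# log q versus log Ce. Concentrations and uptakes below are hypothetical.
import numpy as np

Ce = np.array([5.0, 20.0, 60.0, 150.0, 400.0])   # equilibrium concentration, mg/L
q = np.array([12.0, 35.0, 70.0, 120.0, 210.0])   # adsorbed amount, mg/g

slope, intercept = np.polyfit(np.log10(Ce), np.log10(q), 1)
Kf, n = 10.0 ** intercept, 1.0 / slope
print(f"Kf = {Kf:.2f} (mg/g)(L/mg)^(1/n), n = {n:.2f}")
```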

Relevance:

10.00%

Publisher:

Abstract:

The present work aims at understanding the drying process of shrimp cephalothorax in order to support industry in making use of this byproduct. To this end, the process conditions in a tray dryer and in a spouted bed were analyzed. Based on these results, a dryer with specific characteristics for drying the cephalothorax was designed and built. The desorption isotherms were obtained by the dynamic method at temperatures of 20, 35, and 50 °C and in the 10-90% relative humidity range. It was observed that the powdered product can be preserved with greater stability at relative humidities below 40%. The drying curves from the fixed-bed dryer were fitted with the single exponential, two-parameter exponential, and Page models. The two-parameter exponential model described all the drying conditions studied most adequately. The tests carried out in the spouted bed showed high drying rates for the material in paste form in dynamically active beds, demonstrating the need for feeding at shorter time intervals to increase the thermal efficiency of the process. Considering the results obtained, the designed dryer was a rotary dryer with an inert bed, co-current feed, discharge into a cyclone for gas-solid separation, and feeding at 2-minute intervals. The optimization of the designed equipment was accomplished using a 2⁴ full factorial experimental design, with air temperature and velocity, feed flow rate, and encapsulant (albumin) concentration as independent variables, and with thermal efficiency, moisture content of the powder obtained, total test time, and powder production efficiency at several points of the process as response variables. The results showed that the rotary dryer with an inert bed can also give good results if applied industrially.
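
As an illustration of the thin-layer model fitting described above, the sketch below fits the Page model to a moisture-ratio curve; the time and moisture-ratio values are hypothetical, and the two-parameter exponential model favored in the work is not shown.

```python
# Page thin-layer drying model, MR(t) = exp(-k * t**n), fitted to a
# hypothetical moisture-ratio curve.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0.0, 10.0, 20.0, 40.0, 60.0, 90.0, 120.0])    # drying time, min
MR = np.array([1.00, 0.72, 0.54, 0.33, 0.21, 0.11, 0.06])   # moisture ratio

def page(t, k, n):
    return np.exp(-k * t**n)

(k, n), _ = curve_fit(page, t, MR, p0=(0.05, 1.0), bounds=(0.0, np.inf))
print(f"k = {k:.4f} min^-n, n = {n:.2f}")
```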

Relevance:

10.00%

Publisher:

Abstract:

The exponential figure of Gregório de Matos e Guerra has been the subject of many theoretical discussions over the years, since his appearance before the public in the 19th century and, even more, during the 20th century, when he was recovered by the modernist avant-garde. As a result, there are still two antagonistic points of view on Gregório de Matos: on one side, researchers who defend him; on the other, those who attack him. The former say this poet from Bahia was the first literary voice in Brazil, grounded in the Baroque, while the latter say he is merely a plagiarist of the 17th-century Spanish poets, without a real contribution to the development of Brazilian literature. With this in mind, this thesis follows the perspective that this poet is a baroque anthropophagus, devouring cultures, with active participation in the formation of our cultural and literary identity. For that reason, a literature review of the poet's biography was carried out, trying to break with romantic descriptions and emphasizing facts that help present his baroque profile. In this sense, the history of literature was discussed with a focus on this creole poet, mainly based on historians' points of view about Gregorian poetry in the formation of the Brazilian literary scene. In defense of the hypothesis that Gregório de Matos was our first anthropophagus, this work analyzes how his poetry reveals the intrinsic characteristics of the Baroque and of Anthropophagy, focusing on its carnivalesque aspect, which shows the world, in a satirical tone, the idiosyncrasies of human life. The analysis of the Spanish-language corpus is the strength of this thesis because, besides being previously unpublished, it contributes to the comprehension of anthropophagy as a theoretical mechanism that explains the formation of our cultural and literary identity. Augusto de Campos (1968; 1978; 1984; 1986; 1988), Haroldo de Campos (1976; 2010a; 2010b; 2011), Severo Sarduy ([1988?]), Oswald de Andrade (1945; 1978; 2006), Mikhail Bakhtin (2010), Octavio Paz (1979), Segismundo Spina (1980; 1995; 2008), Afrânio Coutinho (1986a; 1986b; 1994), and Affonso Ávila (1994; 1997; 2004; 2008), among others, constitute this theoretical framework. Gregorian poetry has thus contributed to the formation of a baroque-anthropophagic scene within Brazilian borders, with special attention to the passage of time, because the poet belongs not only to the 17th century established by historiography: his work remains present today due to the contemporaneity of his themes, centered on the eternal doubts of baroque man.

Relevance:

10.00%

Publisher:

Abstract:

In the 20th century, acupuncture spread throughout the West as a complementary health care practice. This has motivated the international scientific community to invest in research that seeks to understand why acupuncture works. In this work we statistically compare the voltage fluctuations of bioelectric signals captured on the skin at an acupuncture point (IG 4) and at a nearby non-acupuncture point. The signals were acquired using an electronic interface with a computer, based on an instrumentation amplifier designed with specifications adequate for this purpose. For the signals collected from a sample of 30 volunteers, we calculated the main statistics and submitted them to a paired t-test at a significance level of α = 0.05. For the bioelectric signals we estimated the following parameters: standard deviation, skewness, and kurtosis. Moreover, we calculated the autocorrelation function and fitted it with an exponential curve; we observed that the signal from a non-acupoint decays more rapidly than that from an acupoint. This is indicative of the existence of information at the acupoint.
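
To make the autocorrelation analysis concrete, the sketch below estimates the autocorrelation of a synthetic correlated signal and fits an exponential decay exp(-t/τ) to extract a correlation time; the sampling rate, signal model, and lag window are assumptions, not the recorded data.

```python
# Autocorrelation of a synthetic correlated signal (first-order autoregressive
# noise) fitted with an exponential decay exp(-t/tau) to get a correlation time.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
fs, n = 1000.0, 5000                 # assumed sampling rate (Hz) and signal length
x = np.zeros(n)
for i in range(1, n):                # AR(1) process standing in for the bioelectric signal
    x[i] = 0.98 * x[i - 1] + rng.standard_normal()

x -= x.mean()
acf = np.correlate(x, x, mode="full")[n - 1:] / (np.arange(n, 0, -1) * x.var())
lags = np.arange(200) / fs           # first 200 lags, in seconds

(tau,), _ = curve_fit(lambda t, tau: np.exp(-t / tau), lags, acf[:200], p0=(0.05,))
print(f"correlation time tau ~ {tau * 1000:.1f} ms")
```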

Relevance:

10.00%

Publisher:

Abstract:

We investigate several diffusion equations that extend the usual one by considering nonlinear terms or a memory effect in the diffusive term. We also consider a space- and time-dependent diffusion coefficient. For these equations we obtain new classes of solutions and study their connection with anomalous diffusion processes. We start by considering a nonlinear diffusion equation with a space- and time-dependent diffusion coefficient. The solutions obtained for this case generalize the usual one and can be expressed in terms of the q-exponential and q-logarithm functions that appear in the context of generalized thermostatistics (Tsallis formalism). Next, a nonlinear external force is considered. For this case the solutions can also be expressed in terms of the q-exponential and q-logarithm functions. However, by a suitable choice of the nonlinear external force, we may obtain an exponential behavior, suggesting a connection with standard thermostatistics. This reveals that these solutions may present an anomalous relaxation process and then reach an equilibrium state of the Boltzmann-Gibbs kind. Next, we investigate a non-Markovian linear diffusion equation whose kernel leads to an anomalous diffusive process. In particular, our first choice leads both to the usual behavior and to the anomalous behavior obtained through a fractional-derivative equation. The results obtained in this context correspond to a change in the waiting-time distribution for jumps in the random walk formalism. These modifications had a direct influence on the solutions, which turned out to be expressed in terms of the Mittag-Leffler or Fox H functions. Accordingly, the second moment associated with these distributions exhibits an anomalous spreading, in contrast to the usual situation in which one finds a linear increase with time.
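
Since the q-exponential is the building block of the solutions mentioned above, here is a small sketch of this function and of its reduction to the ordinary exponential as q → 1; the parameter values are purely illustrative.

```python
# The q-exponential of the Tsallis (generalized thermostatistics) formalism;
# it reduces to the ordinary exponential as q -> 1.
import numpy as np

def q_exp(x, q):
    """e_q(x) = [1 + (1 - q) x]_+^(1/(1-q)); equals exp(x) for q == 1."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)   # cut-off [.]_+
    return base ** (1.0 / (1.0 - q))

x = np.linspace(-2.0, 2.0, 5)
print(q_exp(x, 0.5))                                 # compact-support case (q < 1)
print(np.max(np.abs(q_exp(x, 0.999) - np.exp(x))))   # approaches exp(x) as q -> 1
```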

Relevance:

10.00%

Publisher:

Abstract:

In this work, we study the survival cure rate model proposed by Yakovlev et al. (1993), based on a structure of competing risks concurring to cause the event of interest, and the approach proposed by Chen et al. (1999), in which covariates are introduced to model the amount of risk. We focus on the topic of covariates measured with error, considering the use of the corrected score method in order to obtain consistent estimators. A simulation study is carried out to evaluate the behavior of the estimators obtained by this method for finite samples. The simulation aims to identify the impact not only on the regression coefficients of the covariates measured with error (Mizoi et al. 2007) but also on the coefficients of the covariates measured without error. We also verify the adequacy of the piecewise exponential distribution for the cure rate model with measurement error. Finally, the models are applied to real data.
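
A minimal sketch of the promotion-time cure rate structure underlying the Yakovlev/Chen formulation follows, with an exponential latency time and hypothetical covariates and coefficients; the measurement-error correction itself is not reproduced.

```python
# Promotion-time cure rate structure: S_pop(t) = exp(-theta * F(t)), with
# theta = exp(x'beta) linking covariates to the amount of risk. The coefficients,
# covariate values and latency distribution below are hypothetical.
import numpy as np

beta = np.array([-0.5, 0.8])          # hypothetical regression coefficients
x = np.array([1.0, 1.2])              # intercept plus one covariate
lam = 0.3                             # rate of the exponential latency time

theta = np.exp(x @ beta)              # mean number of competing causes
t = np.linspace(0.0, 20.0, 5)
F = 1.0 - np.exp(-lam * t)            # c.d.f. of the latency time
S_pop = np.exp(-theta * F)            # improper population survival function

print(f"cure fraction exp(-theta) = {np.exp(-theta):.3f}")
print(S_pop)
```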

Relevance:

10.00%

Publisher:

Abstract:

In survival analysis, long-duration models allow for the estimation of the cure fraction, which represents the portion of the population immune to the event of interest. Here we address classical and Bayesian estimation based on mixture models and promotion time models, using different distributions (exponential, Weibull, and Pareto) to model the failure time. The database used to illustrate the implementations is described in Kersey et al. (1987) and consists of a group of leukemia patients who underwent a certain type of transplant. The specific implementations used were numerical optimization by BFGS as implemented in R (base::optim), Laplace approximation (own implementation), and Gibbs sampling as implemented in WinBUGS. We describe the main features of the models used, the estimation methods, and the computational aspects. We also discuss how different prior information can affect the Bayesian estimates.
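
For illustration, the sketch below fits a mixture cure model with exponential failure time by maximum likelihood on simulated right-censored data; it is a Python analogue (scipy's BFGS) of the R base::optim route mentioned above, with hypothetical parameter and censoring values.

```python
# Maximum-likelihood fit of a mixture cure model with exponential failure time,
# S_pop(t) = pi + (1 - pi) * exp(-lam * t), on simulated right-censored data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, pi_true, lam_true = 300, 0.3, 0.2
cured = rng.random(n) < pi_true
t_event = rng.exponential(1.0 / lam_true, n)
t_cens = rng.exponential(10.0, n)                    # hypothetical censoring times
t = np.where(cured, t_cens, np.minimum(t_event, t_cens))
delta = (~cured) & (t_event <= t_cens)               # event indicator

def neg_loglik(par):
    pi, lam = 1 / (1 + np.exp(-par[0])), np.exp(par[1])   # map to (0,1) and (0,inf)
    f = (1 - pi) * lam * np.exp(-lam * t)                 # improper density
    S = pi + (1 - pi) * np.exp(-lam * t)                  # improper survival
    return -np.sum(delta * np.log(f) + (1 - delta) * np.log(S))

res = minimize(neg_loglik, x0=[0.0, np.log(0.1)], method="BFGS")
pi_hat, lam_hat = 1 / (1 + np.exp(-res.x[0])), np.exp(res.x[1])
print(f"pi_hat = {pi_hat:.2f}, lambda_hat = {lam_hat:.2f}")
```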

Relevance:

10.00%

Publisher:

Abstract:

In recent years there has been much discussion about the environment and the plastic waste that is produced and discarded. In recent decades, research on obtaining fuel from plastic materials by catalytic degradation has become very attractive, since these wastes are discarded by the millions worldwide. These materials take a long time to degrade by so-called natural routes, and burning has not proved to be a viable alternative because of the toxic products formed during combustion, which could have serious consequences for public health and the environment. Therefore, chemical recycling is presented as a suitable alternative, especially because liquid fuel fractions can be obtained and directed to the petrochemical industry. This work aims to propose alternatives for the use of plastic waste in the production of light petrochemicals. Zeolites have been widely used in the study of this process due to their peculiar structural properties and high acidity. In this work, the catalytic degradation of high-density polyethylene (HDPE) in the presence of HZSM-12 zeolites with different acid-site concentrations was studied by thermogravimetry and by pyrolysis coupled with GC-MS. The catalyst samples were mixed with HDPE at a proportion of 50% by mass and submitted to thermogravimetric analyses at several heating rates. The addition of solids with different acid-site concentrations to HDPE produced a decrease in the degradation temperature of the polymer proportional to the acidity of the catalyst. These qualitative results were complemented by activation energy data obtained with the non-isothermal kinetic model proposed by Vyazovkin. The Ea values, when correlated with the surface acidity data of the catalysts, indicated an exponential decrease of the activation energy of the HDPE catalytic degradation reaction as a function of the acid-site concentration of the materials. These results indicate that the acidity of the catalyst added to the system is one of the most important properties in the catalytic degradation of polyethylene.
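
As an illustration of the Ea-versus-acidity correlation described above, the sketch below fits a decaying exponential of the form Ea(c) = E_inf + dE·exp(-c/c0) to activation-energy data; the functional form and all values are assumptions for illustration, not the thesis results.

```python
# Exponential decrease of the apparent activation energy with acid-site
# concentration, Ea(c) = E_inf + dE * exp(-c / c0). All values are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

c = np.array([0.0, 0.2, 0.4, 0.8, 1.2])          # acid-site concentration, mmol/g
Ea = np.array([250.0, 205.0, 175.0, 140.0, 125.0])  # apparent Ea, kJ/mol

def model(c, E_inf, dE, c0):
    return E_inf + dE * np.exp(-c / c0)

(E_inf, dE, c0), _ = curve_fit(model, c, Ea, p0=(120.0, 130.0, 0.5))
print(f"E_inf = {E_inf:.0f} kJ/mol, dE = {dE:.0f} kJ/mol, c0 = {c0:.2f} mmol/g")
```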

Relevance:

10.00%

Publisher:

Abstract:

Gels are soft materials with wide use in several fields, such as the pharmaceutical industry, food science, and coatings/textile applications. To obtain these materials, a gelation process, which can be physical (based on physical interactions) and/or chemical (based on covalent crosslinking), has to be carried out. In this work we used dynamic light scattering (DLS) and rheometry to monitor the covalent gelation of chitosan solutions by glutaraldehyde. Intensity correlation function (ICF) data were obtained from DLS, and the stretched exponential Kohlrausch-Williams-Watts (KWW) function was fitted to them. The parameters of the KWW equation, β, Γ, and C, were evaluated. These methods were effective in clarifying the sol-gel transition, marked by the emergence of non-ergodicity, and in determining the gelation window, observed at about 10-20 minutes. The dependence of the apparent viscosity on the reaction time was used to support the proposed discussion.
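
To illustrate the KWW fitting step, the sketch below fits a stretched-exponential form with baseline, C + (1 - C)·exp(-(Γt)^β), to a synthetic correlation decay; this parameterization and all values are assumptions for illustration and may differ from the exact form used in the work.

```python
# Stretched-exponential (KWW) fit to a synthetic DLS correlation decay.
import numpy as np
from scipy.optimize import curve_fit

def kww(t, C, Gamma, beta):
    # baseline C plus stretched-exponential decay
    return C + (1.0 - C) * np.exp(-(Gamma * t) ** beta)

t = np.logspace(-5, 0, 40)                       # lag time, s
rng = np.random.default_rng(2)
g = kww(t, 0.05, 800.0, 0.7) + 0.01 * rng.standard_normal(t.size)   # synthetic data

(C, Gamma, beta), _ = curve_fit(kww, t, g, p0=(0.1, 500.0, 0.8),
                                bounds=([0.0, 0.0, 0.0], [1.0, 1e5, 1.0]))
print(f"C = {C:.2f}, Gamma = {Gamma:.0f} 1/s, beta = {beta:.2f}")
```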