24 results for Exponential modeling
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
Random walk models with temporal correlation (i.e., memory) are of interest in the study of anomalous diffusion phenomena. The random walk and its generalizations occupy a prominent place in the characterization of various physical, chemical, and biological phenomena. Temporal correlation is an essential feature of anomalous diffusion models: models with long-range temporal correlation are non-Markovian, whereas their short-range counterparts are Markovian. Within this context, we review the existing models with temporal correlation, whether with full memory (the elephant walk model) or partial memory (the Alzheimer walk model and the walk model with a Gaussian memory profile). These models exhibit superdiffusion, with a Hurst exponent H > 1/2. In this work we study a superdiffusive random walk model with exponentially decaying memory. This seems to be a self-contradictory statement, since it is well known that random walks with exponentially decaying temporal correlations can be approximated arbitrarily well by Markov processes, and that central limit theorems forbid superdiffusion for Markovian walks with finite step-size variance. The resolution of the apparent paradox is that the model is genuinely non-Markovian, owing to a time-dependent decay constant in the exponential memory. We close with a discussion of ideas for future investigation.
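The full-memory elephant walk mentioned in the abstract can be sketched in a few lines. The simulation below is illustrative only (it is not the thesis's exponential-memory model); the memory parameter p and the seed are chosen arbitrarily:

```python
import random

def elephant_walk(n_steps, p=0.75, rng=None):
    """Elephant random walk: each new step copies a uniformly chosen
    earlier step with probability p, or reverses it with 1 - p."""
    rng = rng or random.Random(42)  # fixed seed for reproducibility
    steps = [1]                     # first step fixed to +1 by convention
    for _ in range(n_steps - 1):
        past = rng.choice(steps)    # full memory: any past step, uniformly
        steps.append(past if rng.random() < p else -past)
    # positions are cumulative sums of the steps
    pos, traj = 0, []
    for s in steps:
        pos += s
        traj.append(pos)
    return traj

traj = elephant_walk(5000, p=0.8)   # p > 3/4 is the superdiffusive regime
```

For p > 3/4 the mean-square displacement of this walk grows faster than linearly in time, which is the superdiffusive behavior (H > 1/2) the abstract refers to.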
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
REGIS, Josiana Florencio Vieira; CAMPOS, Ana Celia Cavalcanti Fernandes. O paradigma tecnológico e a revolução informacional: fundamentos da sociedade da informação. In: CONGRESSO INTERNACIONAL EM SISTEMAS DE INFORMAÇÃO E GESTÃO DA TECNOLOGIA, 6., 2009, São Paulo. Anais eletrônicos... São Paulo: FEA/USP, 2009. Oral presentation.
Abstract:
The civilizational model of global society is founded on large-scale production and on the exponential, diversified growth of consumption. This model impacts the environment, since it demands large quantities of natural resources and causes environmental contamination. Among these forms of contamination, the generation of solid waste stands out as one of the main ones, because its harmful effects are felt immediately by people. In countries such as Brazil, one of the solutions proposed to minimize or resolve the problems created by solid waste is the recycling of materials. The official justification for the recycling effort lies in the characteristics of the activity: the use of recycled materials reduces the demand for natural resources in industrial production processes, extends the useful life of sanitary landfills (the final destination of waste), and generates employment and income for waste pickers, people who survive by collecting and sorting recyclable materials. From the standpoint of environmental ethics, the question to be asked when analyzing the implications of waste generation is: why does global society generate solid waste at such a high rate? By contrast, in light of the market-driven assumptions of capitalism, the question that drives discussions of the solid-waste problem is: what should be done with the growing generation of solid waste? This article offers a reflection on the justifications behind this ode to recycling. In our view, recycling feeds what we call economic environmentalism, in which pro-recycling discourse appropriates the environmental features and potential of recycling in order to justify economically motivated decisions about what to do with the waste generated daily.
Abstract:
The objective is to analyze the relationship between risk and the number of stocks in a portfolio when an individual investor picks stocks by the "naive strategy". To this end, we carried out an experiment in which individuals selected stocks so that this relationship could be reproduced. 126 participants were told that the risk of the first choice would be the average of the standard deviations of all single-asset portfolios, and that the same procedure would be applied to portfolios of two, three, and so on, up to 30 stocks. They selected the assets they wanted in their portfolios without the support of financial analysis. For comparison, we also ran a hypothetical simulation of 126 investors who selected shares from the same universe by means of a random number generator; each real participant was thus paired with a random hypothetical investor facing the same opportunity set. Patterns were observed in the participants' portfolios, characterizing risk curves for subgroups of the sample. Because such groupings are somewhat arbitrary, a more objective measure of behavior was used: a simple linear regression for each participant, predicting portfolio variance as a function of the number of assets. In addition, we ran a pooled cross-sectional regression on all observations. The expected pattern holds on average but not for most individuals, many of whom effectively "de-diversify" when adding seemingly random securities. Furthermore, the results are slightly worse when a random number generator is used. This finding challenges the belief that only a small number of securities is needed for diversification, and shows that it applies only to large samples. The implications are important, since many individual investors hold few stocks in their portfolios.
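The naive-strategy experiment can be mimicked numerically. The sketch below uses made-up return parameters (250 observations, a 30-asset universe of i.i.d. Gaussian returns, none taken from the study) to build one simulated investor's risk-versus-size curve and fit the per-participant regression described above:

```python
import random
import statistics

random.seed(1)
T, N = 250, 30  # hypothetical: 250 return observations, 30-asset universe
returns = [[random.gauss(0.0005, 0.02) for _ in range(T)] for _ in range(N)]

def portfolio_std(asset_ids):
    """Standard deviation of an equally weighted portfolio of the assets."""
    port = [sum(returns[i][t] for i in asset_ids) / len(asset_ids)
            for t in range(T)]
    return statistics.pstdev(port)

# one simulated "investor": add one randomly chosen asset at a time
order = random.sample(range(N), N)
risks = [portfolio_std(order[:k]) for k in range(1, N + 1)]

# per-participant OLS: regress portfolio risk on portfolio size
xs = list(range(1, N + 1))
xbar, ybar = statistics.mean(xs), statistics.mean(risks)
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, risks)) / \
        sum((x - xbar) ** 2 for x in xs)
```

With independent assets the fitted slope is negative (risk falls as assets are added), which is the average pattern the abstract reports; individual human portfolios need not follow it.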
Abstract:
Cerium oxide has high potential for use in removing post-combustion pollutants, in removing organic matter from wastewater, and in fuel-cell technology. Nickel oxide is an attractive material owing to its excellent chemical stability and its optical, electrical, and magnetic properties. In this work, CeO2-NiO systems with metal:citric acid molar ratios of 1:1 (I), 1:2 (II), and 1:3 (III) were synthesized by the Pechini method. TG/DTG and DTA techniques were used to monitor the degradation of the organic matter up to the formation of the oxide. From the thermogravimetric analysis, applying the dynamic method proposed by Coats and Redfern, it was possible to study the thermal decomposition reactions in order to propose the likely reaction mechanism and to determine kinetic parameters such as the activation energy, Ea, the pre-exponential factor, and the activation parameters. Both variables were observed to exert a significant influence on the formation of the polymeric precursor complex. The model that best fitted the experimental data in dynamic mode was R3, a nuclear-growth model in which the nuclei grow towards a continuous reaction interface, assuming spherical symmetry (order 2/3). The activation enthalpy values of the system showed that the reaction in the transition state is exothermic. The composition variables, together with the calcination temperature, were studied by different techniques such as XRD, IR, and SEM. A microstructural study was also conducted by the Rietveld method; the calculation routine was developed to run in the FullProf Suite package, with profiles analyzed using a pseudo-Voigt function. It was found that the metal:citric acid molar ratio in the CeO2-NiO systems (I), (II), and (III) strongly influences the microstructural properties, crystallite size, and lattice microstrain, and can be used to control these properties.
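The Coats-Redfern method mentioned above plots ln(g(α)/T²) against 1/T; for the R3 (contracting-sphere) model, g(α) = 1 − (1 − α)^(1/3), and the slope of the line gives −Ea/R. A self-contained check on synthetic data, with an assumed activation energy and temperature range (not the thesis's measurements):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def g_R3(alpha):
    """Integral form g(alpha) of the R3 contracting-sphere model."""
    return 1.0 - (1.0 - alpha) ** (1.0 / 3.0)

# hypothetical synthetic run: conversions generated from an assumed
# Ea = 120 kJ/mol so the Coats-Redfern fit can be checked against it
Ea_true, C = 120e3, 10.0
temps = [550.0 + 5.0 * i for i in range(10)]  # K
alphas = [1.0 - (1.0 - T * T * math.exp(C - Ea_true / (R * T))) ** 3
          for T in temps]

# Coats-Redfern plot: ln(g(alpha)/T^2) vs 1/T is a line of slope -Ea/R
xs = [1.0 / T for T in temps]
ys = [math.log(g_R3(a) / (T * T)) for a, T in zip(alphas, temps)]
n = len(xs)
xb, yb = sum(xs) / n, sum(ys) / n
slope = sum((x - xb) * (y - yb) for x, y in zip(xs, ys)) / \
        sum((x - xb) ** 2 for x in xs)
Ea_fit = -slope * R  # recovered activation energy, J/mol
```

Because the synthetic conversions are exact, the fit recovers the assumed Ea; on real TG data the same regression yields Ea and (from the intercept) the pre-exponential factor.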
Abstract:
This work presents the analysis of a retaining wall designed for the basement of a residential building in Natal/RN: a spaced pile wall in sand, anchored by tiebacks. The structure was instrumented to measure the wall's horizontal movements and the load distribution along the anchor fixed length. The horizontal movements were measured with an inclinometer, and the loads in the anchors were measured with strain gauges installed at three points along the anchor fixed length. Displacement measurements were taken right after each construction stage and again after the conclusion of the building; anchor loads were measured during the performance test, at lock-off, and after the conclusion of the building. Velocity and acceleration data for the wall were derived from the displacement data. The time elapsed before brace installation was found to be decisive for the magnitude of the displacements. The maximum horizontal displacement of the wall ranged between 0.18% and 0.66% of the final excavation depth. The loads in the anchors dropped sharply up to approximately half the anchor fixed length, following an exponential distribution. Furthermore, the anchors lost load over time, with the loss reaching 50% in one of them.
Abstract:
This work proposes a new online algorithm for solving the k-server problem (KSP). Its performance is compared with that of other algorithms in the literature, namely the Harmonic and Work Function algorithms, which have been shown to be competitive and are therefore meaningful baselines. An algorithm that performs efficiently against them tends to be competitive as well, although this would of course have to be proved; such a proof, however, is beyond the scope of this work. The algorithm presented for solving the KSP is based on reinforcement learning techniques. The problem was modeled as a multi-stage decision process, to which the Q-Learning algorithm, one of the most popular methods for establishing optimal policies in this kind of decision problem, was applied. It should be noted, however, that the size of the storage structure used by reinforcement learning to obtain the optimal policy grows with the number of states and actions, which in turn is proportional to the number n of nodes and k of servers. Analysis of this growth shows that it is exponential, restricting the method to smaller problems, where the number of nodes and servers is small. This problem, known as the curse of dimensionality, was introduced by Bellman, and means that for certain problem instances an algorithm cannot be executed because the computational resources needed to produce its output are exhausted. To keep the proposed solution, based exclusively on reinforcement learning, from being restricted to small applications, an alternative solution is proposed for more realistic problems involving larger numbers of nodes and servers.
This alternative solution is hierarchical and combines two methods for solving the KSP: reinforcement learning, applied to a reduced number of nodes obtained by an aggregation process, and a greedy method, applied to the subsets of nodes resulting from the aggregation, in which servers are scheduled according to the shortest distance to the demand location.
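A minimal tabular Q-Learning treatment of a toy k-server instance (nodes on a line, with illustrative parameters, not the thesis's implementation) makes the state-space growth concrete: even here the table needs one entry per (server configuration, request) pair, and that count explodes with n and k:

```python
import random

random.seed(0)
n, k = 4, 2                        # toy instance: 4 nodes on a line, 2 servers
dist = lambda a, b: abs(a - b)     # line metric

Q = {}                             # Q[(servers, request)] -> value per server
alpha, gamma, eps = 0.2, 0.9, 0.1  # learning rate, discount, exploration

def qvals(state):
    return Q.setdefault(state, [0.0] * k)

reqs = [random.randrange(n) for _ in range(20000)]  # random request sequence
servers = (0, 1)
for t in range(len(reqs) - 1):
    state = (servers, reqs[t])
    # epsilon-greedy choice of which server answers the request
    if random.random() < eps:
        a = random.randrange(k)
    else:
        a = max(range(k), key=lambda i: qvals(state)[i])
    cost = dist(servers[a], reqs[t])  # distance moved = service cost
    servers = tuple(sorted(servers[:a] + (reqs[t],) + servers[a + 1:]))
    # reward is the negative movement cost; standard Q-Learning update
    target = -cost + gamma * max(qvals((servers, reqs[t + 1])))
    qvals(state)[a] += alpha * (target - qvals(state)[a])
```

Sorting the server tuple exploits the symmetry between servers, shrinking the table; even so, the number of states grows combinatorially in n and k, which is the curse of dimensionality the abstract describes.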
Abstract:
This work proposes a new phasor estimation technique for microprocessor-based numerical relays used in transmission line distance protection, based on the recursive least squares method and called modified random-walking least squares. The performance of phasor estimation methods is compromised mainly by the exponentially decaying DC component present in fault currents. To reduce the influence of the DC component, a morphological filter (MF) was added to the least squares method and applied prior to the phasor estimation process. The presented method was implemented in MATLAB and its performance compared with the one-cycle Fourier technique and with conventional phasor estimation, also based on a least squares algorithm. The least-squares-based methods used for comparison with the proposed method were: recursive with forgetting factor, covariance resetting, and random walking. The performance analyses were carried out using synthetic signals and signals obtained from simulations in the Alternative Transients Program (ATP). Compared with the other phasor estimation methods, the proposed method showed satisfactory results regarding estimation speed, steady-state oscillation, and overshoot. The method's performance was then analyzed under variations of the fault parameters (resistance, distance, incidence angle, and fault type); the results showed no significant variation in performance. In addition, the apparent impedance trajectory and the estimated fault distance were analyzed, and the presented method showed better results than the one-cycle Fourier algorithm.
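The basic least-squares phasor idea can be illustrated on a synthetic fault signal. Everything below is assumed for illustration (a 60 Hz system, 16 samples per cycle, amplitude, phase, and DC time constant are made up, and no morphological filter is applied): the one-cycle LS estimate is biased while the decaying DC component is strong, and converges once it dies out:

```python
import math

f0, fs = 60.0, 960.0   # assumed power frequency and sampling rate
N = int(fs / f0)       # samples per cycle (16)
w = 2 * math.pi * f0 / fs

# hypothetical fault signal: a 60 Hz phasor plus a decaying DC offset
A_true, phi_true = 10.0, 0.5
x = [A_true * math.cos(w * n + phi_true) + 4.0 * math.exp(-n / 40.0)
     for n in range(6 * N)]

def ls_phasor(window):
    """One-window least-squares phasor: fit a*cos + b*sin via the
    normal equations and return the amplitude sqrt(a^2 + b^2)."""
    Scc = sum(math.cos(w * n) ** 2 for n in range(len(window)))
    Sss = sum(math.sin(w * n) ** 2 for n in range(len(window)))
    Scs = sum(math.cos(w * n) * math.sin(w * n) for n in range(len(window)))
    Sxc = sum(v * math.cos(w * n) for n, v in enumerate(window))
    Sxs = sum(v * math.sin(w * n) for n, v in enumerate(window))
    det = Scc * Sss - Scs * Scs
    a = (Sxc * Sss - Sxs * Scs) / det
    b = (Sxs * Scc - Sxc * Scs) / det
    return math.hypot(a, b)

amp_early = ls_phasor(x[:N])          # window with a strong DC component
amp_late = ls_phasor(x[5 * N:6 * N])  # window after the DC has decayed
```

The early-window amplitude misses the true value because the decaying DC leaks into the cos/sin projections; removing that component before estimation (the role of the filter in the abstract) is what speeds up convergence.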
Abstract:
Static and cyclic tests are commonly used to characterize materials in structures. Cyclic tests assess the fatigue behavior of the material, yielding the S-N curves used to construct constant-life diagrams. However, when these diagrams are built from a small number of S-N curves, they underestimate or overestimate the actual behavior of the composite, so ever more tests are needed for accurate results. One way of reducing this cost is the statistical analysis of fatigue behavior. The aim of this research was to evaluate the probabilistic fatigue behavior of composite materials. The research was conducted in three parts. The first part associates the Weibull probability equation with the equations commonly used to model the S-N curves of composite materials, namely the exponential equation and the power law, together with their generalizations. In the second part, the results of the equation that best represents the probabilistic S-N curves were used to train a modular network at the 5% failure level. In the third part, a comparative study was carried out between the results of the piecewise nonlinear model (PNL) and those of a modular network (MN) architecture in the analysis of fatigue behavior. A database of ten materials taken from the literature was used to assess the generalization ability and robustness of the modular network. The results showed that the generalized probabilistic power law best represents the probabilistic fatigue behavior of the composites, and that although the MN was not robust when trained at the 5% failure level, for mean values it produced more accurate results than the PNL model.
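The power-law S-N model named above, N = C * S**m, becomes linear in log-log coordinates, so its parameters follow from ordinary least squares. A sketch on synthetic fatigue data with assumed C, m, and lognormal scatter (the probabilistic version in the abstract would attach a Weibull distribution to this scatter instead):

```python
import math
import random

random.seed(3)
# hypothetical fatigue data: life N follows a power law of stress S
# with lognormal scatter, N = C * S**m * noise
C_true, m_true = 1e12, -3.0
stresses = [100, 150, 200, 250, 300, 350, 400]   # MPa, made up
lives = [C_true * S ** m_true * math.exp(random.gauss(0, 0.1))
         for S in stresses]

# power-law S-N fit: log N = log C + m * log S (ordinary least squares)
xs = [math.log(S) for S in stresses]
ys = [math.log(Nf) for Nf in lives]
n = len(xs)
xb, yb = sum(xs) / n, sum(ys) / n
m_fit = sum((x - xb) * (y - yb) for x, y in zip(xs, ys)) / \
        sum((x - xb) ** 2 for x in xs)
logC_fit = yb - m_fit * xb
```

Fitting the same curve at several failure probabilities (e.g. the 5% level used to train the network) is what turns this deterministic fit into a probabilistic S-N family.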
Abstract:
Global warming due to greenhouse gas (GHG) emissions, especially CO2, has been identified as one of the major problems of the twenty-first century, considering the consequences it could have for the planet. Biological processes have been suggested as a possible solution, especially CO2 biofixation associated with microalgal growth. This strategy is attractive because, in addition to CO2 mitigation, it produces biomass rich in compounds of high added value. Microalgae show high photosynthetic capacity and growth rates higher than those of higher plants, doubling their biomass in one day. Their cultivation is not seasonal; they grow in salt water and require no irrigation, herbicides, or pesticides. The lipid content of these microorganisms, depending on the species, may range from 10 to 70% of dry weight, reaching 90% under certain culture conditions. Studies indicate that the most effective way to increase lipid production in microalgae is to induce stress by limiting the nitrogen content of the culture medium. This evidence justifies continued research into the production of biofuels from microalgae. In this work, the strategy of increasing lipid production in the microalga I. galbana through programmed nutritional stress, by nitrogen limitation, was studied. The physiological responses of the microalga, grown in f/2 medium with different nitrogen concentrations (N:P 15.0, the control; N:P 5.0; and N:P 2.5), were monitored. During the exponential phase, the results showed no differences among the conditions studied. However, the cultures subjected to stress showed lower biomass yields in the stationary phase. There was an increase of 32.5% in carbohydrate content and 87.68% in lipid content at an N:P ratio of 5.0, and an average decrease of 65% in protein content at N:P ratios of 5.0 and 2.5. There were no significant variations in ash content, regardless of cultivation condition or growth phase.
Despite the limited biomass production in the cultures with smaller N:P ratios, the highest lipid yields were observed in them compared to the control culture. Given the increased lipid concentration associated with stress, this study suggests the microalga Isochrysis galbana as an alternative raw material for biofuel production.
Abstract:
This study evaluated the potential use of smectite clays for color removal from textile effluents. The experiments were carried out as exploratory tests planned by full and fractional factorial design, with predetermined factors and levels. The smectite clays came from the gypsum hub of the Araripe region (PE), and the dye used was Reactive Yellow BF-4G 200%. The smectite clay was collected and transported to the Soil Physics Laboratory of UFRPE, where it was prepared by air drying, lump breaking, and sieving before being submitted to the adsorption process. The 2^2 full factorial design yielded color removal percentages of 96%, 96.5%, and 95.8% for the clay "as received", chemically activated, and thermally activated, respectively, with adsorbed amounts of 4.80, 4.61, and 4.74 mg/g for the three clays. This showed that the activation processes used did not increase the adsorption capacity of the smectite clay. The equilibrium data were best fitted by the Freundlich isotherm, which assumes an exponential distribution of active sites, and which outperformed the Langmuir equation for the adsorption of cations and anions by the clays. The kinetic model that best fitted the results was the pseudo-second-order model. In the 2^(4-1) fractional factorial study, at concentrations up to 500 mg/L, high color removal percentages (92.37%, 90.92%, and 93.40%) and adsorbed amounts (230.94, 227.31, and 233.50 mg/g) were obtained for the three clays. The equilibrium data fitted both the Langmuir and Freundlich isotherms well, and the kinetic model that best fitted the results was again the pseudo-second-order model.
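The Freundlich isotherm, q = K_F * C**(1/n), is usually fitted through its log-linear form, ln q = ln K_F + (1/n) ln C. A sketch with hypothetical equilibrium data (values constructed from roughly q = 4*sqrt(C), not the study's measurements):

```python
import math

# hypothetical equilibrium data: concentration (mg/L) vs uptake (mg/g),
# constructed to follow roughly q = 4 * C**0.5
Ce = [10.0, 50.0, 100.0, 250.0, 500.0]
qe = [12.6, 28.2, 39.8, 63.1, 89.1]

# Freundlich linearization: ln q = ln K_F + (1/n) ln C
xs = [math.log(c) for c in Ce]
ys = [math.log(q) for q in qe]
npts = len(xs)
xb, yb = sum(xs) / npts, sum(ys) / npts
slope = sum((x - xb) * (y - yb) for x, y in zip(xs, ys)) / \
        sum((x - xb) ** 2 for x in xs)
n_inv = slope                          # 1/n, the Freundlich intensity
K_F = math.exp(yb - slope * xb)        # Freundlich capacity constant
```

The same linearize-and-regress recipe applies to the Langmuir form (C/q vs C) and to the pseudo-second-order kinetic model (t/q vs t) mentioned in the abstract.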
Abstract:
The present work aims at understanding the drying process of shrimp cephalothorax, in order to support industrial use of this byproduct. The process conditions were analyzed in a tray dryer and in a spouted bed. Based on these results, a dryer with characteristics specific to cephalothorax drying was designed and built. Desorption isotherms were obtained by the dynamic method at temperatures of 20, 35, and 50 °C over the range of 10-90% relative humidity. It was observed that the powdered product keeps greater stability at relative humidities below 40%. The drying curves of the fixed-bed dryer were fitted with three models: single exponential, two-parameter exponential, and Page. The two-parameter exponential model described all the drying conditions studied most adequately. The tests carried out in the spouted bed showed a high drying rate for the material in paste form in dynamically active beds, proving the need for feeding at shorter intervals to increase the thermal efficiency of the process. Considering these results, the dryer designed was a rotary dryer with an inert bed, co-current feed, discharge into a cyclone for gas-solid separation, and feeding at 2-minute intervals. The equipment was optimized using a 2^4 full factorial experimental design, with air temperature, air velocity, feed flow rate, and encapsulant (albumin) concentration as independent variables, and thermal efficiency, moisture content of the powder obtained, total test time, and powder production efficiency at several processing points as response variables. The results showed that the rotary dryer with inert bed can also give good results if applied industrially.
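The Page drying model named above, MR = exp(-k * t**n), linearizes as ln(-ln MR) = ln k + n ln t, so both parameters come from a straight-line fit. A check on synthetic moisture-ratio data with assumed k and n (not the experimental drying curves):

```python
import math

# hypothetical drying curve: moisture ratio following the Page model
# MR = exp(-k * t**n) with assumed k = 0.05, n = 1.3
k_true, n_true = 0.05, 1.3
times = [5.0, 10.0, 20.0, 40.0, 60.0, 90.0]   # minutes
mr = [math.exp(-k_true * t ** n_true) for t in times]

# Page linearization: ln(-ln MR) = ln k + n * ln t
xs = [math.log(t) for t in times]
ys = [math.log(-math.log(m)) for m in mr]
npts = len(xs)
xb, yb = sum(xs) / npts, sum(ys) / npts
n_fit = sum((x - xb) * (y - yb) for x, y in zip(xs, ys)) / \
        sum((x - xb) ** 2 for x in xs)
k_fit = math.exp(yb - n_fit * xb)
```

The single-exponential and two-parameter exponential models compared in the abstract are the special cases n = 1 of this form, with one or two free amplitude/rate parameters.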
Abstract:
The towering figure of Gregório de Matos e Guerra has been the subject of many theoretical discussions over the years, since his appearance in print in the 19th century, and even more during the 20th century, when he was recovered by the modernist vanguard. As a result, there are still two antagonistic views of Gregório de Matos: on one side, researchers who defend him; on the other, those who attack him. The former say that this poet from Bahia was the first literary voice in Brazil, grounded in the Baroque, while the latter say he was merely a plagiarist of the 17th-century Spanish poets, without any real contribution to the development of Brazilian literature. With this in mind, this thesis takes the view that the poet is an anthropophagic-baroque devourer of cultures, an active participant in the formation of our cultural and literary identity. To that end, a literature review of the poet's biography was carried out, seeking to break with romantic descriptions and emphasizing facts that help present his baroque profile. The history of literature was then discussed with a focus on this creole poet, mainly through historians' views of Gregorian poetry in the formation of Brazilian literature. In defense of the hypothesis that Gregório de Matos was our first anthropophagus, this work analyzes how his poetry reveals the intrinsic characteristics of the Baroque and of Anthropophagy, focusing on its carnivalesque aspect, showing the world, in a satirical tone, the idiosyncrasies of human life. Analyzing this corpus in Spanish is the strength of this thesis because, besides being previously unpublished, it contributes to understanding anthropophagy as a theoretical mechanism that explains the formation of our cultural and literary identity.
The theoretical framework draws on Augusto de Campos (1968; 1978; 1984; 1986; 1988), Haroldo de Campos (1976; 2010a; 2010b; 2011), Severo Sarduy ([1988?]), Oswald de Andrade (1945; 1978; 2006), Mikhail Bakhtin (2010), Octavio Paz (1979), Segismundo Spina (1980; 1995; 2008), Afrânio Coutinho (1986a; 1986b; 1994), and Affonso Ávila (1994; 1997; 2004; 2008), among others. Gregorian poetry has thus contributed to the formation of a baroque-anthropophagic scenery within Brazilian boundaries, with special attention to its movement across time: the poet belongs not only to the 17th century, as established by historiography, but his work remains present today owing to the contemporaneity of his themes, centered on the eternal doubts of baroque man.
Abstract:
In the 20th century, acupuncture spread through the West as a complementary health care practice. This has motivated the international scientific community to invest in research that seeks to understand why acupuncture works. In this work we statistically compare the voltage fluctuations of bioelectric signals captured on the skin at an acupuncture point (IG 4) and at a nearby non-acupuncture point. The signals were acquired through an electronic interface with a computer, based on an instrumentation amplifier designed with specifications adequate to this end. For the signals collected from a sample of 30 volunteers, we computed the main statistics and submitted them to a paired t-test at significance level α = 0.05. For the bioelectric signals we estimated the following parameters: standard deviation, skewness, and kurtosis. Moreover, we calculated the autocorrelation function and fitted it with an exponential curve, observing that the signal decays more rapidly at a non-acupoint than at an acupoint. This fact is indicative of the presence of information at the acupoint.
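The autocorrelation-plus-exponential-fit procedure can be sketched on a surrogate signal. An AR(1) process (a stand-in, not the measured bioelectric data; the coefficient is arbitrary) has an exponentially decaying autocorrelation whose time constant the fit should recover:

```python
import math
import random

random.seed(7)
# surrogate signal: AR(1) noise, whose autocorrelation decays as
# rho(lag) = phi**lag, i.e. exponentially with time constant -1/ln(phi)
phi = 0.9  # slower decay stands in for the "acupoint-like" case
x = [random.gauss(0, 1)]
for _ in range(4999):
    x.append(phi * x[-1] + random.gauss(0, 1))

def autocorr(sig, lag):
    """Sample autocorrelation of sig at the given lag."""
    n = len(sig)
    mu = sum(sig) / n
    var = sum((v - mu) ** 2 for v in sig) / n
    cov = sum((sig[t] - mu) * (sig[t + lag] - mu)
              for t in range(n - lag)) / n
    return cov / var

# fit rho(lag) ~ exp(-lag/tau) by regressing ln rho on the lag
lags = list(range(1, 11))
rhos = [autocorr(x, L) for L in lags]
ys = [math.log(r) for r in rhos]
lb, yb = sum(lags) / len(lags), sum(ys) / len(ys)
slope = sum((L - lb) * (y - yb) for L, y in zip(lags, ys)) / \
        sum((L - lb) ** 2 for L in lags)
tau = -1.0 / slope   # fitted correlation time, in samples
```

Comparing the fitted tau between the two recording sites is the essence of the comparison in the abstract: a larger tau means slower decay, i.e. longer-lived correlations in the signal.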