73 results for TIME ESTIMATION

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

70.00%

Publisher:

Abstract:

Time perception is critical for environmental adaptation in humans and other species. Temporal processing has evolved through different neural systems, each responsible for a different time scale. Among the most studied is the scale spanning seconds to minutes. Evidence suggests that the dorsolateral prefrontal cortex (DLPFC) is involved in time perception on the scale of seconds. However, it is unclear whether the time-perception deficit observed in patients with brain injuries, or with the "reversible lesions" produced by transcranial magnetic stimulation (TMS) of this region, arises from disruption of other cognitive processes (such as attention and working memory) or of time perception itself. Studies also implicate the DLPFC in emotional regulation, specifically in emotional judgment and anticipation. Our objective was therefore to study the role of the dorsolateral prefrontal cortex in the perception of time intervals containing affective and emotionally neutral stimuli, by modulating cortical activity with transcranial direct current stimulation (tDCS): excitation (anodal current), inhibition (cathodal current) and control (sham), using intervals of 4 and 8 seconds. Our results showed underestimation when the picture was presented for 8 seconds; with anodal current over the right DLPFC there was underestimation, and with cathodal current over the left DLPFC there was overestimation of time reproduction with neutral pictures. Cathodal current over the left DLPFC produced the inverse effect with negative pictures: an underestimation of time. Positive or negative pictures improved estimates for 8 seconds, and positive pictures suppressed the effect of tDCS over the DLPFC when estimating 4 seconds. We conclude that the DLPFC plays a key role in time perception and corresponds largely to the memory and decision stages of the internal clock model. The left hemisphere participates in time perception in both affective and emotionally neutral contexts, and tDCS proves an effective method for studying the cortical functions underlying time perception in terms of cause and effect.

Relevance:

30.00%

Publisher:

Abstract:

This work presents a procedure for evaluating the uncertainty in the calibration of flow meters and of BS&W measurements. It concerns a new measurement method proposed in the conceptual design of the LAMP laboratory at Universidade Federal do Rio Grande do Norte, which determines the conventional true value of BS&W from the total height of the liquid column in the proving tank, the hydrostatic pressure exerted by the liquid column, the local gravity, and the specific masses of the water and of the oil, and determines the flow rate from the total height of the liquid column and the transfer time. The calibration uses an automated monitoring and data-acquisition system for the quantities needed to determine flow and BS&W, giving the measurements greater reliability.
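The BS&W determination described above reduces to recovering the mixture's mean density from the hydrostatic column and interpolating between the water and oil densities. A minimal sketch, with all numerical values assumed for illustration:

```python
def bsw_fraction(pressure_pa, height_m, g, rho_water, rho_oil):
    """Water cut (BS&W) from the mean density of the liquid column.

    Assumes a homogeneous oil/water mixture, so the hydrostatic
    pressure at the tank bottom gives the mean density directly.
    """
    rho_mix = pressure_pa / (g * height_m)  # from P = rho * g * h
    return (rho_mix - rho_oil) / (rho_water - rho_oil)

# Synthetic check: build the pressure of a 2 m column with 30% water cut
g = 9.78                      # local gravity, m/s^2 (assumed)
rho_w, rho_o = 998.0, 850.0   # specific masses, kg/m^3 (assumed)
h = 2.0                       # total column height, m
p = (0.3 * rho_w + 0.7 * rho_o) * g * h
print(round(bsw_fraction(p, h, g, rho_w, rho_o), 3))  # -> 0.3
```

Note that the uncertainty evaluation in the work propagates the uncertainties of each of these input quantities; the sketch only shows the central-value calculation.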

Relevance:

30.00%

Publisher:

Abstract:

In this work, we propose a two-stage algorithm for real-time fault detection and identification in industrial plants. Our proposal is based on the analysis of selected features using recursive density estimation and a new evolving classifier algorithm. More specifically, the proposed approach for the detection stage is based on the concept of density in the data space, which is not the same as a probability density function but is a very useful measure for abnormality/outlier detection. This density can be expressed by a Cauchy function and calculated recursively, which makes it memory- and computationally efficient and, therefore, suitable for on-line applications. The identification/diagnosis stage is based on a self-developing (evolving) fuzzy rule-based classifier proposed in this work, called AutoClass. An important property of AutoClass is that it can start "learning from scratch": not only do the fuzzy rules not need to be prespecified, but neither does the number of classes (the number may grow, with new class labels added by the on-line learning process), in a fully unsupervised manner. If an initial rule base exists, AutoClass can evolve it further based on newly arriving faulty-state data. To validate our proposal, we present experimental results from a didactic level-control process, where control and error signals are used as features for the fault detection and identification systems; the approach is generic, however, and the number of features can be large thanks to the computationally lean methodology, since covariance or more complex calculations, as well as storage of old data, are not required. The results obtained are significantly better than those of the traditional approaches used for comparison.
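A minimal sketch of the recursive density idea described above (a reconstruction of the general RDE recursion, not the authors' exact code): the Cauchy-form density is computed from a running mean and a running mean of squared norms, so only two statistics are stored between samples:

```python
import numpy as np

class RecursiveDensity:
    """Recursively updated (Cauchy-kernel) data density.

    Only the running mean and the running mean of squared norms are
    kept, so memory is O(dim) and each update is O(dim) — no storage
    of past samples, no covariance matrices.
    """
    def __init__(self, dim):
        self.k = 0
        self.mean = np.zeros(dim)
        self.sq = 0.0   # running mean of ||x||^2

    def update(self, x):
        x = np.asarray(x, dtype=float)
        self.k += 1
        w = 1.0 / self.k
        self.mean = (1 - w) * self.mean + w * x
        self.sq = (1 - w) * self.sq + w * float(x @ x)
        # Cauchy form: close to 1 near the data mean, small for outliers
        return 1.0 / (1.0 + float((x - self.mean) @ (x - self.mean))
                      + self.sq - float(self.mean @ self.mean))

rde = RecursiveDensity(dim=2)
for point in [(0.0, 0.0), (0.1, -0.1), (0.05, 0.02)]:
    d_normal = rde.update(point)       # normal operating data
d_outlier = rde.update((5.0, 5.0))     # abnormal sample -> much lower density
print(d_outlier < d_normal)            # -> True
```

A detector would flag a fault when the density of a new sample drops below a threshold relative to the densities seen so far; the identification stage (AutoClass) then assigns or creates a class label for the faulty state.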

Relevance:

30.00%

Publisher:

Abstract:

The gravity inversion method is a mathematical procedure that can be used to estimate the basement relief of a sedimentary basin. However, the inverse problem in potential-field methods has neither a unique nor a stable solution, so additional information (beyond the gravity measurements) must be supplied by the interpreter to turn it into a well-posed problem. This dissertation presents the application of a gravity inversion method to estimate the basement relief of the onshore Potiguar Basin. The density contrast between sediments and basement is assumed to be known and constant. The proposed methodology discretizes the sedimentary layer into a grid of juxtaposed rectangular prisms whose thicknesses correspond to the depth to basement, which is the parameter to be estimated. To stabilize the inversion, I introduce constraints in accordance with the known geologic information. The method minimizes an objective function that requires the model not only to be smooth and close to the seismic-derived model, used as a reference, but also to honor well-log constraints; the latter are introduced through logarithmic barrier terms in the objective function. The inversion was applied so as to simulate different phases of the exploratory development of a basin, in distinct scenarios: the first used only gravity data and a flat reference model; the second was divided into two cases, incorporating either borehole-log information or the seismic model into the process. Finally, I incorporated the basement depth from seismic interpretation as the reference model and imposed depth constraints from boreholes using the primal logarithmic barrier method.
As a result, the basement relief estimated in every scenario satisfactorily reproduced the basin framework, and incorporating the constraints improved the definition of basement depth. The joint use of surface gravity data, seismic imaging and borehole-log information makes the process more robust and improves the estimate, providing a result closer to the actual basement relief. It is also worth noting that the first scenario already provided a basement relief very coherent with the known basin framework — significant information when one compares the costs and environmental impact of gravimetric surveys against those of seismic surveys and well drilling.
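As a toy illustration of the idea (not the prism-based, barrier-constrained algorithm of the dissertation), an infinite Bouguer slab gives a first-pass depth to basement under each station, and a few passes of neighbor averaging stand in for the smoothness constraint; the density contrast and anomaly values are hypothetical:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def slab_depths(gravity_mgal, delta_rho):
    """First-pass depth-to-basement per station from the infinite
    Bouguer slab formula g = 2*pi*G*drho*h (a crude stand-in for the
    prism forward model used in the dissertation)."""
    g_si = np.asarray(gravity_mgal, dtype=float) * 1e-5  # 1 mGal = 1e-5 m/s^2
    return g_si / (2.0 * np.pi * G * delta_rho)

def smooth(depths, n_iter=3, alpha=0.3):
    """A few passes of neighbor averaging, a toy analogue of the
    smoothness constraint (endpoints held fixed)."""
    h = np.array(depths, dtype=float)
    for _ in range(n_iter):
        h[1:-1] += alpha * (0.5 * (h[:-2] + h[2:]) - h[1:-1])
    return h

# Hypothetical profile: negative anomaly over a sediment-filled low,
# density contrast (sediments minus basement) of -300 kg/m^3
drho = -300.0
g_obs = [-12.0, -18.0, -25.0, -19.0, -13.0]   # mGal
h = smooth(slab_depths(g_obs, drho))
print(bool(np.all(h > 0)))   # deepest basement under the strongest anomaly
```

The real method replaces the slab formula with the summed response of all prisms and replaces the averaging step with a regularized objective function honoring seismic and well-log constraints.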

Relevance:

20.00%

Publisher:

Abstract:

PEDRINI, Aldomar; SZOKOLAY, Steven. Recomendações para o desenvolvimento de uma ferramenta de suporte às primeiras decisões projetuais visando ao desempenho energético de edificações de escritório em clima quente. Ambiente Construído, Porto Alegre, v. 5, n. 1, p.39-54, jan./mar. 2005. Trimestral. Disponível em: . Acesso em: 04 out. 2010.

Relevance:

20.00%

Publisher:

Abstract:

BONACCINI, J. A. Sobre o tempo. Princípios: revista de filosofia. Natal (RN), v. 5, n. 6, p. 123-138, 1998. ISSN 1983-2109. Disponível em: . Acesso em: 04 out. 2010.

Relevance:

20.00%

Publisher:

Abstract:

The duration of operations may be an important factor in a series of postoperative complications, especially in elderly individuals. Objective: to study the pulmonary repercussions of operations of different durations. Methods: Twenty aged rats (18 months) and 20 young rats (3 months) were randomly separated into groups A and B, respectively. The groups were divided into subgroups A1, A2, A3, A4, B1, B2, B3 and B4, with five rats each. The animals were anesthetized with intraperitoneal pentobarbital (20 mg/kg). In subgroups A1 and B1 the operation lasted 30 minutes; in A2 and B2, 60 minutes; in A3 and B3, 120 minutes; and the animals in A4 and B4 (controls) were not operated on. The procedure consisted of a xiphopubic laparotomy that was opened and closed as many times as necessary to reach the stipulated times. After the fifth postoperative day the animals were killed with an anesthetic overdose and biopsies of both lungs were performed. The histopathological findings were converted into scores. Results: the young rat groups reached the following scores: A1 = 6; A2 = 11; A3 = 28; A4 = 5. The aged rats scored: B1 = 12; B2 = 34; B3 = 51; B4 = 6. Statistical analysis revealed significant differences between the scores of groups A and B. Conclusions: Prolonged operative time in aged rats contributed significantly to the appearance of pulmonary alterations. The longer the operative time, the more intense and more frequent the pulmonary complications.

Relevance:

20.00%

Publisher:

Abstract:

Forecasting is the basis for strategic, tactical and operational business decisions. In financial economics, several techniques have been used over the past decades to predict the behavior of assets. There are thus many methods to assist in the task of time-series forecasting; however, conventional modeling techniques, such as statistical models and those based on theoretical mathematical models, have produced unsatisfactory predictions, increasing the number of studies of more advanced forecasting methods. Among these, Artificial Neural Networks (ANNs) are a relatively new and promising method for business forecasting that has attracted much interest in the financial world and has been used successfully in a wide variety of financial modeling applications, in many cases proving superior to statistical ARIMA-GARCH models. In this context, this study examined whether ANNs are a more appropriate method for predicting the behavior of capital-market indices than the traditional methods of time-series analysis. For this purpose we developed a quantitative study based on financial-economic indices and built two supervised feedforward ANN models, whose structures consisted of 20 inputs, 90 neurons in one hidden layer, and one output (the Ibovespa). These models used backpropagation, a hyperbolic-tangent sigmoid activation function and a linear output function.
To analyze how well the ANN approach predicts the Ibovespa, we compared its results with those of a GARCH(1,1) time-series model. Once both methods (ANN and GARCH) were applied, we analyzed the results by comparing the forecasts with the historical data and by studying the forecast errors through the MSE, RMSE, MAE, standard deviation, Theil's U and forecast-encompassing tests. The models developed with ANNs had lower MSE, RMSE and MAE than the GARCH(1,1) model, and the Theil's U test indicated that all three models have smaller errors than a naïve forecast. Although the return-based ANN has poorer precision indicators than the price-based ANN, the forecast-encompassing test rejected the hypothesis that one model is better than the other, indicating that the ANN models have a similar level of accuracy. It was concluded that, for the data series studied, the ANN models forecast the Ibovespa more appropriately than the traditional time-series models represented by the GARCH model.
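The error measures used in the comparison can be sketched directly; Theil's U here is computed in its U2 form, which compares the model against a naïve no-change forecast (U2 < 1 means the model beats the naïve forecast). The series below is synthetic:

```python
import numpy as np

def forecast_metrics(actual, forecast):
    """MSE, RMSE, MAE and Theil's U2 for a point-forecast series."""
    a = np.asarray(actual, dtype=float)
    f = np.asarray(forecast, dtype=float)
    err = f - a
    mse = float(np.mean(err ** 2))
    rmse = mse ** 0.5
    mae = float(np.mean(np.abs(err)))
    # Theil's U2: one-step-ahead relative errors of the model vs. those
    # of the naive forecast (next value = current value)
    num = np.sqrt(np.sum(((f[1:] - a[1:]) / a[:-1]) ** 2))
    den = np.sqrt(np.sum(((a[1:] - a[:-1]) / a[:-1]) ** 2))
    return {"MSE": mse, "RMSE": rmse, "MAE": mae, "TheilU2": float(num / den)}

# Synthetic index levels and a forecast that tracks them closely
actual = [100.0, 102.0, 101.0, 104.0, 107.0]
forecast = [100.5, 101.5, 101.5, 103.5, 106.0]
m = forecast_metrics(actual, forecast)
print(m["TheilU2"] < 1.0)   # -> True: better than a naive forecast
```

The forecast-encompassing test mentioned in the abstract goes a step further: it asks whether one forecast contains all the useful information of the other, rather than only comparing error magnitudes.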

Relevance:

20.00%

Publisher:

Abstract:

In recent years, heparin has become the target of many studies related to inflammation, owing to its ability to bind proteins involved in the immune response. Recently, our laboratory demonstrated, using a thioglycollate-induced peritonitis model, heparin's capacity to reduce cellular influx into the peritoneal cavity 3 hours after the inflammatory stimulus. Since neutrophilic infiltration peaks around 8 hours after the inflammatory stimulus, the present work, using the same peritonitis model, assessed heparin's ability to sustain this interference with leukocyte infiltration 8 hours after the induction of inflammation. Moreover, using differential cell counts, we evaluated how the cell populations involved in the inflammatory process are affected by the treatment. Eight hours after the inflammatory stimulus, only the heparin dose of 1 μg/kg reduced the cellular influx into the peritoneum, a 62.8% reduction compared with the positive control (p < 0.001). Furthermore, the heparin dose of 15 μg/kg had a pro-inflammatory effect in whole blood, shown by increases of 60.9% (p < 0.001) and 117.8% (p < 0.001) in the proportions of neutrophils and monocytes, respectively, compared with the positive control. This dose also produced a neutrophil proportion in the peritoneal fluid 27.3% higher than that of the positive control (p < 0.05). This duality between anti- and pro-inflammatory effects at different times corroborates studies that attribute a pleiotropic immunomodulatory role to heparin.

Relevance:

20.00%

Publisher:

Abstract:

The discovery of new oil reservoirs in onshore fields, in ultra-deep-water offshore fields and along complex trajectories requires optimized procedures to reduce downtime during well drilling, especially given the high cost of platforms and equipment and the risks inherent to the operation. Among the most important aspects is the design of drilling fluids and their behavior in the different situations that may occur during the process. By means of sedimentation experiments, a correlation was validated to determine the settling velocity of particles in fluids whose viscosity varies over time, applying a correction for the effective viscosity, which is a function of shear rate and time. The evolution of viscosity over time was obtained through rheological tests at a fixed shear rate small enough not to interfere with the fluid's gelling process. With the equations for particle settling velocity and fluid viscosity over time, an iterative procedure was proposed to determine particle displacement over time. These equations were implemented in a case study simulating the sedimentation of cuttings generated during oil-well drilling while operations are stopped, especially during connections and tripping, allowing the drilling fluid to be designed to keep the cuttings in suspension and to avoid risks such as stuck pipe and, in more drastic conditions, loss of the well.
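A minimal sketch of the iterative displacement procedure described above, assuming Stokes settling and a simple linear gelling law mu(t) = mu0 + gel_rate*t (the actual work uses a measured viscosity-time curve and its own validated velocity correlation; all numbers below are hypothetical):

```python
def settling_displacement(d_p, rho_p, rho_f, mu0, gel_rate, t_end, dt=1.0):
    """Iterative particle displacement under Stokes settling with a
    viscosity that grows linearly as the fluid gels (assumed law
    mu(t) = mu0 + gel_rate * t).  Stokes' law also assumes creeping
    flow; a real design check would verify the particle Reynolds
    number before trusting the velocity."""
    g = 9.81
    x, t = 0.0, 0.0
    while t < t_end:
        mu = mu0 + gel_rate * t                           # effective viscosity, Pa.s
        v = (rho_p - rho_f) * g * d_p ** 2 / (18.0 * mu)  # Stokes velocity, m/s
        x += v * dt                                       # accumulate displacement
        t += dt
    return x

# 3 mm cutting in a gelling fluid during a 30-minute connection
# (all values hypothetical)
drop = settling_displacement(d_p=3e-3, rho_p=2600.0, rho_f=1200.0,
                             mu0=0.08, gel_rate=2e-4, t_end=1800.0)
print(drop > 0)   # -> True: the particle keeps settling, ever more slowly
```

The design question then becomes whether the accumulated displacement over a connection or trip stays small enough that cuttings remain effectively in suspension.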

Relevance:

20.00%

Publisher:

Abstract:

The use of infrared burners in industrial applications has many technical and operational advantages, for example uniformity of heat supply in the form of radiation and convection, with greater control of emissions because the exhaust gases pass through a macroporous ceramic bed. This work presents a commercial infrared burner fitted with an experimental ejector capable of promoting a mixture of liquefied petroleum gas (LPG) and glycerin. By varying the proportion of the two fuels, the performance of the burner was evaluated through an energy balance and atmospheric-emission measurements. A two-stage (low-fire/high-fire) modulating temperature controller with a thermocouple was introduced, using solenoid valves for each fuel. The burner was tested while varying the amount of glycerin fed by a gravity system. For the thermodynamic analysis, the load was estimated with an aluminum plate located at the exit of the combustion gases, and the temperature distribution was measured by a data-acquisition system that recorded real-time readings from the attached thermocouples. The burner showed stable combustion with 15, 20 and 25% glycerin added, by mass, relative to the LPG, increasing the heat supplied to the plate. The data showed that the first-law efficiency of the burner improved with increasing glycerin addition. The emission levels of the combustion gases (CO, NOx, SO2 and HC) met the environmental limits set by CONAMA Resolution No. 382/2006.
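The first-law efficiency mentioned above can be illustrated with a back-of-the-envelope balance: useful heat absorbed by the plate divided by the total fuel energy input. The mass flows, plate heat load and lower heating values below are assumed, literature-style figures, not the measured ones:

```python
def first_law_efficiency(q_plate_kw, m_lpg_kg_h, m_gly_kg_h,
                         lhv_lpg=46.1e3, lhv_gly=16.0e3):
    """First-law efficiency of the burner: heat absorbed by the plate
    over the total fuel energy input.  LHV values are in kJ/kg and are
    typical literature figures (assumed), not measurements."""
    fuel_kw = (m_lpg_kg_h * lhv_lpg + m_gly_kg_h * lhv_gly) / 3600.0
    return q_plate_kw / fuel_kw

# Hypothetical 20%-glycerin (mass ratio of LPG) operating point
eta = first_law_efficiency(q_plate_kw=3.2, m_lpg_kg_h=0.50,
                           m_gly_kg_h=0.10)
print(round(eta, 2))  # -> 0.47
```

Because glycerin's heating value is far below LPG's, adding it raises the heat delivered for a modest increase in total fuel energy, which is consistent with the efficiency trend reported above.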

Relevance:

20.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance:

20.00%

Publisher:

Abstract:

The development of oil-well drilling requires additional care, especially when drilling in offshore ultra-deep water, where low overburden pressure gradients cause low fracture gradients and, consequently, make drilling difficult by narrowing the operational window. To minimize, in the well-planning phase, the difficulties faced while drilling in such scenarios, indirect models are used to estimate the fracture gradient, predicting approximate values for leakoff tests. These models generate geopressure curves that allow detailed analysis of the pressure behavior along the whole well. Most are based on the Terzaghi equation, differing only in how the rock stress coefficient is determined. This work proposes an alternative method for predicting the fracture pressure gradient based on a geometric correlation that relates the pressure gradients proportionally at a given depth and extrapolates the relation over the whole well depth, meaning that these parameters vary in a fixed proportion. The model applies analytical proportion segments corresponding to the differential pressure related to the rock stress. The study shows that the proposed analytical proportion segments yield fracture-gradient values in good agreement with the leakoff tests available for the field area. The results were compared with twelve indirect models for fracture-pressure-gradient prediction based on the compaction effect; for this purpose, software was developed in Matlab. The comparison was also made varying the water depth from zero (onshore wells) to 1500 meters, with the leakoff tests again used to compare the different methods, including the one proposed here. The method gives good results in the error analysis compared with the other methods and, given its simplicity, its application is justified.
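The Terzaghi-based family of models mentioned above can be sketched as follows. K = nu/(1-nu) is Eaton's well-known choice of the stress coefficient (one of the twelve-model family, not the geometric method proposed in the work), and the gradient values (in psi/ft) are illustrative assumptions, not field data:

```python
def fracture_gradient(overburden_grad, pore_grad, k):
    """Terzaghi-type fracture gradient: Pf = K * (S_ov - Pp) + Pp,
    with all quantities in consistent pressure-gradient units
    (e.g. psi/ft).  K is the matrix (rock) stress coefficient, the
    term in which the indirect models differ."""
    return k * (overburden_grad - pore_grad) + pore_grad

nu = 0.4                 # assumed Poisson ratio of the formation
k = nu / (1.0 - nu)      # Eaton's stress coefficient
gf = fracture_gradient(overburden_grad=1.0, pore_grad=0.465, k=k)
print(round(gf, 3))      # -> 0.822
```

Comparing such a predicted gradient against leakoff-test values at known depths, over water depths from zero to 1500 m, is exactly the kind of check the work performs for each of the candidate models.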