26 results for STATISTICAL INFERENCE
Abstract:
Nowadays, when market competition requires products of better quality, constant cost savings and better use of raw materials, the search for more efficient control strategies becomes vital. In Natural Gas Processing Units (NGPUs), as in most chemical processes, quality control is accomplished through the composition of the products. However, chemical composition analysis has a long measurement time, even when performed by instruments such as gas chromatographs. This fact hinders the development of control strategies that could provide a better process yield. Natural gas processing is one of the most important activities in the petroleum industry. The main economic product of an NGPU is liquefied petroleum gas (LPG). LPG is ideally composed of propane and butane; in practice, however, its composition includes contaminants such as ethane and pentane. In this work, an inferential system using neural networks is proposed to estimate the ethane and pentane mole fractions in LPG and the propane mole fraction in the residual gas. The goal is to provide the values of these estimated variables every minute using a single multilayer neural network, making it possible to apply inferential control techniques in order to monitor LPG quality and to reduce propane loss in the process. To develop this work, an NGPU composed of two distillation columns, a deethanizer and a debutanizer, was simulated in HYSYS® software. The inference is performed from the process variables of the PID controllers present in the instrumentation of these columns. To reduce the complexity of the inferential neural network, the statistical technique of principal component analysis is used to decrease the number of network inputs, thus forming a hybrid inferential system. This work also proposes a simple strategy to correct the inferential system in real time, based on measurements from the chromatographs that may exist in the process under study.
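As a hedged illustration of the dimensionality-reduction step described above, the sketch below runs a closed-form principal component analysis on two correlated controller readings: when the first eigenvalue dominates, a single component can replace both network inputs. The data and variable names are hypothetical, not taken from the simulated NGPU.

```python
import math

def principal_components_2d(xs, ys):
    """Closed-form PCA for two process variables: returns the eigenvalues
    (variances along the principal axes) of the 2x2 sample covariance
    matrix, largest first."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Eigenvalues of [[sxx, sxy], [sxy, syy]] via trace/determinant.
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    return tr / 2 + disc, tr / 2 - disc

# Two strongly correlated "controller readings" (hypothetical): one
# principal component captures almost all the variance, so one network
# input can stand in for both variables.
temp = [100.0, 101.0, 102.0, 103.0, 104.0]
pres = [10.0, 10.2, 10.4, 10.6, 10.8]
lam1, lam2 = principal_components_2d(temp, pres)
explained = lam1 / (lam1 + lam2)
```

In the thesis the same idea is applied to many PID controller variables at once; the 2x2 case above just makes the eigenvalue computation visible without linear-algebra libraries.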
Abstract:
Wireless sensor networks (WSNs) have gained ground in the industrial environment due to the possibility of connecting points of information that were inaccessible to wired networks. However, there are several challenges in the implementation and acceptance of this technology in the industrial environment, one of them being the guaranteed availability of information, which can be influenced by various parameters, such as path stability and the power consumption of the field device. Accordingly, in this work a tool was developed to evaluate and infer parameters of industrial wireless networks based on the WirelessHART and ISA 100.11a protocols. The tool allows quantitative evaluation, qualitative evaluation and evaluation by inference over a given period of network operation. The quantitative and qualitative evaluations are based on purpose-built parameter definitions, such as the stability parameter, or on descriptive statistics, such as the mean, standard deviation and box plots. The evaluation by inference uses artificial neural networks to infer network parameters such as battery life. Finally, the results of using the tool in different network scenarios, such as star and mesh topologies, are presented in order to attest to the importance of the tool in evaluating the behavior of these networks and in supporting possible changes to or maintenance of the system.
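As a minimal sketch of the kind of quantitative parameter such a tool computes, the function below defines path stability as the percentage of successful transmissions on a path over a reporting window. This is a common WirelessHART-style definition and is an assumption here; the abstract does not give the tool's exact formula.

```python
import statistics

def path_stability(successes, attempts):
    """Stability of a path as the percentage of successful
    transmissions over a reporting window (assumed definition)."""
    return 100.0 * successes / attempts if attempts else 0.0

# Descriptive statistics over the per-path stabilities of a
# hypothetical mesh network, as in the qualitative evaluation.
stabilities = [path_stability(s, a) for s, a in [(98, 100), (87, 100), (93, 100)]]
mean_stab = statistics.fmean(stabilities)
spread = statistics.stdev(stabilities)
```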
Abstract:
This work aims to obtain a low-cost virtual sensor to estimate the quality of LPG. To acquire data from a distillation tower, the HYSYS® chemical process simulation software was used. These data are used for the training and validation of an Artificial Neural Network (ANN). The network estimates, from available simulated variables such as the temperature, pressure and discharge flow of a distillation tower, the mole fraction of pentane present in the LPG, thus allowing better control of product quality.
Abstract:
In this work we analyze the statistical distribution of skin bioimpedance. We focus on two distinct samples: the statistics of the impedance at several points on the skin of a single individual, and the statistics over a population (many individuals) at a single skin point. The impedance data were obtained from the literature (Pearson, 2007). Using the Shapiro-Wilk test and an asymmetry test, we conclude that the impedance of a population is better described by an asymmetric, non-normal distribution. On the other hand, the data concerning individual impedance seem to follow a normal distribution. We performed a goodness-of-fit test, and the distribution that best fits the population data is the log-normal distribution. It is interesting to note that our result for skin impedance is in agreement with results for body impedance from the electrical engineering literature. Our results have an impact on the statistical planning and modelling of skin impedance experiments; special attention should be paid to the treatment of outliers in this kind of dataset. The results of this work are important in the general discussion of the low impedance of acupuncture points and also in the problem of skin biopotentials used in equipment such as electrodermal screening tests.
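A minimal sketch of the distribution check described above, on synthetic data rather than the Pearson (2007) measurements: a log-normal sample is strongly right-skewed, while its logarithm is approximately normal, which is the signature the asymmetry and goodness-of-fit tests pick up.

```python
import math
import random
import statistics

def sample_skewness(data):
    """Adjusted Fisher-Pearson sample skewness; near zero for
    normally distributed data, positive for right-skewed data."""
    n = len(data)
    m = statistics.fmean(data)
    s = statistics.stdev(data)
    g1 = sum(((x - m) / s) ** 3 for x in data) / n
    return math.sqrt(n * (n - 1)) / (n - 2) * g1

random.seed(42)
# Synthetic "population impedance" sample: log-normal, hence skewed.
raw = [random.lognormvariate(0.0, 0.5) for _ in range(2000)]
logged = [math.log(x) for x in raw]
```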
Abstract:
Cross-sectional study with the objective of evaluating the accuracy of clinical indicators of the nursing diagnosis excessive fluid volume in patients undergoing hemodialysis. The study occurred in two stages: the first consisted of the evaluation of the diagnostic indicators under study, and the second of the diagnostic inference conducted by nurse diagnosticians. The first stage occurred from December 2012 to April 2013, in a university hospital and a hemodialysis clinic in Northeastern Brazil, with a sample of 100 chronic renal failure patients on hemodialysis. The data were collected through an interview form and a physical examination, organized into spreadsheets and analyzed for the presence or absence of the indicators of the diagnosis excessive fluid volume. In the second stage, the spreadsheets were sent to three nurse diagnosticians, who judged the presence or absence of the diagnosis in the patients studied. This stage was conducted from July to September 2013. For data analysis, we used descriptive and inferential statistics. In the descriptive analysis, we used measures of central tendency and dispersion. In the inferential analysis, we used the Chi-square and Fisher tests and prevalence ratios. The accuracy of the clinical indicators pertaining to the diagnosis was measured in terms of specificity, sensitivity, predictive values, likelihood ratios and the diagnostic odds ratio. A logistic regression model was also developed. The results were organized in tables and discussed in the light of the literature. This study was approved by the Research Ethics Committee of the Federal University of Rio Grande do Norte, with Presentation Certificate for Ethics Appreciation nº 08696212.7.0000.5537. The results revealed that the diagnosis studied was present in 82% of patients. The characteristics with prevalence above 50% that stood out were: azotemia, decreased hematocrit, electrolyte imbalance, intake exceeding output, anxiety, edema, decreased hemoglobin, oliguria and blood pressure changes.
Eight defining characteristics presented a statistically significant association with the nursing diagnosis investigated: pulmonary congestion, intake exceeding output, electrolyte imbalance, jugular vein distension, edema, weight gain over a short period of time, agitation and adventitious breath sounds. Among these, the characteristics that showed the highest prevalence ratios were edema and weight gain over a short period of time. The characteristics with the highest sensitivity were edema, electrolyte imbalance and intake exceeding output, and those standing out with the greatest specificity were anasarca, weight gain over a short period of time, change in respiratory pattern, adventitious breath sounds, pulmonary congestion, agitation and jugular vein distension. The indicators jugular vein distension, electrolyte imbalance, intake exceeding output, increased central venous pressure and edema, together, were identified in the logistic regression model as the most significant predictors. It is concluded that the identification of accurate clinical indicators allows good prediction of the nursing diagnosis of excessive fluid volume in patients undergoing hemodialysis, assisting the nurse in the inference process and contributing to the success of patient care. In addition, nurses will base the diagnostic inference not only on their clinical experience but also on scientific evidence of the occurrence of excessive fluid volume, contributing to the control of volemia in these patients.
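The accuracy measures listed above can all be computed from a 2x2 indicator-versus-diagnosis table; the sketch below uses illustrative counts, not the study's data.

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Accuracy measures for one clinical indicator against the
    reference diagnosis, from a 2x2 contingency table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),         # positive predictive value
        "npv": tn / (tn + fn),         # negative predictive value
        "lr+": sens / (1 - spec),      # positive likelihood ratio
        "lr-": (1 - sens) / spec,      # negative likelihood ratio
        "dor": (tp * tn) / (fp * fn),  # diagnostic odds ratio
    }

# Hypothetical table for one indicator in 100 patients:
# 70 true positives, 5 false positives, 12 false negatives, 13 true negatives.
m = diagnostic_accuracy(tp=70, fp=5, fn=12, tn=13)
```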
Abstract:
Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that such problems cannot be solved in polynomial time. Initially, these solutions were focused on heuristics; currently, metaheuristics are used more for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of what is called an "Operon" heuristic for the construction of the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, mainly using statistical methodology (cluster analysis and principal component analysis); and the use of statistical analyses adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains to promote an "intelligent" search of the solution space. The Traveling Salesman Problem (TSP) is used as the application for a transgenetic algorithm known as ProtoG. A strategy is also proposed for the renewal of part of the chromosome population, triggered by adopting a minimum limit on the coefficient of variation of the fitness function of the individuals, calculated over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms and a Simulated Annealing algorithm. Three performance analyses of these algorithms are proposed. The first is accomplished through logistic regression, based on the probability of the algorithm under test finding an optimal solution for a TSP instance. The second is accomplished through survival analysis, based on the probability distribution of the execution time observed until an optimal solution is achieved.
The third is accomplished by means of a non-parametric analysis of variance, considering the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with sizes of up to 1,655 cities. The first two experiments deal with the adjustment of four parameters used in the ProtoG algorithm in an attempt to improve its performance. The last four were undertaken to evaluate the performance of ProtoG in comparison with the three algorithms adopted. For these sixty-one instances, it was concluded on the grounds of statistical tests that there is evidence that ProtoG performs better than the other three algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which the performance of the algorithms was evaluated through the PES, the average PES obtained with ProtoG was less than 1% in almost half of these instances, reaching its greatest average, 3.52%, for an instance of 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since it is not rare to find average PES values greater than 10% reported in the literature for instances of this size.
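Two of the quantities above are simple enough to state in code. The sketch below shows the PES and a coefficient-of-variation trigger for population renewal; the threshold value and the sample fitness values are illustrative, since the thesis's actual limit is not given in the abstract.

```python
import statistics

def percent_error_of_solution(found, best_known):
    """PES: percentage by which the tour length found exceeds the
    best solution available in the literature."""
    return 100.0 * (found - best_known) / best_known

def should_renew(fitnesses, cv_min=0.05):
    """Population-renewal trigger: renew part of the chromosome
    population when the coefficient of variation of the fitness
    values falls below a minimum limit (cv_min is illustrative)."""
    cv = statistics.stdev(fitnesses) / statistics.fmean(fitnesses)
    return cv < cv_min
```

A nearly uniform population (low coefficient of variation) signals loss of diversity, which is when partial renewal pays off.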
Abstract:
Vehicles are the main mobile sources of carbon monoxide (CO) and unburned hydrocarbons (HC) released into the atmosphere. In recent years, the growth of the vehicle fleet in the municipality of Natal-RN has been contributing to the increase in the emissions of these pollutants. The study consisted of a statistical analysis of the CO and HC emissions of a sample of 384 vehicles fueled by Gasoline/CNG or Alcohol/Gasoline/CNG in the municipality of Natal-RN. The tests were performed on vehicles undergoing vehicular safety inspection at the facilities of INSPETRANS, a vehicular inspection organization. A partial gas analyzer was used to measure, for each vehicle, the levels of CO and HC at two engine speeds (900 and 2500 rpm). The statistical analysis, performed with the STATISTICA software, revealed a marked reduction in the efficiency of the catalytic converters after 6 years of use, with average emissions of 0.78% CO and 156 ppm HC, which represents approximately four times the CO and double the HC of the newest vehicles. The result of a Student's t-test strongly suggests that the average HC emission (152 ppm) at 900 rpm is 40% higher than at 2500 rpm for the unloaded engine. This result reveals that the efficiency of the catalytic conversion is kinetically limited at low engine speeds. The study also concludes, when comparing the CO and HC emissions with respect to the influence of the fuels, that although CO emissions from CNG are 62% lower than those from gasoline, there are no significant differences between the HC emissions originating from CNG and from gasoline.
In summary, the results call into question the current vehicular inspection criteria for exhaust gases, pointing to the creation of more rigorous pollutant emission limits, because the efficiency of the catalytic converters is markedly reduced after 6 years of use. The possibility of modifying the test conditions adopted by the current norms, specifically the engine speed, is also raised, given that the highest emission indexes were registered at idle, in the no-load condition. This fact suggests that the high-engine-speed tests could be dismissed, reducing inspection time by half and saving fuel.
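The comparison of HC emissions at the two engine speeds rests on a two-sample Student's t statistic. The sketch below computes it for illustrative emission values, not the measured data from the 384-vehicle sample.

```python
import math
import statistics

def students_t_independent(a, b):
    """Pooled-variance Student's t statistic for two independent
    samples (equal variances assumed), as used to compare HC
    emissions at 900 rpm versus 2500 rpm."""
    na, nb = len(a), len(b)
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    sp2 = ((na - 1) * statistics.variance(a)
           + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical HC readings (ppm) at the two engine speeds.
hc_900 = [150.0, 155.0, 148.0, 152.0, 160.0]
hc_2500 = [108.0, 112.0, 105.0, 110.0, 115.0]
t_stat = students_t_independent(hc_900, hc_2500)
```

A t statistic above the 5% critical value for the relevant degrees of freedom leads to rejecting equality of the two mean emission levels.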
Abstract:
This dissertation aims to assess the representativeness of the manual chilled mirror analyzer (Chanscope II, model 13-1200-CN-2) used for the determination of condensed hydrocarbons in natural gas, compared with indirect methods based on equation-of-state thermodynamic models. Additionally, a model for calculating the dew point of natural gas has been implemented in this study. The proposed model is a modification of the Peng-Robinson equation of state that adopts group contribution as a strategy to calculate the temperature-dependent binary interaction parameters kij(T). Experimental data from the work of Brown et al. (2007) were used to compare the natural gas dew-point responses of the thermodynamic models contained in the UniSim process simulator and of the methodology implemented in this study. Then two natural gas compositions were studied, the first a gravimetrically synthesized standard gas mixture and the second a mixture of processed natural gas. These experimental data were also compared with the results given by the UniSim process simulator and by the implemented thermodynamic model. However, the results of the manual analysis indicated significant temperature differences; these differences were attributed to the formation of the water dew point, since the appearance of moisture was observed on the cooled mirror surface of the equipment.
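The standard Peng-Robinson building blocks underlying the modified model can be sketched as follows. Shown is only the classical pure-component a(T) and b calculation, with methane critical constants taken from common property tables; the thesis's group-contribution kij(T) mixing rule is not reproduced here.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def pr_parameters(T, Tc, Pc, omega):
    """Standard Peng-Robinson attraction parameter a(T) and
    co-volume b for a pure component."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega ** 2
    alpha = (1 + kappa * (1 - math.sqrt(T / Tc))) ** 2
    a = 0.45724 * R ** 2 * Tc ** 2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    return a, b

# Methane constants from common tables: Tc = 190.56 K, Pc = 4.599 MPa,
# acentric factor omega = 0.011. Evaluated at T = Tc, where alpha = 1.
a, b = pr_parameters(190.56, 190.56, 4.599e6, 0.011)
```

For a mixture, a and b would then be combined through mixing rules in which the binary interaction parameters kij(T) enter; that is the part the dissertation replaces with a group-contribution correlation.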
Abstract:
Starting from the idea that the result of the Humean analysis of causal inferences must be applied coherently to the remainder of his work, including his moral theory, the present master's thesis aims at investigating whether Hume's moral philosophy is essentially based on feeling, or whether feeling would not rather be essentially a consequence of our causal inferences about human actions and deliberations. The main idea consists in showing that our moral inferences, to the extent that they are, for Hume, empirical, depend on our belief in a connexion between something which has been previously observed and something which is not being observed (but which is expected to occur or to be observed in the future). Thus, this very belief must ground our moral inferences concerning the actions and deliberations of individuals, and must therefore eo ipso induce us to associate the actions and behaviors, as well as the character and moral claims of men, with certain moral feelings. Accordingly, the thesis unfolds in three chapters. In the first chapter, Hume's theory of perception is presented as an essential part of the explanation of the principles that bind ideas in our mind and constitute our inferences. In the second chapter, the Humean analysis of causal inferences is presented, and the way they contribute to the formation of our moral inferences is explained. In the third and last chapter, the formation of our moral inferences and the real contribution of the doctrine of liberty and necessity to the examination of our actions are analysed and discussed.
Abstract:
In the 20th century, acupuncture spread in the West as a complementary health care practice. This fact has motivated the international scientific community to invest in research that seeks to understand why acupuncture works. In this work we statistically compare the voltage fluctuations of bioelectric signals captured on the skin at an acupuncture point (IG 4) and at another nearby non-acupuncture point. The acquisition of these signals was performed using an electronic interface with a computer, based on an instrumentation amplifier designed with specifications adequate to this end. On the signals collected from a sample of 30 volunteers, we calculated the major statistics and submitted them to a paired t-test with significance level α = 0.05. For the bioelectric signals we estimated the following parameters: standard deviation, asymmetry and kurtosis. Moreover, we calculated the autocorrelation function and, fitting it with an exponential curve, observed that the signal decays more rapidly at a non-acupoint than at an acupoint. This fact is indicative of the existence of information in the acupoint.
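The autocorrelation comparison can be sketched on synthetic signals (not the measured bioelectric data): a signal with weaker temporal correlation crosses the 1/e level at a smaller lag, i.e. its autocorrelation decays faster.

```python
import math
import random

def autocorrelation(signal, lag):
    """Normalized sample autocorrelation at a given lag."""
    n = len(signal)
    m = sum(signal) / n
    var = sum((x - m) ** 2 for x in signal) / n
    cov = sum((signal[i] - m) * (signal[i + lag] - m)
              for i in range(n - lag)) / n
    return cov / var

def decay_time(signal):
    """Smallest lag at which the autocorrelation drops below 1/e;
    a smaller value means faster decay (less temporal structure)."""
    lag = 1
    while lag < len(signal) // 2 and autocorrelation(signal, lag) > 1 / math.e:
        lag += 1
    return lag

# Two synthetic AR(1) signals: strongly correlated ("acupoint-like")
# and weakly correlated ("non-acupoint-like"). Labels are illustrative.
random.seed(7)
slow, fast = [0.0], [0.0]
for _ in range(3000):
    slow.append(0.95 * slow[-1] + random.gauss(0, 1))
    fast.append(0.3 * fast[-1] + random.gauss(0, 1))
```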
Abstract:
The processing of seismic records is a very important task in geophysics and represents a permanent challenge in petroleum exploration. Although these signals provide an adequate image of the geological structure of the subsurface, they are contaminated by noise, of which ground roll is the main component. This fact demands a great effort in the development of filtering methodologies. In this context, this work presents a method for removing ground roll noise using tools from statistical physics. In the method, wavelet analysis is combined with the Karhunen-Loève transform to perform the removal in a well-localized region. The filtering process begins with multiscale decomposition. This technique provides a time-scale representation using discrete wavelets implemented as perfect-reconstruction filters. The original seismic pattern is thus represented as multiple patterns, one per scale. The ground roll can then be attenuated, like a surgical operation, at each scale, only in the region where its presence is strong, preserving as much relevant information as possible. The attenuation is performed by defining an attenuation factor Af, chosen from the behavior of the energy modes of the Karhunen-Loève transform. The point corresponding to a minimum of the first-mode energy is identified as an optimal attenuation factor.
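A minimal sketch of the multiscale filtering idea, using the Haar wavelet as the simplest perfect-reconstruction pair; the thesis does not specify which discrete wavelet was used, and the attenuation window and factor Af below are illustrative.

```python
import math

def haar_step(signal):
    """One level of the Haar discrete wavelet transform:
    returns (approximation, detail) coefficient lists."""
    s = math.sqrt(2)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Perfect reconstruction: inverts haar_step exactly."""
    s = math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / s, (a - d) / s]
    return out

def attenuate(detail, af, lo, hi):
    """Surgical attenuation: divide detail coefficients by Af only
    inside the window [lo, hi) where the ground roll is strong,
    leaving the rest of the scale untouched."""
    return [c / af if lo <= i < hi else c for i, c in enumerate(detail)]
```

In the full method this step is applied scale by scale, and Af is selected from the minimum of the first Karhunen-Loève energy mode rather than fixed by hand.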
Abstract:
Systems whose spectra are fractals or multifractals have received a lot of attention in recent years. The understanding of many physical properties of these systems is still far from complete because of their complexity. Thus, new applications and new methods for the study of their spectra have been proposed, shedding light on their properties and enabling a better understanding of these systems. We first present the basic theoretical framework needed for the calculation of the energy spectra of elementary excitations in some systems, especially quasiperiodic ones. Later we show, by using the Schrödinger equation in the tight-binding approximation, the results for the specific heat of electrons within Boltzmann-Gibbs statistical mechanics for one-dimensional quasiperiodic systems grown by following the Fibonacci and Double Period rules. Structures of this type have already been extensively explored; however, the non-extensive statistical mechanics proposed by Constantino Tsallis is well suited to systems that have a fractal profile, and therefore our main objective was to apply it to the calculation of thermodynamic quantities, extending somewhat further the understanding of the properties of these systems. Accordingly, we calculate, analytically and numerically, the generalized specific heat of electrons in one-dimensional quasiperiodic systems (quasicrystals) generated by the Fibonacci and Double Period sequences. The electronic spectra were obtained by solving the Schrödinger equation in the tight-binding approximation. Numerical results are presented for the two types of systems with different values of the nonextensivity parameter q.
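Two ingredients of this calculation can be sketched directly: the Fibonacci substitution that generates the quasiperiodic chain, and the Tsallis q-exponential that replaces the Boltzmann factor in the generalized statistics. This is an illustrative sketch, not the thesis code.

```python
import math

def fibonacci_chain(generations):
    """Quasiperiodic chain from the Fibonacci substitution
    A -> AB, B -> A; chain lengths follow the Fibonacci numbers."""
    word = "A"
    for _ in range(generations):
        word = "".join("AB" if c == "A" else "A" for c in word)
    return word

def q_exponential(x, q):
    """Tsallis q-exponential, the generalized Boltzmann factor:
    [1 + (1-q)x]^(1/(1-q)), with a cutoff at zero; recovers exp(x)
    in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0
```

In the generalized specific-heat calculation, weights q_exponential(-E/T, q) over the tight-binding energy spectrum of such a chain replace the usual exp(-E/T) factors.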
Abstract:
In this work we study a connection between a non-Gaussian statistics, the Kaniadakis statistics, and complex networks. We show that the degree distribution P(k) of a scale-free network can be calculated using a maximization of the information entropy in the context of non-Gaussian statistics. As an example, a numerical analysis based on the preferential attachment growth model is discussed, and the numerical behavior of the Kaniadakis and Tsallis degree distributions is compared. We also analyze the diffusive epidemic process (DEP) on a one-dimensional regular lattice. The model is composed of A (healthy) and B (sick) species that independently diffuse on the lattice with diffusion rates DA and DB, subject to the probabilistic dynamical rules A + B → 2B and B → A. This model belongs to the category of non-equilibrium systems with an absorbing state and a phase transition between active and inactive states. We investigate the critical behavior of the DEP using an auto-adaptive algorithm to find critical points: the method of automatic searching for critical points (MASCP). We compare our results with the literature and find that the MASCP successfully finds the critical exponents 1/ν and 1/zν in all the cases DA = DB, DA
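The Kaniadakis κ-exponential underlying the degree-distribution analysis can be sketched as follows. It reduces to the ordinary exponential as κ → 0 and develops the heavy, power-law-like tail appropriate for scale-free networks; the sketch is illustrative, not the thesis code.

```python
import math

def kappa_exponential(x, kappa):
    """Kaniadakis kappa-exponential:
    exp_k(x) = (sqrt(1 + k^2 x^2) + k x)^(1/k),
    recovering exp(x) in the limit kappa -> 0."""
    if abs(kappa) < 1e-12:
        return math.exp(x)
    return (math.sqrt(1 + kappa ** 2 * x ** 2) + kappa * x) ** (1 / kappa)
```

An entropy-maximizing degree distribution of the form P(k) proportional to kappa_exponential(-k / k0, kappa), for an illustrative scale k0, decays far more slowly than the exponential form, which is what allows it to match the fat tails of preferential-attachment networks.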
Abstract:
This dissertation briefly presents random graphs and the main quantities calculated from them. At the same time, basic thermodynamic quantities, such as energy and temperature, are associated with some of their characteristics. Approaches commonly used in statistical mechanics are employed, and rules that describe a time evolution for the graphs are proposed in order to study their ergodicity and a possible thermal equilibrium between them.
Abstract:
This work briefly discusses methods to estimate the parameters of the Generalized Pareto distribution (GPD). The following techniques are addressed: the method of moments (MOM), Maximum Likelihood (MLE), Biased Probability Weighted Moments (PWMB), Unbiased Probability Weighted Moments (PWMU), Minimum Density Power Divergence (MDPD), Median (MED), Pickands (PICKANDS), Maximum Penalized Likelihood (MPLE), Maximum Goodness-of-fit (MGF) and Maximum Entropy (POME), the last being the focus of this manuscript. By way of illustration, the Generalized Pareto distribution was fitted to a sequence of intraplate earthquakes that occurred in the city of João Câmara, in the northeastern region of Brazil, which was monitored continuously for two years (1987 and 1988). It was found that the MLE and POME were the most efficient methods, yielding essentially the same mean squared errors. Based on a threshold of magnitude 1.5, the seismic risk for the city was estimated, along with the return levels for earthquakes of magnitude 1.5, 2.0, 2.5 and 3.0 and for the most intense earthquake ever registered in the city, which occurred in November 1986 with a magnitude of about 5.2.
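The return-level estimate has a standard closed form once a GPD has been fitted to the threshold exceedances. The sketch below uses the textbook formula with illustrative parameter values, not the values estimated in the work.

```python
import math

def gpd_return_level(threshold, scale, shape, zeta_u, m):
    """m-observation return level for a GPD model of exceedances:
    x_m = u + (sigma/xi) * ((m * zeta_u)^xi - 1),
    where u is the threshold, sigma the scale, xi the shape and
    zeta_u the probability of an observation exceeding u.
    The xi -> 0 limit gives the exponential-tail form."""
    if abs(shape) < 1e-12:
        return threshold + scale * math.log(m * zeta_u)
    return threshold + scale / shape * ((m * zeta_u) ** shape - 1)

# Illustrative fit: threshold u = 1.5, scale = 0.5, shape = 0.1,
# exceedance probability zeta_u = 0.1 (not the thesis's estimates).
level_10000 = gpd_return_level(1.5, 0.5, 0.1, 0.1, 10000)
```

The level grows with m, so rarer events (larger return periods) map to larger magnitudes, which is how the seismic risk and return levels for the city are read off the fitted model.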