7 results for computational statistics

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

60.00%

Abstract:

This work addresses the production of lightweight concrete building elements, such as plates, prefabricated slabs, and fencing panels, by presenting a distinctive material, named Polymeric Lightweight Concrete, which combines low density with good strength through the joint use of industrial waste from thermosetting unsaturated polyesters and a biodegradable foaming agent. The study covered various features of the materials used in the composition of the Polymeric Lightweight Concrete, employing a 2³ factorial design to study strength, production and dosage processes, to characterize mechanical properties, and to analyze the microstructure of the transition zone between the lightweight artificial aggregate and the cement matrix. The results of the mechanical strength tests were analyzed with a computational statistics tool (the Statistica software) to understand the material behavior and to determine the ideal quantity of each constituent in the Polymeric Lightweight Concrete formulation. The ideal formulation was defined so as to obtain a material with the lowest possible dry density while meeting the compressive strength required by NBR 12.646/92 (≥ 2.5 MPa after 28 days). Microstructural characterization by scanning electron microscopy revealed an influence of the materials on the cement hydration process, showing good interaction between the wrinkled surface of the thermosetting unsaturated polyester residue and the cement paste and, consequently, on the final strength. The attainment of an ideal formulation that meets the Brazilian standards, together with the experimental characterization results and their comparison with conventional materials, confirmed that the developed Polymeric Lightweight Concrete is suitable for the production of building elements that are advantageous for construction.
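
The statistical step above, fitting a 2³ factorial design to compressive-strength results, can be illustrated outside Statistica. Below is a minimal Python sketch that estimates main effects and two-factor interactions for a hypothetical 2³ design; the coded factors and response values are invented for illustration and are not the thesis data.

```python
# Hedged sketch of a 2^3 factorial-design analysis, analogous in spirit to
# what the abstract does in Statistica. The factors and compressive-strength
# responses below are hypothetical, not the thesis measurements.
import itertools
import numpy as np

# Coded levels (-1 = low, +1 = high) for three factors: 8 runs in total.
runs = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
A, B, C = runs.T

# Hypothetical 28-day compressive strengths (MPa) for the 8 runs.
y = np.array([2.1, 2.8, 2.4, 3.3, 1.9, 2.6, 2.2, 3.1])

# Design matrix: intercept, main effects, and two-factor interactions.
X = np.column_stack([np.ones(8), A, B, C, A * B, A * C, B * C])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

for name, c in zip(["mean", "A", "B", "C", "AB", "AC", "BC"], coef):
    # In the coded (-1, +1) parameterization, effect = 2 * coefficient.
    print(f"{name}: coefficient = {c:+.3f}, effect = {2 * c:+.3f}")
```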

Relevance:

20.00%

Abstract:

The power-law size distributions obtained experimentally for neuronal avalanches are important evidence of criticality in the brain. This evidence is supported by the fact that a critical branching process exhibits the same exponent, τ = 3/2. Models at criticality have been employed to mimic avalanche propagation and explain the statistics observed experimentally. However, a crucial aspect of neuronal recordings has been almost completely neglected in the models: undersampling. While in a typical multielectrode array hundreds of neurons are recorded, tens of thousands of neurons can be found in the same area of neuronal tissue. Here we investigate the consequences of undersampling in models with three different topologies (two-dimensional, small-world and random network) and three different dynamical regimes (subcritical, critical and supercritical). We found that undersampling modifies avalanche size distributions, extinguishing the power laws observed in critical systems. Distributions from subcritical systems are also modified, but the shape of the undersampled distributions is more similar to that of a fully sampled system. Undersampled supercritical systems can recover the general characteristics of the fully sampled version, provided that enough neurons are measured. Undersampling in two-dimensional and small-world networks leads to similar effects, while the random network is insensitive to sampling density due to the lack of a well-defined neighborhood. We conjecture that neuronal avalanches recorded from local field potentials avoid undersampling effects due to the nature of this signal, but the same does not hold for spike avalanches. We conclude that undersampled branching-process-like models in these topologies fail to reproduce the statistics of spike avalanches.
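
A toy version of the undersampling effect described above can be sketched in a few lines: simulate a branching process and count, for each avalanche, both all spikes and only those landing on a small recorded subset of units. The network size, branching ratio, and electrode count below are illustrative assumptions, not the parameters of the paper's models.

```python
# Minimal sketch (not the paper's model): avalanches in a branching process
# on N units, with "undersampling" = counting only spikes that land on a
# randomly recorded subset of units. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, sigma, n_avalanches = 10_000, 1.0, 5_000    # sigma = branching ratio
recorded_mask = np.zeros(N, dtype=bool)
recorded_mask[rng.choice(N, size=100, replace=False)] = True  # 100 "electrodes"

full_sizes, sub_sizes = [], []
for _ in range(n_avalanches):
    active = rng.integers(N, size=1)        # one seed spike
    size, sub = 0, 0
    while active.size and size < 10_000:    # cap runaway avalanches
        size += active.size
        sub += int(recorded_mask[active].sum())
        # Each active unit triggers Poisson(sigma) descendants.
        n_children = rng.poisson(sigma, size=active.size).sum()
        active = rng.integers(N, size=n_children)
    full_sizes.append(size)
    sub_sizes.append(sub)

# At criticality the full sizes follow P(s) ~ s^(-3/2); the subsampled size
# distribution is distorted, which is the effect studied in the abstract.
print("mean full size:", np.mean(full_sizes), "mean sampled size:", np.mean(sub_sizes))
```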

Relevance:

20.00%

Abstract:

In this dissertation, new propagation path-loss prediction models are proposed, based on recent optimization techniques and on power-level measurements taken in the urban and suburban areas of Natal, a city in the Brazilian Northeast. The proposed models are: (i) a statistical model that adds second-order statistics of the received power and of the terrain altimetry to a linear-loss model; (ii) an artificial neural network model trained with the backpropagation algorithm to obtain the propagation-loss equation; and (iii) a model based on the random-walker technique, which accounts for the randomness of absorption and the disorder of the environment, and whose unknown parameters in the propagation-loss equation are determined by a neural network. The terrain of the urban and suburban areas of Natal was digitized through purpose-built computational programs, using maps available from the Brazilian Institute of Geography and Statistics (IBGE). The proposed propagation models were validated through comparisons with measurements and with classic propagation models, and good numerical agreement was observed. These new models can be applied to any urban or suburban scenario with architectural characteristics similar to those of the city of Natal.
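
As a minimal illustration of the empirical fitting these models elaborate on, the sketch below fits the classic log-distance path-loss model, PL(d) = PL(d0) + 10 n log10(d/d0), to hypothetical power measurements; the distances and loss values are invented, and the thesis models (second-order statistics, neural network, random walker) are considerably richer.

```python
# Hedged sketch of a log-distance path-loss fit, the baseline the proposed
# models improve on. Distances and measured losses are hypothetical.
import numpy as np

d0 = 100.0                                                     # reference distance (m)
d = np.array([100, 200, 400, 800, 1600, 3200], dtype=float)    # distances (m)
pl = np.array([70.0, 81.2, 92.5, 103.9, 115.1, 126.4])         # measured loss (dB)

# Least-squares fit of PL(d0) and the path-loss exponent n.
X = np.column_stack([np.ones_like(d), 10 * np.log10(d / d0)])
(pl0, n), *_ = np.linalg.lstsq(X, pl, rcond=None)
print(f"PL(d0) = {pl0:.1f} dB, path-loss exponent n = {n:.2f}")

# Predicted loss at a new distance, using the fitted parameters.
d_new = 1000.0
print(f"PL({d_new:.0f} m) = {pl0 + 10 * n * np.log10(d_new / d0):.1f} dB")
```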

Relevance:

20.00%

Abstract:

Following the new tendency toward interdisciplinarity in modern science, a new field called neuroengineering has come to light in the last decades. Since 2000, scientific journals and conferences all around the world have been created on this theme. The present work comprises three subareas related to neuroengineering and electrical engineering, neural stimulation, theoretical and computational neuroscience, and neuronal signal processing, as well as biomedical engineering. The research can be divided into three parts: (i) A new method of neuronal photostimulation was developed based on the use of caged compounds. Using the inhibitory neurotransmitter GABA caged by a ruthenium complex, it was possible to block neuronal population activity with a laser pulse. The results were evaluated by wavelet analysis and tested with non-parametric statistics. (ii) A mathematical method was created to identify neuronal assemblies. Neuronal assemblies, proposed by Donald Hebb as the basis of learning, remain the most accepted theory for the neuronal representation of external stimuli. Using the Marcenko-Pastur law of eigenvalue distribution, it was possible to detect neuronal assemblies and to compute their activity with high temporal resolution. The application of the method to real electrophysiological data revealed that neurons from the neocortex and hippocampus can be part of the same assembly, and that neurons can participate in multiple assemblies. (iii) A new method of automatic classification of heart beats was developed, which does not rely on a database for training and is not specialized in specific pathologies. The method is based on wavelet decomposition and normality measures of random variables. Altogether, the results presented in these three fields of knowledge represent qualifications in neural and biomedical engineering.
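
The assembly-detection method in (ii) can be sketched as follows: bin the spike trains, z-score each neuron, and count how many eigenvalues of the neuron-by-neuron correlation matrix exceed the Marcenko-Pastur upper bound (1 + sqrt(N/B))² expected for uncorrelated noise. The synthetic data and the planted assembly below are illustrative, not the electrophysiological recordings analyzed in the work.

```python
# Minimal sketch of assembly detection via the Marcenko-Pastur law, in the
# spirit of the method described in the abstract. Spike data are synthetic;
# real use would take binned spike counts from recordings.
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_bins = 50, 5000
counts = rng.poisson(1.0, size=(n_neurons, n_bins)).astype(float)

# Plant one "assembly": neurons 0-9 co-activate in ~5% of the bins.
coactive = rng.random(n_bins) < 0.05
counts[:10, coactive] += rng.poisson(3.0, size=(10, int(coactive.sum())))

# Z-score each neuron, then eigen-decompose the correlation matrix.
z = (counts - counts.mean(1, keepdims=True)) / counts.std(1, keepdims=True)
corr = z @ z.T / n_bins
eigvals = np.linalg.eigvalsh(corr)

# Marcenko-Pastur upper bound for pure noise: (1 + sqrt(N/B))^2.
lam_max = (1 + np.sqrt(n_neurons / n_bins)) ** 2
print("eigenvalues above MP bound:", int((eigvals > lam_max).sum()))  # expect 1
```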

Relevance:

20.00%

Abstract:

Modern wireless systems employ adaptive techniques to provide high throughput while observing desired coverage, Quality of Service (QoS), and capacity requirements. An alternative to further enhance the data rate is to apply cognitive radio concepts, where a system is able to exploit unused spectrum on existing licensed bands by sensing the spectrum and opportunistically accessing unused portions. Techniques like Automatic Modulation Classification (AMC) could help or even be vital in such scenarios. Usually, AMC implementations rely on some form of signal pre-processing, which may introduce a high computational cost or make assumptions about the received signal that may not hold (e.g., Gaussianity of the noise). This work proposes a new AMC method that uses a similarity measure from the Information Theoretic Learning (ITL) framework, known as the correntropy coefficient. It is capable of extracting similarity measurements over a pair of random processes using higher-order statistics, yielding better similarity estimates than, for example, the correlation coefficient. Experiments carried out by means of computer simulation show that the proposed technique achieves a high success rate in the classification of digital modulations, even in the presence of additive white Gaussian noise (AWGN).
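
A minimal sketch of the correntropy coefficient follows, using the standard centered-correntropy estimator with a Gaussian kernel. The test signals, kernel bandwidth, and noise level are illustrative choices rather than the paper's experimental setup.

```python
# Hedged sketch of the correntropy coefficient from the ITL framework.
# The estimator follows the standard centered-correntropy definition;
# the signals and bandwidth below are illustrative, not the thesis setup.
import numpy as np

def gaussian_kernel(u, sigma):
    return np.exp(-u**2 / (2 * sigma**2))

def correntropy_coefficient(x, y, sigma=1.0):
    """Centered correntropy of (x, y), normalized to [-1, 1]."""
    def centered(a, b):
        v = gaussian_kernel(a - b, sigma).mean()                  # V(a, b)
        cross = gaussian_kernel(a[:, None] - b[None, :], sigma).mean()
        return v - cross
    return centered(x, y) / np.sqrt(centered(x, x) * centered(y, y))

rng = np.random.default_rng(2)
t = np.arange(1000) / 1000.0
s = np.cos(2 * np.pi * 5 * t)                   # reference template
noisy = s + 0.5 * rng.standard_normal(t.size)   # received signal + AWGN
print("eta(s, noisy) =", correntropy_coefficient(s, noisy))
print("eta(s, noise) =", correntropy_coefficient(s, rng.standard_normal(t.size)))
```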

Relevance:

20.00%

Abstract:

Recent astronomical observations indicate that the universe has null spatial curvature, is accelerating, and that its matter-energy content is composed of circa 30% matter (baryons + dark matter) and 70% dark energy, a relativistic component with negative pressure. However, in order to build more realistic models, it is necessary to consider the evolution of small density perturbations to explain the richness of observed structures on the scale of galaxies and clusters of galaxies. The structure formation process was first described by Press and Schechter (PS) in 1974, by means of the galaxy cluster mass function. The PS formalism establishes a Gaussian distribution for the primordial density perturbation field. Besides a serious normalization problem, such an approach does not explain the recent cluster X-ray data, and it is also in disagreement with the most up-to-date computational simulations. In this thesis, we discuss several applications of the nonextensive (non-Gaussian) q-statistics, proposed in 1988 by C. Tsallis, with special emphasis on the cosmological process of large-scale structure formation. Initially, we investigate the statistics of the primordial fluctuation field of the density contrast, since the most recent data from the Wilkinson Microwave Anisotropy Probe (WMAP) indicate a deviation from Gaussianity. We assume that such deviations may be described by the nonextensive statistics, because it reduces to the Gaussian distribution in the limit of the free parameter q → 1, thereby allowing a direct comparison with the standard theory. We study its application to a galaxy cluster catalog based on the ROSAT All-Sky Survey (hereafter HIFLUGCS). We conclude that the standard Gaussian model applied to HIFLUGCS does not agree with the most recent data independently obtained by WMAP. Using the nonextensive statistics, we obtain values much more aligned with the WMAP results. We also demonstrate that the Burr distribution corrects the normalization problem. The cluster mass function formalism was also investigated in the presence of dark energy; in this case, constraints on several cosmic parameters were also obtained. The nonextensive statistics was further applied to two distinct problems: (i) the plasma probe and (ii) the description of Bremsstrahlung radiation (the primary radiation from X-ray clusters), a problem of considerable interest in astrophysics. In another line of development, by using supernova data and the gas mass fraction from galaxy clusters, we discuss a redshift variation of the equation-of-state parameter, considering two distinct expansions. An interesting aspect of this work is that the results do not need a prior on the mass parameter, as usually occurs in analyses involving only supernova data. Finally, we obtain a new estimate of the Hubble parameter through a joint analysis involving the Sunyaev-Zeldovich effect (SZE), the X-ray data from galaxy clusters, and the baryon acoustic oscillations. We show that the degeneracy of the observational data with respect to the mass parameter is broken when the signature of the baryon acoustic oscillations, as given by the Sloan Digital Sky Survey (SDSS) catalog, is considered. Our analysis, based on the SZE/X-ray data for a sample of 25 galaxy clusters with triaxial morphology, yields a Hubble parameter in good agreement with independent studies provided by the Hubble Space Telescope project and the recent WMAP estimates.
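
For reference, the q-Gaussian of Tsallis statistics mentioned above can be written as below; it recovers the ordinary Gaussian in the limit q → 1, since [1 + x/n]^n → e^x. The normalization A_q and sign conventions vary across the literature and may differ from those adopted in the thesis.

```latex
% Tsallis q-Gaussian for the primordial density contrast \delta;
% A_q is a q-dependent normalization (conventions vary in the literature).
P_q(\delta) \;=\; A_q \left[\, 1 - (1-q)\,\frac{\delta^{2}}{2\sigma^{2}} \,\right]^{\frac{1}{1-q}}
\quad\xrightarrow[\;q \to 1\;]{}\quad
A_1 \,\exp\!\left(-\frac{\delta^{2}}{2\sigma^{2}}\right)
```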