1000 results for CNPQ::CIENCIAS EXATAS E DA TERRA::GEOCIENCIAS::GEOLOGIA
Abstract:
The research behind this master's dissertation started with the installation of a DC sputtering system, from its first stage, the adaptation of a cooling system, through the introduction of a heating system for the chamber using a thermal belt, up to the deposition of a series of single-crystal nanometric Fe/MgO(100) film samples. The deposition rates of some materials, such as Fe, Py, and Cu, were investigated with an Atomic Force Microscope (AFM). Among the single-crystal samples, five have the same growth parameters and a thickness of 250 Å, except for the temperature, which varies in steps of 50 °C from one sample to the next, from 100 °C to 300 °C. Three other samples also share the same deposition parameters and a temperature of 300 °C, but with thicknesses of 62.5 Å, 150 Å, and 250 Å. Magneto-optical Kerr effect (MOKE) magnetization-curve measurements and ferromagnetic resonance (FMR) measurements were made in order to study the influence of temperature and thickness on the samples' magnetic properties. In the present dissertation we discuss these techniques, and the experimental results are interpreted using phenomenological models, by simulation, and discussed from a physical point of view, taking into account the system's free magnetic energy terms. The results show that the cubic anisotropy field (Hac) grows as the sample's deposition temperature increases, presenting an asymptotic behavior similar to the characteristic charging curve of a capacitor in an RC circuit. A similar behavior of Hac was also observed with increasing sample thickness. The 250 Å sample, grown at 300 °C, presented an Hac field close to the Fe bulk value.
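The RC-like saturation described above can be sketched numerically. In this minimal Python illustration the saturation field, the temperature constant, and the data points are all hypothetical (they are not the dissertation's measurements); the point is only that a saturating exponential Hac(T) = H∞(1 − e^(−T/τ)) can be linearized to recover τ from a temperature series:

```python
import math

def hac_model(T, h_inf, tau):
    """Saturating-exponential (RC-charging-like) model for the cubic
    anisotropy field as a function of deposition temperature."""
    return h_inf * (1.0 - math.exp(-T / tau))

# Hypothetical data following the model exactly (illustrative values only).
h_inf, tau = 550.0, 120.0          # Oe, degrees C (assumed)
temps = [100, 150, 200, 250, 300]  # deposition temperatures, degrees C
data = [hac_model(T, h_inf, tau) for T in temps]

# Linearize: ln(1 - Hac/H_inf) = -T/tau, then fit a line through the
# origin by least squares to recover tau.
ys = [math.log(1.0 - h / h_inf) for h in data]
tau_est = -sum(t * t for t in temps) / sum(t * y for t, y in zip(temps, ys))
```

With noisy measurements the same linearization still applies, at the cost of weighting issues near saturation.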
Abstract:
In this work we present a theoretical study of the properties of magnetic polaritons in superlattices arranged in periodic and quasiperiodic fashions. For the periodic superlattice, an effective-medium approach was used to describe the behavior of the bulk and surface modes, which enormously simplifies the algebra involved. The quasiperiodic superlattice was described by a suitable theoretical model based on a transfer-matrix treatment to derive the polaritons' dispersion relation, using Maxwell's equations (including retardation effects). Here we find a fractal spectrum characterized by a power law for the distribution of the energy bandwidths. The localization and scaling behavior of the quasiperiodic structure were studied for a geometry in which the wave vector and the external applied magnetic field lie in the same plane (Voigt geometry). Numerical results are presented for the ferromagnet Fe and for the metamagnets FeBr2 and FeCl2.
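The transfer-matrix treatment can be illustrated with its simplest scalar-wave analogue (this is not the magnetic polariton dispersion itself; the layer model, refractive indices, and thicknesses below are illustrative assumptions). The unit-cell matrix M propagates the field and its derivative across one period, and the Bloch condition admits a bulk mode whenever |Tr(M)/2| ≤ 1:

```python
import math

def layer_matrix(k, d):
    """Transfer matrix carrying (field, derivative) across one layer of
    thickness d with wave number k."""
    c, s = math.cos(k * d), math.sin(k * d)
    return [[c, s / k], [-k * s, c]]

def matmul(a, b):
    return [[sum(a[i][l] * b[l][j] for l in range(2)) for j in range(2)]
            for i in range(2)]

def in_band(omega, nA=1.0, nB=2.0, dA=1.0, dB=0.5, c=1.0):
    """Bloch condition for a binary A/B superlattice: a propagating bulk
    mode exists when |Tr(M)/2| <= 1 for the unit-cell matrix M."""
    m = matmul(layer_matrix(omega * nB / c, dB),
               layer_matrix(omega * nA / c, dA))
    return abs((m[0][0] + m[1][1]) / 2.0) <= 1.0
```

For a quasiperiodic (e.g. Fibonacci) stacking, the same matrices are multiplied in the substitution order instead of periodically, which is what produces the fractal band structure mentioned above.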
Abstract:
Understanding the way in which large-scale structures, like galaxies, form remains one of the most challenging problems in cosmology today. The standard theory for the origin of these structures is that they grew by gravitational instability from small, perhaps quantum-generated, fluctuations in the density of dark matter, baryons, and photons in a uniform primordial Universe. After recombination, the baryons began to fall into the pre-existing gravitational potential wells of the dark matter. In this dissertation we first study the primordial recombination era, the epoch of the formation of the neutral hydrogen atoms. We then analyze the evolution of the density contrast (of baryonic and dark matter) in clouds of dark matter with masses between 10^4 and 10^10 solar masses. In particular, we take into account the several physical mechanisms that act on the baryonic component during and after the recombination era. The analysis of the formation of these primordial objects was carried out in the context of three dark energy background models: Quintessence, ΛCDM (Cosmological Constant plus Cold Dark Matter), and Phantom. We show that dark matter is the fundamental agent for the formation of the structures observed today, and that dark energy has great importance at the epoch of their formation.
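The growth of the density contrast by gravitational instability can be illustrated with the standard textbook linear equation in an Einstein-de Sitter background (a deliberate simplification: the dissertation's multi-component treatment with dark energy is richer). In that background the equation d²δ/dt² + (4/3t) dδ/dt − (2/3t²) δ = 0 has growing mode δ ∝ t^(2/3) ∝ a:

```python
def growing_mode_ratio(t0=1.0, t1=100.0, dt=1e-3):
    """Integrate the linear density-contrast equation in an
    Einstein-de Sitter universe with forward Euler, starting on the
    growing mode (delta = 1, d(delta)/dt = 2/(3 t0) at t = t0).
    Returns delta(t1), which should approach (t1/t0)**(2/3)."""
    d, v, t = 1.0, 2.0 / (3.0 * t0), t0
    while t < t1:
        acc = -(4.0 / (3.0 * t)) * v + (2.0 / (3.0 * t * t)) * d
        d += dt * v
        v += dt * acc
        t += dt
    return d
```

The decaying mode δ ∝ 1/t is quickly suppressed, which is why the growing-mode initial condition dominates the late-time result.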
Abstract:
In this work we study and compare two percolation algorithms, one elaborated by Elias and the other by Newman and Ziff, using theoretical tools of algorithm complexity analysis and an additional algorithm that performs an experimental comparison. The work is divided into three chapters. The first covers the definitions and theorems needed for a more formal mathematical study of percolation. The second presents the techniques used to estimate the algorithms' complexity, namely worst-case, best-case, and average-case analysis. We use worst-case analysis to estimate the complexity of both algorithms so that we can compare them. The last chapter presents several characteristics of each algorithm and, through the theoretical complexity estimates and a comparison of the execution times of the most important part of each one, compares these two important percolation-simulation algorithms.
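The core of the Newman-Ziff algorithm can be sketched in a few lines (a minimal illustration, not the thesis's implementation; lattice size and seed are arbitrary): sites are occupied one at a time in random order and clusters are merged with union-find, so the whole range of occupation numbers is swept in a single near-linear pass instead of one simulation per value of p.

```python
import random

def newman_ziff_largest(L=8, seed=1):
    """Site percolation on an L x L periodic square lattice; returns the
    largest-cluster size after each site addition."""
    n = L * L
    parent = list(range(n))
    size = [1] * n

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    occupied = [False] * n
    order = list(range(n))
    random.Random(seed).shuffle(order)
    largest, largest_by_step = 1, []
    for s in order:
        occupied[s] = True
        x, y = s % L, s // L
        neighbors = ((x + 1) % L + y * L, (x - 1) % L + y * L,
                     x + ((y + 1) % L) * L, x + ((y - 1) % L) * L)
        for nb in neighbors:
            if occupied[nb]:
                r1, r2 = find(s), find(nb)
                if r1 != r2:
                    if size[r1] < size[r2]:   # union by size
                        r1, r2 = r2, r1
                    parent[r2] = r1
                    size[r1] += size[r2]
                    largest = max(largest, size[r1])
        largest_by_step.append(largest)
    return largest_by_step
```

Union by size plus path halving is what gives the near-O(n) worst-case bound discussed in the complexity comparison.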
Abstract:
In this work we study the survival cure rate model proposed by Yakovlev et al. (1993), based on a structure of competing risks concurring to cause the event of interest, and the approach proposed by Chen et al. (1999), in which covariates are introduced to model the risk amount. We focus on covariates measured with error, considering the corrected score method in order to obtain consistent estimators. A simulation study is carried out to evaluate the behavior of the estimators obtained by this method for finite samples. The simulation aims to identify the impact not only on the regression coefficients of the covariates measured with error (Mizoi et al., 2007) but also on the coefficients of covariates measured without error. We also verify the adequacy of the piecewise exponential distribution for the cure rate model with measurement error. Finally, applications of the model to real data are presented.
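A minimal sketch of the promotion-time cure model underlying the abstract: the population survival is S_pop(t) = exp(−θF(t)), so a fraction exp(−θ) of subjects is cured, and tying θ to a covariate through a log link is the Chen et al.-style choice. The parameter values and the Exponential(1) latent distribution below are illustrative assumptions, not the paper's fitted model.

```python
import math

def promotion_time_survival(t, theta, cdf):
    """Population survival in a promotion-time cure model:
    S_pop(t) = exp(-theta * F(t)); the cured fraction is exp(-theta)."""
    return math.exp(-theta * cdf(t))

def theta_of_x(x, b0=-0.3, b1=0.8):
    """Risk amount linked to a covariate x via a log link (illustrative)."""
    return math.exp(b0 + b1 * x)

def latent_cdf(t):
    """Exponential(1) c.d.f. for the latent activation times (illustrative)."""
    return 1.0 - math.exp(-t)
```

Note that S_pop(t) never drops below exp(−θ): this plateau is exactly what distinguishes a cure rate model from a proper survival distribution.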
Abstract:
In this work we present an exposition of the mathematical theory of compactly supported orthogonal wavelets in the context of multiresolution analysis. These wavelets are particularly attractive because they lead to a stable and very efficient algorithm, the Fast Wavelet Transform (FWT). One of our objectives is to develop efficient algorithms for computing the wavelet coefficients (FWT) through Mallat's pyramid algorithm and to discuss its connection with filter banks. We also study the concept of multiresolution analysis, the framework in which wavelets can be understood and constructed naturally, taking an important step in the passage from the mathematical universe (continuous domain) to the universe of representation (discrete domain).
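One level of Mallat's pyramid algorithm can be sketched with the simplest orthogonal filter bank, the Haar pair (a minimal illustration; the dissertation's compactly supported wavelets generalize this to longer Daubechies-type filters):

```python
import math

def haar_step(signal):
    """One pyramid level: filter and downsample into approximation
    (low-pass) and detail (high-pass) coefficients."""
    h = 1.0 / math.sqrt(2.0)
    approx = [h * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [h * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Synthesis bank: perfect reconstruction of one pyramid level."""
    h = 1.0 / math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([h * (a + d), h * (a - d)])
    return out
```

Iterating `haar_step` on the approximation coefficients yields the full FWT in O(n) operations, which is the stability and efficiency claim made above.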
Abstract:
This work presents a brief discussion of methods to estimate the parameters of the Generalized Pareto distribution (GPD). The following techniques are addressed: Moments (MOM), Maximum Likelihood (MLE), Biased Probability Weighted Moments (PWMB), Unbiased Probability Weighted Moments (PWMU), Minimum Density Power Divergence (MDPD), Median (MED), Pickands (PICKANDS), Maximum Penalized Likelihood (MPLE), Maximum Goodness-of-Fit (MGF), and the Principle of Maximum Entropy (POME), the focus of this manuscript. By way of illustration, the Generalized Pareto distribution was fitted to a sequence of intraplate earthquakes that occurred in the city of João Câmara, in the northeastern region of Brazil, which was monitored continuously for two years (1987 and 1988). It was found that MLE and POME were the most efficient methods, giving essentially the same mean squared errors. Based on a threshold of magnitude 1.5, the seismic risk for the city was estimated, together with the return levels for earthquakes of magnitude 1.5, 2.0, 2.5, 3.0 and for the most intense earthquake ever registered in the city, which occurred in November 1986 with a magnitude of about 5.2.
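Once GPD parameters are in hand, return levels follow from the standard peaks-over-threshold formula (the threshold, scale, shape, and exceedance rate below are illustrative placeholders, not the João Câmara estimates):

```python
import math

def gpd_return_level(u, sigma, xi, lam, m):
    """Level exceeded on average once every m periods, for exceedances of
    threshold u occurring at rate lam per period and modeled by a GPD with
    scale sigma and shape xi:
        x_m = u + (sigma/xi) * ((m*lam)**xi - 1),
    with the exponential-tail limit as xi -> 0."""
    if abs(xi) < 1e-12:
        return u + sigma * math.log(m * lam)
    return u + (sigma / xi) * ((m * lam) ** xi - 1.0)
```

The xi → 0 branch matters numerically: evaluating the general formula at tiny xi loses precision, so the limit is coded explicitly.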
Abstract:
In this work we study the survival cure rate model proposed by Yakovlev (1993), considered in a competing risks setting. Covariates are introduced for modeling the cure rate, and we allow some covariates to have missing values. We consider only the cases in which the missing covariates are categorical, and we implement the EM algorithm via the method of weights for maximum likelihood estimation. We present a Monte Carlo simulation experiment to compare the properties of the estimators based on this method with those of the estimators under the complete-case scenario. We also evaluate, in this experiment, the impact on the parameter estimates when we increase the proportion of immune and censored individuals among the non-immune ones. We illustrate the proposed methodology with a real data set involving the time until graduation for the undergraduate Statistics course of the Universidade Federal do Rio Grande do Norte.
Abstract:
This work presents improvement strategies for a successful evolutionary metaheuristic for the Asymmetric Traveling Salesman Problem, namely a Memetic Algorithm designed mainly for this problem. Basically, the improvement applies optimization techniques known as Path Relinking and Vocabulary Building. The latter was used in two different ways, in order to evaluate the effects of the improvement on the evolutionary metaheuristic. These methods were implemented in C++, and the experiments were run on instances from the TSPLIB library, showing that the proposed procedures were successful in the tests performed.
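The Path Relinking idea can be sketched on permutation tours (a minimal illustration under assumed conventions, not the dissertation's C++ implementation; the 4-city asymmetric distance matrix in the test is invented): walk from an initial solution toward a guiding solution one position at a time, keeping the best intermediate tour visited.

```python
def tour_cost(tour, dist):
    """Cyclic cost of a tour under an (asymmetric) distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def path_relinking(start, guide, dist):
    """At each step, swap into position i the city the guiding solution
    has there; return the best tour found along the relinking path."""
    tour = list(start)
    best, best_cost = list(tour), tour_cost(tour, dist)
    for i in range(len(tour)):
        if tour[i] != guide[i]:
            j = tour.index(guide[i])
            tour[i], tour[j] = tour[j], tour[i]
            c = tour_cost(tour, dist)
            if c < best_cost:
                best, best_cost = list(tour), c
    return best, best_cost
```

Because every visited tour mixes attributes of both endpoints, the path often passes through solutions better than either, which is the rationale for combining it with an evolutionary population.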
Abstract:
In this work we present the principal fractals, their characteristics, properties, and classification, comparing them to the elements of Euclidean geometry. We show the importance of fractal geometry in the analysis of several elements of our society. We emphasize the importance of an appropriate definition of dimension for these objects, because the definition we presently know is not a satisfactory one. As instruments to obtain these dimensions, we present the box-counting method, the Hausdorff-Besicovitch dimension, and the scaling method. We also study the percolation process on the square lattice, comparing it to percolation on the multifractal object Qmf, where we observe some differences between these two processes. We analyze the histogram of percolating lattices versus the site occupation probability p, along with other numerical simulations. Finally, we show that we can estimate the fractal dimension of the percolation cluster and that percolation on a multifractal support is in the same universality class as standard percolation. We observe that the area of the blocks of Qmf is variable and that pc is a function of p, which is related to the anisotropy of Qmf.
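The box-counting method mentioned above has a compact numerical form (a minimal 2-D sketch; the sanity check on a filled line segment, where the dimension should come out near 1, is an invented example): count occupied boxes N(ε) at several scales and fit the slope of log N against log(1/ε).

```python
import math

def box_counting_dimension(points, epsilons):
    """Estimate the box-counting dimension of a 2-D point set from the
    least-squares slope of log N(eps) versus log(1/eps)."""
    logs = []
    for eps in epsilons:
        boxes = {(math.floor(x / eps), math.floor(y / eps)) for x, y in points}
        logs.append((math.log(1.0 / eps), math.log(len(boxes))))
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    return (sum((x - mx) * (y - my) for x, y in logs)
            / sum((x - mx) ** 2 for x, _ in logs))

# Sanity check: a densely sampled line segment should give dimension ~ 1.
line = [(i / 10000.0, 0.0) for i in range(10000)]
```

Applied to the sites of a percolation cluster at criticality, the same routine gives the cluster's fractal dimension.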
Abstract:
In this work we present two estimation methods for accelerated failure time models with random effects for grouped survival data. The first method, implemented in the SAS software through the NLMIXED procedure, uses an adaptive Gauss-Hermite quadrature to determine the marginalized likelihood. The second method, implemented in the free software R, is based on the penalized likelihood method to estimate the parameters of the model. For the first we describe the main theoretical aspects and, for the second, we briefly present the adopted approach, with a simulation study to investigate the performance of the method. We illustrate the models using real data on the operating time of oil wells from the Potiguar Basin (RN/CE).
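The building block of the quadrature used by NLMIXED is the fixed-node Gauss-Hermite rule (the adaptive variant recenters and rescales the nodes at the mode of the integrand; shown here is only the basic rule, with the two-point nodes ±1/√2 and weights √π/2):

```python
import math

def gauss_hermite_2pt(f):
    """Two-point Gauss-Hermite rule: integrates f(x) * exp(-x**2) over the
    real line, exactly for polynomials f up to degree 3."""
    x = 1.0 / math.sqrt(2.0)
    w = math.sqrt(math.pi) / 2.0
    return w * f(-x) + w * f(x)
```

In a random-effects likelihood, f is the conditional likelihood of one group evaluated at transformed node values, and the rule marginalizes the (Gaussian) random effect.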
Abstract:
Present-day weather forecast models usually cannot provide realistic descriptions of local and, in particular, extreme weather conditions. However, for lead times of a few days, they provide reliable forecasts of the atmospheric circulation that encompasses the subscale processes leading to extremes. Hence, forecasts of extreme events can only be achieved through a combination of dynamical and statistical analysis methods, where a stable and significant statistical model based on prior physical reasoning establishes a posterior statistical-dynamical relation between the local extremes and the large-scale circulation. Here we present the development and application of such a statistical model calibration, on the basis of extreme value theory, in order to derive probabilistic forecasts of extreme local temperatures. The downscaling is applied to the NCEP/NCAR reanalysis in order to derive estimates of daily temperature at weather stations in the Brazilian northeastern region.
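A standard extreme-value building block for such calibrations is the Gumbel distribution for block maxima (whether the study uses Gumbel, GEV, or a peaks-over-threshold model is not stated here; this is a generic sketch with invented parameter values). A method-of-moments fit and the corresponding return level are a few lines:

```python
import math

def gumbel_moments_fit(sample):
    """Method-of-moments Gumbel fit: beta = s * sqrt(6) / pi,
    mu = mean - gamma * beta (gamma = Euler-Mascheroni constant)."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772156649 * beta
    return mu, beta

def gumbel_return_level(mu, beta, T):
    """Level exceeded on average once every T blocks (e.g. years)."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))
```

Fitting such a model to station temperatures conditioned on large-scale circulation indices is the kind of statistical-dynamical downscaling the abstract describes.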
Abstract:
In this work we study the asymptotic unbiasedness and the strong and uniform strong consistency of a class of kernel estimators fn as estimators of the density function f taking values on a k-dimensional sphere.
Abstract:
In this work we have developed a spline-based method for the solution of initial value problems involving ordinary differential equations, with emphasis on linear equations. The method can be seen as an alternative to traditional solvers such as Runge-Kutta, and avoids root calculations in the linear time-invariant case. The method is then applied to a central problem of control theory, namely the step response problem for linear ODEs with possibly varying coefficients, where root calculations do not apply. We have implemented an efficient algorithm which uses exclusively matrix-vector operations. The working interval (up to the settling time) was determined through a calculation of the least stable mode using a modified power method. Several variants of the method have been compared by simulation. For general linear problems with a fine grid, the proposed method compares favorably with the Euler method. In the time-invariant case, where the alternative is root calculation, we have indications that the proposed method is competitive for equations of sufficiently high order.
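The power-method step can be sketched generically (a plain power iteration on a small symmetric example with invented entries, not the dissertation's modified variant): repeated matrix-vector products converge to the dominant eigenpair, and applied to a suitably shifted or inverted system matrix the same idea locates the least stable mode whose eigenvalue sets the settling time.

```python
def power_method(A, iters=200):
    """Estimate the dominant eigenvalue magnitude and eigenvector of A
    using only matrix-vector products (max-norm normalization)."""
    n = len(A)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)  # eigenvalue magnitude estimate
        v = [x / lam for x in w]
    return lam, v
```

Sticking to matrix-vector products keeps the cost per iteration at O(n²), in line with the abstract's "exclusively matrix-vector operations" design choice.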
Abstract:
In this work, the paper by Campos and Dorea [3] is presented in detail. In that article, a kernel estimator was applied to a sequence of independent and identically distributed random variables with a general state space. In Chapter 2, the estimator's properties, such as asymptotic unbiasedness, consistency in quadratic mean, strong consistency, and asymptotic normality, were verified. In Chapter 3, using the R software, numerical experiments were developed in order to give a visual idea of the estimation process.
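The basic kernel estimator behind these properties can be written directly from its definition (a minimal one-dimensional Gaussian-kernel sketch; the data points and bandwidth in the check are invented):

```python
import math

def kde(x, data, h):
    """Kernel density estimate with the Gaussian kernel:
    f_n(x) = (1 / (n h)) * sum_i K((x - X_i) / h)."""
    k = lambda u: math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)
    return sum(k((x - xi) / h) for xi in data) / (len(data) * h)
```

Asymptotic unbiasedness and consistency then hinge on letting h → 0 while n h → ∞, the standard bandwidth conditions verified in Chapter 2.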