887 results for Non-Gaussian statistical mechanics
Abstract:
Systems whose spectra are fractals or multifractals have received considerable attention in recent years. A complete understanding of many physical properties of these systems is still out of reach because of their complexity. New applications and new methods for studying their spectra have therefore been proposed, shedding light on their properties and enabling a better understanding of these systems. In this work we first present the basic theoretical framework needed to calculate the energy spectrum of elementary excitations in some systems, especially quasiperiodic ones. We then show, using the Schrödinger equation in the tight-binding approximation, results for the specific heat of electrons within Boltzmann-Gibbs statistical mechanics for one-dimensional quasiperiodic systems grown according to the Fibonacci and Double Period rules. Structures of this type have already been studied extensively; however, the non-extensive statistical mechanics proposed by Constantino Tsallis is well suited to systems with a fractal profile, so our main objective was to apply it to the calculation of thermodynamic quantities, extending the understanding of the properties of these systems a little further. Accordingly, we calculate, analytically and numerically, the generalized specific heat of electrons in one-dimensional quasiperiodic systems (quasicrystals) generated by the Fibonacci and Double Period sequences. The electronic spectra were obtained by solving the Schrödinger equation in the tight-binding approach. Numerical results are presented for the two types of systems for different values of the nonextensivity parameter q.
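As a rough illustration of the kind of calculation described in this abstract, the sketch below builds a nearest-neighbour tight-binding Hamiltonian for a Fibonacci chain, diagonalises it, and estimates a specific heat from Tsallis q-exponential occupation weights. The hopping values, chain size, temperature grid, and the particular weighting convention are illustrative assumptions, not the work's actual prescription.

```python
import numpy as np

def fibonacci_hoppings(generations, tA=1.0, tB=2.0):
    """Hopping sequence from the Fibonacci substitution A -> AB, B -> A (illustrative values)."""
    seq = "A"
    for _ in range(generations):
        seq = "".join("AB" if s == "A" else "A" for s in seq)
    return np.array([tA if s == "A" else tB for s in seq])

def tight_binding_spectrum(hoppings):
    """Eigenvalues of a 1-D tight-binding chain with the given nearest-neighbour hoppings."""
    n = len(hoppings) + 1
    H = np.zeros((n, n))
    for i, t in enumerate(hoppings):
        H[i, i + 1] = H[i + 1, i] = t
    return np.linalg.eigvalsh(H)

def q_exp(x, q):
    """Tsallis q-exponential, exp_q(x) = [1 + (1-q) x]_+^{1/(1-q)}; q -> 1 recovers exp(x)."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

def specific_heat(energies, temperatures, q=1.0):
    """C(T) = dU/dT from q-exponential occupation weights (finite differences, k_B = 1)."""
    E = energies - energies.min()   # measure energies from the band bottom so the q-exp argument stays safe
    def internal_energy(T):
        w = q_exp(-E / T, q)
        return np.sum(E * w) / np.sum(w)
    U = np.array([internal_energy(T) for T in temperatures])
    return np.gradient(U, temperatures)

spectrum = tight_binding_spectrum(fibonacci_hoppings(10))
T = np.linspace(0.05, 5.0, 200)
for q in (1.0, 1.1, 1.3):
    C = specific_heat(spectrum, T, q)
    print(f"q = {q}: C(T) peaks near T = {T[np.argmax(C)]:.2f}")
```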
Abstract:
In non-extensive statistical mechanics [14], it is meaningless to say that the entropy of a system is extensive (or not) without specifying a composition law for its elements. In this theory, quantum correlations may be probed through quantum information processes. This article, which extends recent work [4], is a comparative study of the von Neumann and Tsallis entropies, with some implementations of the effect of entropy on quantum entanglement, an important resource for the transmission of quantum information. We consider two factorized (Fock number) states that interact through a bilinear beam-splitter Hamiltonian with two input ports. The comparison shows that the Tsallis and von Neumann entropies behave differently depending on the reflectance of the beam splitter. © 2011 Academic Publications.
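A minimal sketch of the comparison described above, for the simplified case of a single photon entering one port of a lossless beam splitter: the reduced output state of one mode is then diagonal in the photon-number basis, and the von Neumann and Tsallis entropies can be evaluated directly. The input state and the value of q are assumptions for illustration only, not necessarily those used in the article.

```python
import numpy as np

def beamsplitter_reduced_state(reflectance):
    """Reduced state of one output mode when a single photon |1,0> meets a lossless
    beam splitter: rho = diag(1 - R, R) in the {|1>, |0>} photon-number basis."""
    return np.diag([1.0 - reflectance, reflectance])

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), evaluated from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-15]
    return -np.sum(evals * np.log(evals))

def tsallis_entropy(rho, q):
    """S_q = (1 - Tr rho^q) / (q - 1); reduces to the von Neumann entropy as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return von_neumann_entropy(rho)
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-15]
    return (1.0 - np.sum(evals ** q)) / (q - 1.0)

for R in (0.1, 0.5, 0.9):
    rho = beamsplitter_reduced_state(R)
    print(f"R = {R}: S_vN = {von_neumann_entropy(rho):.3f}, S_q(q=2) = {tsallis_entropy(rho, 2.0):.3f}")
```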
Abstract:
The paper provides a review of A.M. Mathai's applications of the theory of special functions, particularly generalized hypergeometric functions, to problems in stellar physics and the formation of structure in the Universe, and to questions related to reaction, diffusion, and reaction-diffusion models. The essay also highlights Mathai's recent work on entropic, distributional, and differential pathways to basic concepts in statistical mechanics, making use of his earlier research results in information and statistical distribution theory. The results presented in the essay cover the period of Mathai's research from 1982 to 2008 and are all related to the thematic area of the gravitationally stabilized solar fusion reactor and fractional reaction-diffusion, taking into account concepts of non-extensive statistical mechanics. The time period referred to above also coincides with Mathai's exceptional contributions to the establishment and operation of the Centre for Mathematical Sciences, India, as well as the holding, around the world, of the United Nations (UN)/European Space Agency (ESA)/National Aeronautics and Space Administration (NASA) of the United States/Japanese Aerospace Exploration Agency (JAXA) workshops on basic space science and the International Heliophysical Year 2007. Professor Mathai's contributions to the latter, since 1991, are a testimony to his social conscience applied to international scientific activity.
Abstract:
Solar activity indicators, such as sunspot numbers, sunspot area, and flares over the Sun's photosphere, are not symmetric between the northern and southern hemispheres of the Sun. This behavior is known as the North-South asymmetry of the different solar indices. Among the conclusions reached by several authors, we can point out that the N-S asymmetry is a real and systematic phenomenon and is not due to random variability. In the present work, the probability distributions from the Marshall Space Flight Center (MSFC) database are investigated using a statistical tool arising from the well-known non-extensive statistical mechanics proposed by C. Tsallis in 1988. We present our results and discuss their physical implications with the help of a theoretical model and observations. We find a strong dependence between the nonextensive entropic parameter q and the long-term solar variability present in the sunspot area data. Among the most important results, we highlight that the asymmetry index q reveals the dominance of the North over the South. This behavior has been discussed and confirmed by several authors, but it had never before been attributed to a property of a statistical model. We therefore conclude that this parameter can be considered an effective measure for diagnosing long-term variations of the solar dynamo. Finally, our dissertation opens a new approach for investigating astrophysical time series from the perspective of non-extensivity.
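The sketch below shows, under stated assumptions, how an entropic index q could be extracted by fitting a Tsallis q-Gaussian to the histogram of an asymmetry-like time series. The `daily_asymmetry` array is fabricated heavy-tailed noise standing in for the MSFC sunspot-area data, and the fitting bounds are arbitrary.

```python
import numpy as np
from scipy.optimize import curve_fit

def q_gaussian(x, A, beta, q):
    """Unnormalised Tsallis q-Gaussian: A * [1 - (1-q) * beta * x^2]_+^{1/(1-q)}."""
    base = np.maximum(1.0 - (1.0 - q) * beta * x ** 2, 0.0)
    return A * base ** (1.0 / (1.0 - q))

# `daily_asymmetry` is a placeholder for a normalised N-S sunspot-area asymmetry index;
# here we fabricate heavy-tailed noise purely to exercise the fit.
rng = np.random.default_rng(0)
daily_asymmetry = rng.standard_t(df=3, size=5000)

counts, edges = np.histogram(daily_asymmetry, bins=80, density=True)
centres = 0.5 * (edges[:-1] + edges[1:])

popt, _ = curve_fit(q_gaussian, centres, counts,
                    p0=(counts.max(), 1.0, 1.5),
                    bounds=([0.0, 1e-6, 1.0001], [np.inf, np.inf, 3.0]))
print(f"fitted entropic index q = {popt[2]:.2f}")
```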
Abstract:
The pioneering work of Skumanich (1972) showed that the projected mean rotational velocity <v sin i> for solar-type stars follows a rotation law that decays with time as t^(-1/2), where t is the stellar age. This relationship is consistent with theories of angular momentum loss through the ionized stellar wind, which in turn is coupled to the star through its magnetic field. Several authors (e.g., Silva et al. 2013 and de Freitas et al. 2014) have analyzed possible connections between the rotational decay and the profile of the velocity distribution. These authors arrived at a simple heuristic relationship, but did not build a direct link between the exponent of the rotational decay (j) and the exponent of the distribution of rotational velocities (q). The theoretical scenario was developed within a powerful and efficient framework known as non-extensive statistical mechanics. The present dissertation proposes to close this issue by elaborating a theoretical route that turns q-Maxwellian distributions into q-Maxwellians with physical content extracted from the theory of magnetic braking. To test our distributions we used the Geneva-Copenhagen Survey data, with approximately 6000 F and G field stars limited by age. As a result, we obtained that the exponents of the decay law and of the distribution follow a relationship similar to the one proposed by Silva et al. (2013).
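For orientation, here is a minimal sketch of one common form of q-Maxwellian speed distribution; the exact parametrisation used in the dissertation, and its link to the braking exponent j, may differ. The velocity grid and width are illustrative values only.

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential, exp_q(x) = [1 + (1-q) x]_+^{1/(1-q)}."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

def q_maxwellian(v, sigma, q):
    """One common q-generalised Maxwellian for rotational speeds,
    f(v) proportional to v^2 * exp_q(-v^2 / (2 sigma^2)); q -> 1 recovers the ordinary Maxwellian."""
    f = v ** 2 * q_exp(-v ** 2 / (2.0 * sigma ** 2), q)
    return f / (f.sum() * (v[1] - v[0]))   # numerical normalisation on the uniform grid

v = np.linspace(0.0, 60.0, 601)            # km/s grid, illustrative for F and G field stars
dv = v[1] - v[0]
for q in (1.0, 1.2, 1.4):
    f = q_maxwellian(v, sigma=8.0, q=q)
    tail = f[v > 30.0].sum() * dv
    print(f"q = {q}: fraction of stars with v sin i > 30 km/s = {tail:.3f}")
```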
Abstract:
A technique is developed to study random vibration of nonlinear systems. The method is based on the assumption that the joint probability density function of the response variables and input variables is Gaussian. It is shown that this method is more general than the statistical linearization technique in that it can handle non-Gaussian excitations and amplitude-limited responses. As an example a bilinear hysteretic system under white noise excitation is analyzed. The prediction of various response statistics by this technique is in good agreement with other available results.
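For comparison, the sketch below implements the classical statistical linearization baseline that the abstract says the new method generalizes, applied to a Duffing (rather than bilinear hysteretic) oscillator under white noise. The parameter values are illustrative, and the closure shown is not the paper's own technique.

```python
import numpy as np

# Classical statistical linearisation for a Duffing oscillator under white noise,
# x'' + 2*zeta*wn*x' + wn^2*(x + eps*x^3) = w(t).
zeta, wn, eps, S0 = 0.05, 1.0, 0.5, 0.01   # damping ratio, natural frequency, nonlinearity, noise PSD

w_eq_sq = wn ** 2                           # start from the underlying linear system
for _ in range(100):
    # stationary displacement variance of the linearised SDOF system
    sigma_x_sq = np.pi * S0 / (2.0 * zeta * wn * w_eq_sq)
    # equivalent stiffness from E[x*(x + eps*x^3)]/E[x^2] for zero-mean Gaussian x
    w_eq_sq_new = wn ** 2 * (1.0 + 3.0 * eps * sigma_x_sq)
    if abs(w_eq_sq_new - w_eq_sq) < 1e-12:
        break
    w_eq_sq = w_eq_sq_new

print(f"equivalent stiffness w_eq^2 = {w_eq_sq:.4f}, response std = {np.sqrt(sigma_x_sq):.4f}")
```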
Abstract:
A framework for adaptive and non-adaptive statistical compressive sensing is developed, where a statistical model replaces the standard sparsity model of classical compressive sensing. We propose within this framework optimal task-specific sensing protocols specifically and jointly designed for classification and reconstruction. A two-step adaptive sensing paradigm is developed, where online sensing is applied to detect the signal class in the first step, followed by a reconstruction step adapted to the detected class and the observed samples. The approach is based on information theory, here tailored to Gaussian mixture models (GMMs): an information-theoretic objective relating the sensed signals to a representation of the specific task of interest is maximized. Experimental results using synthetic signals, Landsat satellite attributes, and natural images of different sizes and with different noise levels show the improvements achieved using the proposed framework when compared to more standard sensing protocols. The underlying formulation can be applied beyond GMMs, at the price of higher mathematical and computational complexity. © 1991-2012 IEEE.
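A toy sketch of the two-step paradigm described above, under several simplifying assumptions: a hypothetical GMM prior, a generic random sensing matrix rather than the information-theoretically designed one, and Step 1 reduced to picking the maximum-posterior component before a class-adapted MMSE reconstruction in Step 2.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)
n, m, K, noise_std = 16, 6, 3, 0.05      # signal dim, measurements, classes (illustrative sizes)

# Hypothetical GMM prior: random means and low-rank-plus-identity covariances per class.
means = [rng.normal(size=n) for _ in range(K)]
covs = []
for _ in range(K):
    A = rng.normal(size=(n, 2))
    covs.append(A @ A.T + 0.01 * np.eye(n))
weights = np.full(K, 1.0 / K)

Phi = rng.normal(size=(m, n)) / np.sqrt(m)   # generic (non-optimised) sensing matrix

# Draw a test signal from one class and sense it with additive noise.
k_true = 2
x = rng.multivariate_normal(means[k_true], covs[k_true])
y = Phi @ x + noise_std * rng.normal(size=m)

# Step 1: classify the compressed measurement via the posterior over GMM components.
log_post = np.array([
    np.log(weights[k]) + multivariate_normal.logpdf(
        y, Phi @ means[k], Phi @ covs[k] @ Phi.T + noise_std ** 2 * np.eye(m))
    for k in range(K)
])
k_hat = int(np.argmax(log_post))

# Step 2: class-adapted MMSE reconstruction, x_hat = E[x | y, class k_hat].
S = Phi @ covs[k_hat] @ Phi.T + noise_std ** 2 * np.eye(m)
x_hat = means[k_hat] + covs[k_hat] @ Phi.T @ np.linalg.solve(S, y - Phi @ means[k_hat])

print("detected class:", k_hat, "| relative error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```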
Abstract:
This paper presents a robust stochastic framework for the incorporation of visual observations into conventional estimation, data fusion, navigation and control algorithms. The representation combines Isomap, a non-linear dimensionality reduction algorithm, with expectation maximization, a statistical learning scheme. The joint probability distribution of this representation is computed offline based on existing training data. The training phase of the algorithm results in a nonlinear and non-Gaussian likelihood model of natural features conditioned on the underlying visual states. This generative model can be used online to instantiate likelihoods corresponding to observed visual features in real-time. The instantiated likelihoods are expressed as a Gaussian mixture model and are conveniently integrated within existing non-linear filtering algorithms. Example applications based on real visual data from heterogeneous, unstructured environments demonstrate the versatility of the generative models.
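A schematic of the offline/online split described above, with fabricated one-dimensional "visual states" and synthetic features standing in for real image data: Isomap plus a Gaussian mixture fitted on the joint (state, embedding) space gives a crude likelihood surrogate over candidate states. The component counts and the data model are assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.manifold import Isomap
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)

# Hypothetical training data: a 1-D "visual state" s (e.g. a bearing) and a 20-D feature
# vector depending on it non-linearly plus noise, standing in for real image features.
s_train = rng.uniform(-1.0, 1.0, size=500)
features = np.stack([np.sin((k + 1) * s_train) for k in range(20)], axis=1)
features += 0.05 * rng.normal(size=features.shape)

# Offline phase: non-linear dimensionality reduction, then a mixture model over the
# joint (state, embedded feature) space.
embed = Isomap(n_components=2)
z_train = embed.fit_transform(features)
joint = np.column_stack([s_train, z_train])
gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0).fit(joint)

# Online phase: given a new feature vector, score candidate states by the joint
# log-density (a likelihood surrogate when the state prior is roughly uniform).
s_true = 0.3
f_obs = np.array([np.sin((k + 1) * s_true) for k in range(20)]) + 0.05 * rng.normal(size=20)
z_obs = embed.transform(f_obs[None, :])[0]

candidates = np.linspace(-1.0, 1.0, 201)
scores = gmm.score_samples(np.column_stack([candidates, np.tile(z_obs, (candidates.size, 1))]))
print("most likely state:", candidates[np.argmax(scores)], "(true:", s_true, ")")
```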
Abstract:
Purpose. To create a binocular statistical eye model based on previously measured ocular biometric data. Methods. Thirty-nine parameters were determined for a group of 127 healthy subjects (37 male, 90 female; 96.8% Caucasian) with an average age of 39.9 ± 12.2 years and spherical equivalent refraction of −0.98 ± 1.77 D. These parameters described the biometry of both eyes and the subjects' age. Missing parameters were complemented by data from a previously published study. After confirmation of the Gaussian shape of their distributions, these parameters were used to calculate their mean and covariance matrices, which in turn defined a multivariate Gaussian distribution. From this distribution, arbitrary amounts of random biometric data could be generated and randomly sampled to create a realistic population of random eyes. Results. All parameters had Gaussian distributions, with the exception of the parameters that describe total refraction (i.e., three parameters per eye). After these non-Gaussian parameters were omitted from the model, the generated data were found to be statistically indistinguishable from the original data for the remaining 33 parameters (TOST [two one-sided t tests]; P < 0.01). Parameters derived from the generated data were also not significantly different from those calculated with the original data (P > 0.05). The only exception was the lens refractive index, for which the generated data had a significantly larger SD. Conclusions. A statistical eye model can describe the biometric variations found in a population and is a useful addition to the classic eye models.
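A compact sketch of the modelling step described in the Methods: estimate the mean vector and covariance matrix of the (Gaussian-shaped) biometric parameters and draw synthetic eyes from the fitted multivariate Gaussian. The placeholder data below are random numbers, not the measured cohort, and the parameter count is only nominally matched to the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)

# Placeholder biometric table: rows = subjects, columns = parameters (e.g. corneal power,
# axial length, anterior chamber depth, ...); real data would come from the measured cohort.
n_subjects, n_params = 127, 33
true_mean = rng.normal(0.0, 1.0, size=n_params)
A = rng.normal(size=(n_params, n_params))
true_cov = A @ A.T / n_params + 0.1 * np.eye(n_params)
data = rng.multivariate_normal(true_mean, true_cov, size=n_subjects)

# Fit the statistical eye model: sample mean and covariance of the parameters.
mu = data.mean(axis=0)
cov = np.cov(data, rowvar=False)

# Generate a synthetic population of "random eyes" from the fitted multivariate Gaussian.
synthetic_eyes = rng.multivariate_normal(mu, cov, size=1000)

# Basic sanity check: generated means should track the originals.
print("max |mean difference|:", np.max(np.abs(synthetic_eyes.mean(axis=0) - mu)))
```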
Abstract:
Hydraulic conductivity (K) fields are used to parameterize groundwater flow and transport models. Numerical simulations require a detailed representation of the K field, synthesized to interpolate between available data. Several recent studies introduced high-resolution K data (HRK) at the Macro Dispersion Experiment (MADE) site, and used ground-penetrating radar (GPR) to delineate the main structural features of the aquifer. This paper describes a statistical analysis of these data, and the implications for K field modeling in alluvial aquifers. Two striking observations have emerged from this analysis. The first is that a simple fractional difference filter can have a profound effect on data histograms, organizing non-Gaussian ln K data into a coherent distribution. The second is that using GPR facies allows us to reproduce the significantly non-Gaussian shape seen in real HRK data profiles, using a simulated Gaussian ln K field in each facies. This illuminates a current controversy in the literature, between those who favor Gaussian ln K models, and those who observe non-Gaussian ln K fields. Both camps are correct, but at different scales.
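The fractional difference filter mentioned above can be written with the standard binomial-expansion weights; the sketch applies it to a fabricated long-range-dependent profile standing in for the high-resolution ln K data, with the truncation length and the values of d chosen arbitrarily.

```python
import numpy as np

def frac_diff(x, d, n_weights=100):
    """Apply the fractional difference filter (1 - B)^d using the binomial expansion
    w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k, truncated at n_weights lags."""
    w = np.empty(n_weights)
    w[0] = 1.0
    for k in range(1, n_weights):
        w[k] = w[k - 1] * (k - 1 - d) / k
    y = np.convolve(x, w, mode="full")[:len(x)]
    return y[n_weights - 1:]          # drop the warm-up where the window is not yet full

# Illustrative stand-in for a vertical ln K profile with long-range dependence
# (the real analysis would use the MADE high-resolution K data).
rng = np.random.default_rng(4)
ln_k = np.cumsum(rng.standard_normal(5000)) * 0.02

for d in (0.0, 0.4, 0.9):
    filtered = frac_diff(ln_k, d)
    kurt = ((filtered - filtered.mean()) ** 4).mean() / filtered.var() ** 2 - 3.0
    print(f"d = {d}: filtered std = {filtered.std():.3f}, excess kurtosis = {kurt:.2f}")
```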
Abstract:
We report a universal large deviation behavior of the spatially averaged global injected power just before the rejuvenation of the jammed state formed by an aging suspension of laponite clay under an applied stress. The probability distribution function (PDF) of these entropy-consuming, strongly non-Gaussian fluctuations follows a universal large deviation functional form described by the generalized Gumbel (GG) distribution, like many other equilibrium and nonequilibrium systems with a high degree of correlation, but does not obey the Gallavotti-Cohen steady-state fluctuation relation (SSFR). However, far from the unjamming transition (at smaller applied stresses) the SSFR is satisfied for both Gaussian and non-Gaussian PDFs. The observed slow variation of the mean shear rate with system size supports a recent theoretical prediction for observing the GG distribution.
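A small sketch of fitting the generalized Gumbel form to fluctuation data by maximum likelihood, using one common parametrisation of the GG density; the synthetic samples are drawn from a log-gamma construction simply so that the recovered shape parameter can be checked against its known value.

```python
import numpy as np
from scipy.special import gammaln
from scipy.optimize import minimize

def gg_logpdf(x, a, m, s):
    """Generalised Gumbel log-density (one common convention):
    f(z) = a^a / (Gamma(a) * s) * exp(-a * (z + exp(-z))),  z = (x - m) / s."""
    z = (x - m) / s
    return a * np.log(a) - gammaln(a) - np.log(s) - a * (z + np.exp(-z))

# Synthetic "injected power" fluctuations: if u ~ Gamma(a, scale=1/a), then x = -ln(u)
# follows the GG law above, so the fit should recover the shape parameter a.
rng = np.random.default_rng(5)
a_true = 2.5
samples = -np.log(rng.gamma(shape=a_true, scale=1.0 / a_true, size=20000))

neg_loglik = lambda p: -np.sum(gg_logpdf(samples, np.exp(p[0]), p[1], np.exp(p[2])))
res = minimize(neg_loglik, x0=np.zeros(3), method="Nelder-Mead",
               options={"xatol": 1e-6, "fatol": 1e-6, "maxiter": 5000})
print("fitted shape a =", round(float(np.exp(res.x[0])), 2), "(true:", a_true, ")")
```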
Abstract:
In this paper, motivated by observations of non-exponential decay times in the stochastic binding and release of ligand-receptor systems, exemplified by the work of Rogers et al on optically trapped DNA-coated colloids (Rogers et al 2013 Soft Matter 9 6412), we explore the general problem of polymer-mediated surface adhesion using a simplified model of the phenomenon in which a single polymer molecule, fixed at one end, binds through a ligand at its opposite end to a flat surface a fixed distance L away and uniformly covered with receptor sites. Working within the Wilemski-Fixman approximation to diffusion-controlled reactions, we show that for a flexible Gaussian chain the predicted distribution of times f(t) for which the ligand and receptor are bound is given, for times much shorter than the longest relaxation time of the polymer, by a power law of the form t^(-1/4). We also show that when the effects of chain stiffness are incorporated into this model (approximately), the structure of f(t) is altered to t^(-1/2). These results broadly mirror the experimental trends in the work cited above.
Abstract:
The epidemic of HIV/AIDS in the United States is constantly changing and evolving, having grown from patient zero to an estimated 650,000 to 900,000 Americans infected today. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment for HIV, to the present era of highly active antiretroviral therapy (HAART). Using statistical analysis of clinical data, this paper examines where we were, where we are, and where the treatment of HIV/AIDS is headed.
Chapter Two describes the datasets that were used for the analyses. The primary database was collected by the author from an outpatient HIV clinic and includes data from 1984 to the present. The second database is the Multicenter AIDS Cohort Study (MACS) public dataset, covering the period between 1984 and October 1992. Comparisons are made between both datasets.
Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian. Thus, distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection there are high levels of immunosuppression.
Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era characterized by a new class of drugs and new technology changed the way we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attack a different region of HIV than reverse transcriptase inhibitors, were found to dramatically and significantly reduce HIV RNA levels in the blood when used in combination with other antiretroviral agents. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover. If these viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.
In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was presence of an AIDS defining illness. A high level of clinical failure, or progression to an endpoint, was found.
Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, where the state of HIV is going. It first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in controlling viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in the morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.
The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs for HAART are estimated. It is estimated that the direct lifetime costs for treating each HIV infected patient with HAART is between $353,000 to $598,000 depending on how long HAART prolongs life. If one looks at the incremental cost per year of life saved it is only $101,000. This is comparable with the incremental costs per year of life saved from coronary artery bypass surgery.
Policy makers need to be aware that although HAART can delay disease progression, it is not a cure and HIV is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have been from the dramatic decreases in the incidence of AIDS defining opportunistic infections. As patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.
Abstract:
The objectives of this study were to determine the fracture toughness of adhesive interfaces between dentine and clinically relevant, thin layers of dental luting cements. Cements tested included a conventional glass-ionomer, F (Fuji I), a resin-modified glass-ionomer, FP (Fuji Plus), and a compomer cement, D (DyractCem). Ten miniature short-bar chevron notch specimens were manufactured for each cement, each comprising a 40 µm thick chevron of lute between two 1.5 mm thick blocks of bovine dentine, encased in resin composite. The interfacial K_IC results (MN/m^(3/2)) were, median (range): F, 0.152 (0.14-0.16); FP, 0.306 (0.27-0.37); D, 0.351 (0.31-0.37). Non-parametric statistical analysis showed that the fracture toughness of F was significantly lower (p
Abstract:
The climate belongs to the class of non-equilibrium, forced, and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space, averaged over the invariant measure of the unperturbed state. We choose as a test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value: it is a spatially extended one-dimensional model and features the basic ingredients of the actual atmosphere, such as dissipation, advection, and the presence of an external forcing. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions, as well as the integral constraints, can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow us to express these properties as explicit functions of the unperturbed forcing parameter alone, for a general class of chaotic Lorenz 96 models. We then verify the theoretical predictions against the outputs of the simulations to a high degree of precision. The theory is used to explain differences in the response of local and global observables, to define the intensive properties of the system, which do not depend on the spatial resolution of the Lorenz 96 model, and to generalize the concept of climate sensitivity to all time scales. We also show how to reconstruct the linear Green function, which maps perturbations with general time patterns into changes in the expectation value of the considered observable for finite as well as infinite time. Finally, we propose a simple yet general methodology for studying general climate change problems on virtually any time scale, by resorting only to well-selected simulations and by taking full advantage of ensemble methods. The specific case of the globally averaged surface temperature response to a general pattern of change of the CO2 concentration is discussed. We believe that the proposed approach may constitute a mathematically rigorous and practically very effective way to approach the problems of climate sensitivity, climate prediction, and climate change from a radically new perspective.
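As a back-of-the-envelope companion to the response problem discussed above, the sketch below integrates the Lorenz 96 model and estimates the static sensitivity of a global energy observable to the forcing by a brute-force finite difference between two long runs. This is not the Ruelle response formalism itself, and the model size, time step, and averaging lengths are arbitrary choices.

```python
import numpy as np

def lorenz96_step(x, F, dt):
    """One RK4 step of the Lorenz 96 model dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
    def rhs(x):
        return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F
    k1 = rhs(x)
    k2 = rhs(x + 0.5 * dt * k1)
    k3 = rhs(x + 0.5 * dt * k2)
    k4 = rhs(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def mean_energy(F, n=40, dt=0.01, spinup=2000, steps=20000, seed=0):
    """Long-time average of the 'energy' observable e = sum(x_i^2) / (2 n)."""
    rng = np.random.default_rng(seed)
    x = F + 0.01 * rng.standard_normal(n)
    for _ in range(spinup):
        x = lorenz96_step(x, F, dt)
    acc = 0.0
    for _ in range(steps):
        x = lorenz96_step(x, F, dt)
        acc += np.sum(x ** 2) / (2 * n)
    return acc / steps

# Finite-difference estimate of the static susceptibility d<e>/dF around F = 8.
F0, dF = 8.0, 0.25
chi = (mean_energy(F0 + dF) - mean_energy(F0 - dF)) / (2 * dF)
print("static response of the mean energy to the forcing, d<e>/dF =", round(chi, 2))
```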