980 results for Random matrix theory


Relevance: 30.00%

Abstract:

We investigate the influence of the driving mechanism on the hysteretic response of systems with athermal dynamics. In the framework of local mean-field theory at finite temperature (but neglecting thermally activated processes), we compare the rate-independent hysteresis loops obtained in the random field Ising model when controlling either the external magnetic field H or the extensive magnetization M. Two distinct behaviors are observed, depending on disorder strength. At large disorder, the H-driven and M-driven protocols yield identical hysteresis loops in the thermodynamic limit. At low disorder, when the H-driven magnetization curve is discontinuous (due to the presence of a macroscopic avalanche), the M-driven loop is reentrant while the induced field exhibits strong intermittent fluctuations and is only weakly self-averaging. The relevance of these results to the experimental observations in ferromagnetic materials, shape memory alloys, and other disordered systems is discussed.
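A minimal numerical sketch of the H-driven protocol described above, assuming a zero-temperature, fully connected (infinite-range) mean-field RFIM rather than the local mean-field theory of the abstract; the spin count N, coupling J and disorder width sigma are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000       # number of spins (illustrative)
J = 1.0          # ferromagnetic coupling (illustrative)
sigma = 2.0      # disorder strength: std of the quenched random fields (illustrative)
h = rng.normal(0.0, sigma, N)

def relax(H, s):
    """Flip spins synchronously until metastable: s_i = sign(H + J*m + h_i)."""
    while True:
        m = s.mean()
        s_new = np.sign(H + J * m + h)
        s_new[s_new == 0] = 1.0
        if np.array_equal(s_new, s):
            return s
        s = s_new

# ascending branch of the H-driven hysteresis loop
s = -np.ones(N)
loop = []
for H in np.linspace(-4, 4, 161):
    s = relax(H, s)
    loop.append((H, s.mean()))
```

At this disorder (above the mean-field critical value) the ascending branch is smooth; lowering sigma below the critical disorder would produce the macroscopic avalanche, and hence the discontinuous magnetization curve, discussed in the abstract.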

Relevance: 30.00%

Abstract:

The mean-field theory of a spin glass with a specific form of nearest- and next-nearest-neighbor interactions is investigated. Depending on the sign chosen for the interaction matrix, we find either the continuous replica symmetry breaking seen in the Sherrington-Kirkpatrick model or a one-step solution similar to that found in structural glasses. Our results are confirmed by numerical simulations, and the link between the type of spin-glass behavior and the density of eigenvalues of the interaction matrix is discussed.

Relevance: 30.00%

Abstract:

In this thesis we study the effect of rest periods in queueing systems without exhaustive service, and in inventory systems in which the server takes rest periods. Most work on vacation models deals with exhaustive service; only recently have results appeared for systems without exhaustive service.

Relevance: 30.00%

Abstract:

In this thesis we attempt a probabilistic analysis of some physically realizable, though complex, storage and queueing models. It is essentially a mathematical study of the stochastic processes underlying these models. Our aim is an improved understanding of the behaviour of such models, which may widen their applicability. Different inventory systems with random lead times, server vacations, bulk demands, varying ordering levels, etc. are considered. We also study some finite- and infinite-capacity queueing systems with bulk service and server vacations, and obtain the transient solution in certain cases. Each chapter of the thesis includes its own introduction and some important references.
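As an illustration of the server-vacation idea (not of the thesis's specific models, which also involve inventories, bulk demands and transient analysis), here is a hedged discrete-event sketch of an M/M/1 queue with multiple exponential vacations; the function name and all rates are assumptions:

```python
import random

def mm1_vacation_wait(lam, mu, vac_rate, n_customers=50_000, seed=1):
    """Mean queueing delay in an M/M/1 queue with multiple vacations:
    whenever the system empties, the server takes Exp(vac_rate) vacations
    back to back until it returns to find customers waiting (sketch)."""
    rng = random.Random(seed)
    t_arrive = 0.0
    server_free = 0.0   # time at which the server next becomes available
    total_wait = 0.0
    for _ in range(n_customers):
        t_arrive += rng.expovariate(lam)
        if server_free < t_arrive:
            # system emptied before this arrival: chain vacations until
            # one ends after the arrival instant
            while server_free < t_arrive:
                server_free += rng.expovariate(vac_rate)
        start = max(server_free, t_arrive)
        total_wait += start - t_arrive
        server_free = start + rng.expovariate(mu)
    return total_wait / n_customers
```

For this model the classical decomposition result gives E[Wq] = λ/(μ(μ−λ)) + E[V²]/(2E[V]); with λ = 0.5, μ = 1 and Exp(1) vacations this is 2.0, which the simulation approximates.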

Relevance: 30.00%

Abstract:

Thomas-Fermi theory is developed to evaluate nuclear matrix elements averaged on the energy shell, on the basis of independent-particle Hamiltonians. One- and two-body matrix elements are compared with the quantal results, and it is demonstrated that the semiclassical matrix elements, as functions of energy, pass well through the average of the scattered quantum values. For the one-body matrix elements it is shown how the Thomas-Fermi approach can be projected onto good parity and also onto good angular momentum. For the two-body case, the pairing matrix elements are considered explicitly.

Relevance: 30.00%

Abstract:

In many situations probability models are more realistic than deterministic models, and several phenomena occurring in physics are studied as random phenomena changing with time and space. Stochastic processes originated from the needs of physicists. Let X(t) be a random variable, where t is a parameter taking values in a set T. The collection of random variables {X(t), t ∈ T} is then called a stochastic process. We denote the state of the process at time t by X(t); the collection of all possible values X(t) can assume is called the state space.
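The definition above can be made concrete with the simplest example, a symmetric random walk, where the index set is T = {0, 1, ..., n} and the state space is the set of integers (the function below is purely illustrative):

```python
import random

def random_walk(n_steps, seed=42):
    """A simple stochastic process {X(t), t in T}: the symmetric random
    walk. T = {0, ..., n_steps}; the state space is the integers."""
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(n_steps):
        x += rng.choice((-1, 1))   # step up or down with equal probability
        path.append(x)
    return path

path = random_walk(10)   # one realization: X(0), X(1), ..., X(10)
```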

Relevance: 30.00%

Abstract:

The first discussion of compositional data analysis is attributable to Karl Pearson, in 1897. However, notwithstanding recent developments on the algebraic structure of the simplex, more than twenty years after Aitchison's idea of log-transformations of closed data, the scientific literature is again full of statistical treatments of this type of data by traditional methodologies. This is particularly true in environmental geochemistry where, besides the problem of closure, the spatial structure (dependence) of the data has to be considered. In this work we propose the use of log-contrast values, obtained by a simplicial principal component analysis, as indicators of given environmental conditions. The investigation of the log-contrast frequency distributions allows one to identify the statistical laws able to generate the values and to govern their variability. The changes, compared, for example, with the mean values of the random variables assumed as models, or with other reference parameters, allow the definition of monitors to assess the extent of possible environmental contamination. A case study on running and ground waters from Chiavenna Valley (Northern Italy), using Na+, K+, Ca2+, Mg2+, HCO3-, SO42- and Cl- concentrations, is illustrated.
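A hedged sketch of the proposed construction, log-contrast scores via a centred log-ratio (clr) transform followed by principal component analysis, on synthetic four-part compositions; the actual study uses the listed ion concentrations, and everything below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic 4-part compositions (rows sum to 1), standing in for ion proportions
raw = rng.lognormal(size=(200, 4))
comp = raw / raw.sum(axis=1, keepdims=True)

# centred log-ratio (clr) transform: log(x_i / geometric mean of the row)
logx = np.log(comp)
clr = logx - logx.mean(axis=1, keepdims=True)

# PCA on the clr data: each principal direction is a log-contrast
# (its coefficients sum to zero)
clr_c = clr - clr.mean(axis=0)
U, S, Vt = np.linalg.svd(clr_c, full_matrices=False)
scores = clr_c @ Vt.T    # log-contrast scores, one row per sample
loadings = Vt            # rows with nonzero singular value sum to ~0
```

The frequency distributions of the columns of `scores` are the objects whose statistical laws the abstract proposes to investigate.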

Relevance: 30.00%

Abstract:

In this class, we will discuss network theory fundamentals, including concepts such as diameter, distance, clustering coefficient and others. We will also discuss different types of networks, such as scale-free networks, random networks, etc. Readings: Graph structure in the Web, A. Broder, R. Kumar, F. Maghoul, P. Raghavan, S. Rajagopalan, R. Stata, A. Tomkins and J. Wiener, Computer Networks 33, 309-320 (2000) [Web link, Alternative Link] Optional: The Structure and Function of Complex Networks, M. E. J. Newman, SIAM Review 45, 167-256 (2003) [Web link] Original course at: http://kmi.tugraz.at/staff/markus/courses/SS2008/707.000_web-science/
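As a concrete illustration of one of the concepts listed, the local clustering coefficient of a vertex can be computed directly from an adjacency structure; the toy graph below is an assumption for demonstration only:

```python
from itertools import combinations

def clustering_coefficient(adj, v):
    """Local clustering coefficient of vertex v: the fraction of pairs of
    neighbours of v that are themselves connected by an edge."""
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return 2.0 * links / (k * (k - 1))

# a triangle (0,1,2) plus a pendant vertex 3 attached to 0
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}
c0 = clustering_coefficient(adj, 0)   # only (1,2) of the three pairs is linked
```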

Relevance: 30.00%

Abstract:

In population sampling it is vitally important to clarify and distinguish: first, the design or sampling method used to address the research problem; second, the sample size, taking into account its different components (precision, reliability, variance); third, the random selection procedure; and fourth, the precision estimate (sampling errors), so as to determine whether the obtained estimates can be inferred to the target population. The difficulty in using concepts from sampling theory lies in understanding them with absolute clarity, and to achieve this, didactic-pedagogical strategies arranged as conceptual "mentefactos" (simple hierarchical diagrams organized from propositions) may prove useful. This paper presents the conceptual definition, through conceptual "mentefactos", of the most important probabilistic population sampling concepts, in order to obtain representative samples from populations in health research.
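One of the sample-size components mentioned (precision, reliability, variance) can be illustrated with the standard formula for estimating a proportion, n = z²p(1−p)/e²; this is a generic sketch, not a formula taken from the paper itself, which concerns the conceptual diagrams:

```python
import math

def sample_size_proportion(z, p, e):
    """Sample size for estimating a proportion p with margin of error e
    at the confidence level given by z (e.g. z = 1.96 for 95%):
    n = z^2 * p * (1 - p) / e^2, rounded up."""
    return math.ceil(z**2 * p * (1 - p) / e**2)

# worst case p = 0.5, 95% confidence, 5-point margin of error
n = sample_size_proportion(1.96, 0.5, 0.05)
```

For small target populations a finite-population correction would normally be applied on top of this.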

Relevance: 30.00%

Abstract:

We study the role of natural resource windfalls in explaining the efficiency of public expenditures. Using a rich dataset of expenditures and public good provision for 1,836 municipalities in Peru for the period 2001-2010, we estimate a non-monotonic relationship between the efficiency of public good provision and the level of natural resource transfers. Local governments that were extremely favored by the boom in mineral prices were more efficient in using fiscal windfalls, whereas those that benefited from modest transfers were less efficient. These results can be explained by the increase in political competition associated with the boom. However, the fact that increases in efficiency were related to reductions in public good provision casts doubt on the beneficial effects of political competition in promoting efficiency.

Relevance: 30.00%

Abstract:

Bimodal dispersal probability distributions with characteristic distances differing by several orders of magnitude have been derived and compared favorably to observations by Nathan [Nature (London) 418, 409 (2002)]. For such bimodal kernels, we show that two-dimensional molecular dynamics computer simulations are unable to yield accurate front speeds. Analytically, the usual continuous-space random walks (CSRWs) are applied to two dimensions. We also introduce discrete-space random walks and use them to check the CSRW results (because of the inefficiency of the numerical simulations). The physical results reported are shown to predict front speeds high enough to possibly explain Reid's paradox of rapid tree migration. We also show that, for a time-ordered evolution equation, fronts are always slower in two dimensions than in one dimension, and that this difference is important both for unimodal and for bimodal kernels.
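As a hedged illustration of the kind of kernel involved, the sketch below runs a two-dimensional random walk whose jump lengths come from a bimodal mixture with characteristic distances differing by two orders of magnitude; the exponential mixture and all parameters are assumptions for demonstration, not Nathan's fitted kernel or the authors' front-speed calculation:

```python
import math
import random

def bimodal_walk(n_steps, p_long=0.01, d_short=1.0, d_long=100.0, seed=0):
    """Net displacement of a 2-D random walk with a bimodal jump kernel:
    most jumps have a short characteristic distance d_short, a small
    fraction p_long a much longer one d_long; directions are uniform."""
    rng = random.Random(seed)
    x = y = 0.0
    for _ in range(n_steps):
        scale = d_long if rng.random() < p_long else d_short
        d = rng.expovariate(1.0 / scale)          # jump length
        theta = rng.uniform(0.0, 2.0 * math.pi)   # jump direction
        x += d * math.cos(theta)
        y += d * math.sin(theta)
    return math.hypot(x, y)
```

Even a 1% admixture of long jumps dominates the spread, which is the qualitative mechanism behind the high front speeds invoked for Reid's paradox.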

Relevance: 30.00%

Abstract:

The eclectic paradigm of Dunning (1980) (with its OLI framework and four motives for FDI) can be reconciled with the firm and country matrix of Rugman (1981). However, the fit is not perfect. The main reason for the misalignment is that Dunning focused on outward FDI into host economies, whereas Rugman's matrix covers firm-level strategy for MNE activity in both home and host countries.

Relevance: 30.00%

Abstract:

The calculation of accurate and reliable vibrational potential functions and normal co-ordinates is discussed for such simple polyatomic molecules as may be possible. Such calculations should be corrected for the effects of anharmonicity and of resonance interactions between the vibrational states, and should be fitted to all the available information on all isotopic species: particularly the vibrational frequencies, Coriolis zeta constants and centrifugal distortion constants. The difficulties of making these corrections, and of making use of the observed data, are reviewed. A programme for the Ferranti Mercury Computer is described by means of which harmonic vibration frequencies and normal co-ordinate vectors, zeta factors and centrifugal distortion constants can be calculated from a given force field and from given G-matrix elements, etc. The programme has been used on secular equations of order up to 5 × 5, for which a single calculation and output of results takes approximately 1 min; it can readily be extended to larger determinants. The best methods of using such a programme, and the possibility of reversing the direction of calculation, are discussed. The methods are applied to calculating the best possible vibrational potential function for the methane molecule, making use of all the observed data.
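The secular-equation step described (harmonic frequencies from a force field and G-matrix elements) is Wilson's GF method, in which the vibrational frequencies follow from the eigenvalues λ of the product GF. A small illustrative calculation with assumed 2 × 2 matrices (not those of methane):

```python
import numpy as np

# Illustrative Wilson GF calculation with hypothetical matrices.
G = np.array([[1.05, -0.10],
              [-0.10, 0.30]])   # inverse-kinetic-energy (G) matrix, assumed values
F = np.array([[5.0, 0.2],
              [0.2, 0.8]])      # force-constant (F) matrix, assumed values

# GF is not symmetric in general, but its eigenvalues are real and
# positive when G and F are positive definite.
lam = np.sort(np.linalg.eigvals(G @ F).real)

# In consistent units lambda = (2*pi*c*nu)^2, so the harmonic
# frequencies are proportional to sqrt(lambda).
nu = np.sqrt(lam) / (2.0 * np.pi)
```

Reversing the direction of calculation, as the abstract discusses, means fitting F so that the eigenvalues of GF reproduce the observed frequencies.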

Relevance: 30.00%

Abstract:

The molecular structures of NbOBr3, NbSCl3, and NbSBr3 have been determined by gas-phase electron diffraction (GED) at nozzle-tip temperatures of 250 °C, taking into account the possible presence of NbOCl3 as a contaminant in the NbSCl3 sample and of NbOBr3 in the NbSBr3 sample. The experimental data are consistent with trigonal-pyramidal molecules having C3v symmetry. Infrared spectra of molecules trapped in argon or nitrogen matrices were recorded and exhibit the characteristic fundamental stretching modes for C3v species. Well-resolved isotopic fine structure (35Cl and 37Cl) was observed for NbSCl3, and for the NbOCl3 which occurred as an impurity in the NbSCl3 spectra. Quantum mechanical calculations of the structures and vibrational frequencies of the four YNbX3 molecules (Y = O, S; X = Cl, Br) were carried out at several levels of theory, most importantly B3LYP DFT with either the Stuttgart RSC ECP or Hay-Wadt (n + 1) ECP VDZ basis set for Nb and the 6-311G* basis set for the nonmetal atoms. Theoretical values for the bond lengths are 0.01-0.04 Å longer than the experimental ones of type r_a, in accord with general experience, but the bond angles, with theoretical minus experimental differences of only 1.0-1.5°, are notably accurate. Symmetrized force fields were also calculated. The experimental bond lengths (r_g/Å) and angles (∠_α/deg) with estimated 2σ uncertainties from GED are as follows. NbOBr3: r(Nb=O) = 1.694(7), r(Nb-Br) = 2.429(2), ∠(O=Nb-Br) = 107.3(5), ∠(Br-Nb-Br) = 111.5(5). NbSBr3: r(Nb=S) = 2.134(10), r(Nb-Br) = 2.408(4), ∠(S=Nb-Br) = 106.6(7), ∠(Br-Nb-Br) = 112.2(6). NbSCl3: r(Nb=S) = 2.120(10), r(Nb-Cl) = 2.271(6), ∠(S=Nb-Cl) = 107.8(12), ∠(Cl-Nb-Cl) = 111.1(11).

Relevance: 30.00%

Abstract:

In this work, IR thermography is used as a non-destructive tool for impact damage characterisation on thermoplastic E-glass/polypropylene composites for automotive applications. The aim of this experimentation was to compare impact resistance and to characterise damage patterns of different laminates, in order to provide indications for their use in components. Two E-glass/polypropylene composites were characterised: commingled Twintex® (with three different weave structures: directional, balanced and 3-D) and randomly reinforced GMT. Directional and balanced Twintex were also coupled in a number of hybrid configurations with GMT to evaluate the possible use of GMT/Twintex hybrids in high-energy-absorption components. The laminates were impacted using a falling-weight tower, with impact energies ranging from 15 J to penetration. Using IR thermography during cooling down following a long pulse (3 s), impact-damaged areas were characterised and the influence of weave structure on damage patterns was studied. IR thermography offered good accuracy for laminates with thickness not exceeding 3.5 mm: this appears to be a limit for the direct use of this method on components, where more refined signal treatment would probably be needed for impact damage characterisation.