119 results for Partition Theorems
Abstract:
We consider an exclusion process on a ring in which a particle hops to an empty neighboring site with a rate that depends on the number of vacancies n in front of it. In the steady state, using the well-known mapping of this model to the zero-range process, we write down an exact formula for the partition function and the particle-particle correlation function in the canonical ensemble. In the thermodynamic limit, we find a simple analytical expression for the generating function of the correlation function. This result is applied to the hop rate u(n) = 1 + (b/n), for which a phase transition between a high-density laminar phase and a low-density jammed phase occurs for b > 2. For these rates, we find that at the critical density the correlation function decays algebraically with a continuously varying exponent b - 2. We also calculate the two-point correlation function above the critical density and find that the correlation length diverges with a critical exponent nu = 1/(b - 2) for b < 3 and nu = 1 for b > 3. These results are compared with those obtained using an exact series expansion for finite systems.
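A minimal sketch (my own illustration, not code from the paper) of how the canonical zero-range-process partition function behind this mapping can be evaluated numerically: with hop rates u(n), the single-site weights are f(n) = prod_{k=1..n} 1/u(k), and the canonical partition function is their convolution over sites, computable by dynamic programming.

```python
import numpy as np

def zrp_partition_function(n_sites, n_particles, b=2.5):
    """Canonical ZRP partition function Z = sum over {n_i >= 0, sum n_i = n_particles}
    of prod_i f(n_i), with f(n) = prod_{k=1}^{n} 1/u(k) and u(k) = 1 + b/k."""
    f = np.ones(n_particles + 1)
    for n in range(1, n_particles + 1):
        f[n] = f[n - 1] / (1.0 + b / n)

    # Dynamic programming over sites: Z_s(m) = sum_{n=0}^{m} f(n) * Z_{s-1}(m - n).
    Z = np.zeros(n_particles + 1)
    Z[0] = 1.0
    for _ in range(n_sites):
        Z = np.array([np.dot(f[:m + 1], Z[m::-1]) for m in range(n_particles + 1)])
    return Z[n_particles]

# Example: 10 ZRP sites sharing 20 particles (vacancies of the exclusion process), b = 2.5.
print(zrp_partition_function(10, 20))
```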
Abstract:
We prove two density theorems for quadrature domains in C^n, n >= 2. It is shown that quadrature domains are dense in the class of all product domains of the form D x Omega, where D is a smoothly bounded domain satisfying Bell's Condition R and Omega is a smoothly bounded domain, and also in the class of all smoothly bounded complete Hartogs domains in C^2.
Abstract:
This article considers a semi-infinite mathematical programming problem with equilibrium constraints (SIMPEC), defined as a semi-infinite mathematical programming problem with complementarity constraints. We establish necessary and sufficient optimality conditions for SIMPEC. We also formulate Wolfe- and Mond-Weir-type dual models for SIMPEC and establish weak, strong and strict converse duality theorems for SIMPEC and the corresponding dual problems under invexity assumptions.
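For orientation, the classical Wolfe dual of a smooth inequality-constrained problem min f(x) s.t. g_i(x) <= 0 is sketched below; this is a generic template, not the SIMPEC duals of the article, which additionally carry infinitely many constraints and the complementarity structure.

```latex
% Generic Wolfe-type dual (illustrative template only):
\begin{aligned}
\max_{u,\ \lambda}\quad & f(u) + \sum_{i \in I} \lambda_i\, g_i(u) \\
\text{subject to}\quad  & \nabla f(u) + \sum_{i \in I} \lambda_i\, \nabla g_i(u) = 0, \\
                        & \lambda_i \ge 0 \quad \text{for all } i \in I.
\end{aligned}
```

Under invexity of f and the g_i, weak duality holds between this pair, which is the pattern the SIMPEC results generalize.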
Abstract:
We formulate a natural model of loops and isolated vertices for arbitrary planar graphs, which we call the monopole-dimer model. We show that the partition function of this model can be expressed as a determinant. We then extend the method of Kasteleyn and Temperley-Fisher to calculate the partition function exactly in the case of rectangular grids. This partition function turns out to be a square of a polynomial with positive integer coefficients when the grid lengths are even. Finally, we analyse this formula in the infinite-volume limit and show that the local monopole density, free energy and entropy can be expressed in terms of well-known elliptic functions. At the heart of our technique is a novel determinantal formula for the partition function of a model of isolated vertices and loops on arbitrary graphs.
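As background, a sketch of the classical Kasteleyn/Temperley-Fisher determinant for pure dimer coverings of an m x n grid (not the monopole-dimer determinant introduced in the paper): weight horizontal edges by 1 and vertical edges by the imaginary unit; the number of perfect matchings is then the square root of the absolute value of the determinant of this weighted adjacency matrix.

```python
import numpy as np
from itertools import product

def kasteleyn_matchings(m, n):
    """Number of dimer coverings of an m x n grid via a Kasteleyn-style
    determinant: horizontal edges weighted 1, vertical edges weighted i."""
    idx = {(r, c): r * n + c for r, c in product(range(m), range(n))}
    K = np.zeros((m * n, m * n), dtype=complex)
    for r, c in idx:
        if c + 1 < n:   # horizontal edge, weight 1
            K[idx[(r, c)], idx[(r, c + 1)]] = K[idx[(r, c + 1)], idx[(r, c)]] = 1.0
        if r + 1 < m:   # vertical edge, weight i
            K[idx[(r, c)], idx[(r + 1, c)]] = K[idx[(r + 1, c)], idx[(r, c)]] = 1j
    return round(abs(np.linalg.det(K)) ** 0.5)

print(kasteleyn_matchings(2, 2))  # 2 coverings of the 2x2 grid
print(kasteleyn_matchings(4, 4))  # 36 coverings of the 4x4 grid
```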
Abstract:
We study the free fermion theory in 1+1 dimensions deformed by chemical potentials for holomorphic, conserved currents at finite temperature and on a spatial circle. For a spin-three chemical potential mu, the deformation is related at high temperatures to a higher spin black hole in hs[lambda] theory on AdS_3 spacetime. We calculate the order mu^2 corrections to the single-interval Renyi and entanglement entropies on the torus using the bosonized formulation. A consistent result, satisfying all checks, emerges upon carefully accounting for both perturbative and winding mode contributions in the bosonized language. The order mu^2 corrections involve integrals that are finite but potentially sensitive to contact term singularities. We propose and apply a prescription for defining such integrals which matches the Hamiltonian picture and passes several non-trivial checks for both thermal corrections and the Renyi entropies at this order. The thermal corrections are given by a weight six quasi-modular form, whilst the Renyi entropies are controlled by quasi-elliptic functions of the interval length with modular weight six. We also point out the well-known connection between the perturbative expansion of the partition function in powers of the spin-three chemical potential and the Gross-Taylor genus expansion of large-N Yang-Mills theory on the torus. We note the absence of winding mode contributions in this connection, which suggests qualitatively different entanglement entropies for the two systems.
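For reference, the standard definitions behind the quantities computed here (generic definitions, not results of the paper): the Renyi entropies of the reduced density matrix rho_A of an interval A, and the entanglement entropy as their n -> 1 limit.

```latex
% Standard Renyi and entanglement entropy definitions:
S_n = \frac{1}{1-n}\,\log \operatorname{Tr} \rho_A^{\,n},
\qquad
S_{\mathrm{EE}} = \lim_{n \to 1} S_n = -\operatorname{Tr}\,\rho_A \log \rho_A .
```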
Abstract:
The present paper reports a new class of Co-based superalloys that have a gamma-gamma' microstructure and exhibit much lower density compared to other commercially available Co superalloys, including Co-Al-W based alloys. The basic composition is Co-10Al-5Mo (at%) with an addition of 2 at% Ta for stabilization of the gamma' phase. The gamma-gamma' microstructure evolves through solutionising and aging treatment. Using first principles calculations, we observe that Ta plays a crucial role in stabilizing the gamma' phase. By addition of Ta to the basic stoichiometric composition Co_3(Al, Mo), the enthalpy of formation (Delta H_f) of the L1_2 structure (gamma' phase) becomes more negative in comparison to the D0_19 structure. The Delta H_f of the L1_2 structure becomes even more negative with the occupancy of Ni and Ti atoms in the lattice, suggesting an increase in the stability of the gamma' precipitates. Among the large number of alloys studied experimentally, the paper presents results of detailed investigations on Co-10Al-5Mo-2Ta, Co-30Ni-10Al-5Mo-2Ta and Co-30Ni-10Al-5Mo-2Ta-2Ti. To evaluate the role of alloying elements, atom probe tomography investigations were carried out to obtain partition coefficients for the constituent elements. The results show strong partitioning of Ni, Al, Ta and Ti into the ordered gamma' precipitates. (C) 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
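As a side note on how partition coefficients from such atom probe measurements are typically computed (the compositions below are made-up placeholders, not values from the paper): the partition coefficient of element i is its concentration in the gamma' precipitate divided by its concentration in the gamma matrix, with k > 1 indicating partitioning to gamma'.

```python
# Hypothetical illustration: partition coefficient k_i = C_i(gamma') / C_i(gamma).
# Compositions in at% are placeholders, not measured values from the paper.
precipitate = {"Co": 70.0, "Ni": 12.0, "Al": 10.0, "Mo": 3.0, "Ta": 3.0, "Ti": 2.0}
matrix      = {"Co": 78.0, "Ni": 8.0,  "Al": 7.0,  "Mo": 5.0, "Ta": 1.0, "Ti": 1.0}

for element in precipitate:
    k = precipitate[element] / matrix[element]
    side = "gamma'" if k > 1 else "gamma"
    print(f"{element}: k = {k:.2f} (partitions to {side})")
```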
Abstract:
In big data image/video analytics, we encounter the problem of learning an over-complete dictionary for sparse representation from a large training dataset, which cannot be processed at once because of storage and computational constraints. To tackle the problem of dictionary learning in such scenarios, we propose an algorithm that exploits the inherent clustered structure of the training data and makes use of a divide-and-conquer approach. The fundamental idea behind the algorithm is to partition the training dataset into smaller clusters and learn local dictionaries for each cluster. Subsequently, the local dictionaries are merged to form a global dictionary. Merging is done by solving another dictionary learning problem on the atoms of the locally trained dictionaries. This algorithm is referred to as the split-and-merge algorithm. We show that the proposed algorithm is efficient in its usage of memory and computational complexity, and performs on par with the standard learning strategy, which operates on the entire dataset at once. As an application, we consider the problem of image denoising. We present a comparative analysis of our algorithm with the standard learning techniques that use the entire database at once, in terms of training and denoising performance. We observe that the split-and-merge algorithm results in a remarkable reduction in training time without significantly affecting the denoising performance.
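A minimal sketch of the split-and-merge idea (my own illustrative version using scikit-learn, not the authors' implementation): cluster the training samples, learn a small local dictionary per cluster, then learn a global dictionary on the pooled local atoms.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import MiniBatchDictionaryLearning

def split_and_merge_dictionary(X, n_clusters=4, local_atoms=32, global_atoms=64, seed=0):
    """Illustrative split-and-merge dictionary learning.
    X: (n_samples, n_features) training data, e.g. vectorized image patches."""
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(X)

    # Split: learn a local dictionary on each cluster independently.
    local_dicts = []
    for c in range(n_clusters):
        Xc = X[labels == c]
        dl = MiniBatchDictionaryLearning(n_components=local_atoms, random_state=seed).fit(Xc)
        local_dicts.append(dl.components_)

    # Merge: treat the pooled local atoms as data and learn a global dictionary on them.
    atoms = np.vstack(local_dicts)
    merger = MiniBatchDictionaryLearning(n_components=global_atoms, random_state=seed).fit(atoms)
    return merger.components_  # shape (global_atoms, n_features)

# Usage with synthetic data:
X = np.random.randn(2000, 64)
D = split_and_merge_dictionary(X)
print(D.shape)  # (64, 64)
```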
Abstract:
Bearing capacity factors due to the components of cohesion, surcharge, and unit weight, respectively, have been computed for smooth and rough ring footings for different combinations of r_i/r_o and the soil friction angle by using the lower and upper bound theorems of limit analysis in conjunction with finite elements and linear optimization, where r_i and r_o refer to the inner and outer radii of the ring, respectively. It is observed that for a smooth footing with a given value of r_o, the magnitude of the collapse load decreases continuously with an increase in r_i. Conversely, for a rough base with a given value of r_o, hardly any reduction occurs in the magnitude of the collapse load up to r_i/r_o approximately 0.2, whereas for r_i/r_o > 0.2, the magnitude of the collapse load, similar to that of a smooth footing, decreases continuously with an increase in r_i/r_o. The results from the analysis compare reasonably well with available theoretical and experimental data from the literature. (C) 2015 American Society of Civil Engineers.
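For context, the classical bearing-capacity superposition that these three factors enter (a generic strip-footing template, not the ring-footing expressions derived in the paper; c is cohesion, q the surcharge, gamma the unit weight, and B the footing width):

```latex
% Classical superposition of the cohesion, surcharge and unit-weight components:
q_u = c\,N_c + q\,N_q + \tfrac{1}{2}\,\gamma\,B\,N_\gamma
```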
Abstract:
Despite significant advances in recent years, structure-from-motion (SfM) pipelines suffer from two important drawbacks. Apart from requiring significant computational power to solve the large-scale computations involved, such pipelines sometimes fail to reconstruct correctly when the accumulated error in incremental reconstruction is large or when the number of 3D to 2D correspondences is insufficient. In this paper we present a novel approach to mitigate the above-mentioned drawbacks. Using an image match graph based on matching features, we partition the image dataset into smaller sets or components which are reconstructed independently. Following such reconstructions, we utilise the available epipolar relationships that connect images across components to correctly align the individual reconstructions in a global frame of reference. This results in a significant speed-up of at least one order of magnitude and also mitigates the problem of reconstruction failures, with only a marginal loss in accuracy. The effectiveness of our approach is demonstrated on some large-scale real world data sets.
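A schematic of the divide-and-align idea (illustrative only; the connected-component split and the similarity alignment below are simplified stand-ins for the pipeline described above): partition the image match graph, reconstruct each component independently, then align one component's reconstruction into another's frame using corresponding 3D points.

```python
import numpy as np
import networkx as nx

def umeyama_similarity(src, dst):
    """Least-squares similarity transform (s, R, t) mapping src -> dst;
    both are (N, 3) arrays of corresponding 3D points (Umeyama, 1991)."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    Xs, Xd = src - mu_s, dst - mu_d
    U, S, Vt = np.linalg.svd(Xd.T @ Xs / len(src))
    D = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:
        D[2, 2] = -1
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / Xs.var(0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

# Partition the image match graph into components to reconstruct independently.
G = nx.Graph()
G.add_weighted_edges_from([("img0", "img1", 120), ("img1", "img2", 80),
                           ("img3", "img4", 95)])   # edge weight = #feature matches
components = [sorted(c) for c in nx.connected_components(G)]
print(components)  # e.g. [['img0', 'img1', 'img2'], ['img3', 'img4']]

# After reconstructing each component, shared/epipolar-linked points provide
# correspondences used to align component B into component A's frame:
ptsA = np.random.rand(10, 3)                                 # placeholder points in A's frame
ptsB = 0.5 * ptsA @ np.eye(3) + np.array([1.0, 2.0, 3.0])    # same points, B's scaled/shifted frame
s, R, t = umeyama_similarity(ptsB, ptsA)                     # transform B's frame into A's
print(np.round(s, 3))                                        # recovers the scale factor (~2.0)
```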
Abstract:
Homogeneous temperature regions are necessary for use in hydrometeorological studies. The regions are often delineated by analysing statistics derived from time series of maximum, minimum or mean temperature, rather than attributes influencing temperature. This practice cannot yield meaningful regions in data-sparse areas. Further, independent validation of the delineated regions for homogeneity in temperature is not possible, as the temperature records form the basis to arrive at the regions. To address these issues, a two-stage clustering approach is proposed in this study to delineate homogeneous temperature regions. The first stage of the approach involves (1) determining the correlation structure between observed temperature over the study area and possible predictors (large-scale atmospheric variables) influencing the temperature and (2) using the correlation structure as the basis to delineate sites in the study area into clusters. The second stage of the approach involves analysis of each of the clusters to (1) identify potential predictors (large-scale atmospheric variables) influencing temperature at sites in the cluster and (2) partition the cluster into homogeneous fuzzy temperature regions using the identified potential predictors. Application of the proposed approach to India yielded 28 homogeneous regions that were demonstrated to be effective when compared to an alternate set of 6 regions that were previously delineated over the study area. Intersite cross-correlations of monthly maximum and minimum temperatures in the existing regions were found to be weak and negative for several months, which is undesirable. This problem was not found in the case of regions delineated using the proposed approach. The utility of the proposed regions in arriving at estimates of potential evapotranspiration for ungauged locations in the study area is demonstrated.
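A compact sketch of the two-stage idea (illustrative; the random data, predictor choice, and the plain/fuzzy k-means pairing are stand-ins for the procedure summarized above): stage 1 clusters sites by the correlation of their temperature series with large-scale predictors; stage 2 fuzzy-partitions each cluster using the predictors relevant to it.

```python
import numpy as np
from sklearn.cluster import KMeans

def fuzzy_cmeans(X, n_clusters, m=2.0, iters=100, seed=0):
    """Tiny fuzzy c-means: returns (centers, membership matrix U of shape (n, c))."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(n_clusters), size=len(X))
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(1, keepdims=True)
    return centers, U

# Stage 1: cluster sites by correlations of their temperature series with
# large-scale predictors (e.g., geopotential height, winds); placeholders here.
n_sites, n_months, n_predictors = 200, 360, 5
temps = np.random.randn(n_sites, n_months)
predictors = np.random.randn(n_predictors, n_months)
corr = np.array([[np.corrcoef(t, p)[0, 1] for p in predictors] for t in temps])
stage1 = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(corr)

# Stage 2: within each stage-1 cluster, fuzzy-partition the sites using the
# predictors identified as relevant for that cluster (here: all of them).
for c in range(4):
    members = np.where(stage1 == c)[0]
    _, U = fuzzy_cmeans(corr[members], n_clusters=2)
    print(f"cluster {c}: {len(members)} sites, fuzzy memberships shape {U.shape}")
```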
Abstract:
We study N = 2 compactifications of heterotic string theory on the CHL orbifold (K3 x T^2)/Z_N with N = 2, 3, 5, 7. Z_N acts as an automorphism on K3 together with a shift of 1/N along one of the circles of T^2. These compactifications generalize the example of the heterotic string on K3 x T^2 studied in the context of dualities in string theories. We evaluate the new supersymmetric index for these theories and show that their expansion can be written in terms of the McKay-Thompson series associated with the Z_N automorphism embedded in the Mathieu group M_24. We then evaluate the difference in one-loop threshold corrections to the non-Abelian gauge couplings with Wilson lines and show that their moduli dependence is captured by Siegel modular forms related to dyon partition functions of N = 4 string theories.
Abstract:
Restricted Boltzmann Machines (RBMs) can be used either as classifiers or as generative models. The quality of a generative RBM is measured through the average log-likelihood on test data. Due to the high computational complexity of evaluating the partition function, exact calculation of the test log-likelihood is very difficult. In recent years, several estimation methods have been suggested for approximate computation of the test log-likelihood. In this paper we present an empirical comparison of the main estimation methods, namely, the AIS algorithm for estimating the partition function, the CSL method for directly estimating the log-likelihood, and the RAISE algorithm that combines these two ideas.
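To make the bottleneck concrete (a toy sketch, not any of the estimators compared in the paper): for a binary RBM the hidden units can be summed out analytically, but the partition function still requires a sum over all 2^(number of visible units) configurations, which is only feasible for tiny models and is exactly what AIS, CSL and RAISE avoid.

```python
import numpy as np
from itertools import product

def exact_log_likelihood(v_data, W, b, c):
    """Exact average log-likelihood of a tiny binary RBM.
    W: (n_visible, n_hidden), b: visible bias, c: hidden bias.
    Free energy: F(v) = -b.v - sum_j log(1 + exp(c_j + v.W_j));  p(v) = e^{-F(v)} / Z."""
    def free_energy(v):
        return -(v @ b) - np.log1p(np.exp(c + v @ W)).sum()

    n_visible = W.shape[0]
    # Partition function: brute-force sum over all 2^n_visible visible vectors.
    all_v = np.array(list(product([0, 1], repeat=n_visible)), dtype=float)
    log_Z = np.logaddexp.reduce([-free_energy(v) for v in all_v])
    return np.mean([-free_energy(v) - log_Z for v in v_data])

rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((8, 4))            # 8 visible, 4 hidden units
b, c = np.zeros(8), np.zeros(4)
v_data = rng.integers(0, 2, size=(20, 8)).astype(float)
print(exact_log_likelihood(v_data, W, b, c))     # tractable only because 2^8 = 256
```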
Abstract:
The polyhedral model provides an expressive intermediate representation that is convenient for the analysis and subsequent transformation of affine loop nests. Several heuristics exist for achieving complex program transformations in this model. However, there is also considerable scope to utilize this model to tackle the problem of automatic memory footprint optimization. In this paper, we present a new automatic storage optimization technique which can be used to achieve both intra-array as well as inter-array storage reuse with a pre-determined schedule for the computation. Our approach works by finding statement-wise storage partitioning hyperplanes that partition a unified global array space so that values with overlapping live ranges are not mapped to the same partition. Our heuristic is driven by a fourfold objective function which not only minimizes the dimensionality and storage requirements of the arrays required for each high-level statement, but also maximizes inter-statement storage reuse. The storage mappings obtained using our heuristic can be asymptotically better than those obtained by any existing technique. We implement our technique and demonstrate its practical impact by evaluating its effectiveness on several benchmarks chosen from the domains of image processing, stencil computations, and high-performance computing.
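As a small illustration of the kind of storage reuse such mappings enable (a generic modulo-folding example, not the authors' heuristic): in a 1-D stencil that only reads the previous time step, the time dimension of the array can be folded to two partitions, because the live ranges of values two or more time steps apart never overlap.

```python
import numpy as np

def jacobi_1d(a0, timesteps):
    """1-D Jacobi stencil with the time dimension folded to 2 storage partitions:
    A[t][i] only depends on A[t-1][...], so A[t mod 2] can safely overwrite A[(t-2) mod 2]."""
    n = len(a0)
    buf = np.empty((2, n))          # folded storage: 2 x n instead of (timesteps+1) x n
    buf[0] = a0
    for t in range(1, timesteps + 1):
        cur, prev = buf[t % 2], buf[(t - 1) % 2]
        cur[1:-1] = (prev[:-2] + prev[1:-1] + prev[2:]) / 3.0
        cur[0], cur[-1] = prev[0], prev[-1]   # keep boundary values fixed
    return buf[timesteps % 2]

print(jacobi_1d(np.arange(8.0), timesteps=10))
```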