87 results for Critical Sets
Abstract:
In a seminal paper [10], Weitz gave a deterministic fully polynomial approximation scheme for counting exponentially weighted independent sets (which is the same as approximating the partition function of the hard-core model from statistical physics) in graphs of degree at most d, up to the critical activity for the uniqueness of the Gibbs measure on the infinite d-regular tree. More recently Sly [8] (see also [1]) showed that this is optimal in the sense that if there is an FPRAS for the hard-core partition function on graphs of maximum degree d for activities larger than the critical activity on the infinite d-regular tree then NP = RP. In this paper we extend Weitz's approach to derive a deterministic fully polynomial approximation scheme for the partition function of general two-state anti-ferromagnetic spin systems on graphs of maximum degree d, up to the corresponding critical point on the d-regular tree. The main ingredient of our result is a proof that for two-state anti-ferromagnetic spin systems on the d-regular tree, weak spatial mixing implies strong spatial mixing. This in turn uses a message-decay argument which extends a similar approach proposed recently for the hard-core model by Restrepo et al. [7] to the case of general two-state anti-ferromagnetic spin systems.
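For reference, the uniqueness threshold on the infinite d-regular tree that both the algorithmic and the hardness results above refer to has the closed form lambda_c(d) = (d-1)^(d-1) / (d-2)^d. A minimal sketch computing it (the helper name is ours, not from the paper):

def critical_activity(d: int) -> float:
    # Uniqueness threshold of the hard-core model on the infinite d-regular tree:
    # lambda_c(d) = (d-1)^(d-1) / (d-2)^d
    if d < 3:
        raise ValueError("the threshold is defined for degree d >= 3")
    return (d - 1) ** (d - 1) / (d - 2) ** d

for d in (3, 4, 5):
    print(d, critical_activity(d))   # e.g. d = 3 gives lambda_c = 4.0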
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
It has been proposed that the number of tropical cyclones as a function of the energy they release is a decreasing power-law function, up to a characteristic energy cutoff determined by the spatial size of the ocean basin in which the storm occurs. This means that no characteristic scale exists for the energy of tropical cyclones, except for the finite-size effects induced by the boundaries of the basins. This has important implications for the physics of tropical cyclones. We discuss to what extent tropical cyclones are related to critical phenomena (in the same way as earthquakes, rainfall, etc.), providing a consistent picture of the energy balance in the system. Moreover, this perspective allows one to visualize more clearly the effects of global warming on tropical-cyclone occurrence.
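As a rough illustration of the kind of statistics described above, assuming a power law with an exponential finite-size cutoff, P(E) proportional to E^(-alpha) exp(-E/E_cut); the exponent and cutoff below are placeholders, not values fitted in the paper:

import numpy as np

def pdf_power_law_cutoff(E, alpha, E_cut, E_min=1.0):
    # Unnormalized power-law density with an exponential cutoff set by basin size.
    E = np.asarray(E, dtype=float)
    return np.where(E >= E_min, E ** (-alpha) * np.exp(-E / E_cut), 0.0)

# Placeholder parameters, for illustration only.
energies = np.logspace(0, 4, 200)
density = pdf_power_law_cutoff(energies, alpha=1.2, E_cut=1.0e3)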
Abstract:
The amalgamation operation is frequently used to reduce the number of parts of compositional data, but it is a non-linear operation in the simplex with the usual geometry, the Aitchison geometry. The concept of balances between groups, a particular coordinate system designed over binary partitions of the parts, could be an alternative to the amalgamation in some cases. In this work we discuss the proper application of both concepts using a real data set corresponding to behavioral measures of pregnant sows.
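A minimal sketch of the two operations being contrasted, using the standard balance formula b = sqrt(rs/(r+s)) ln(g1/g2) between two groups of r and s parts; the 4-part composition below is purely illustrative, not the sow data set:

import numpy as np

def amalgamate(x, parts):
    # Amalgamation: sum the selected parts (non-linear in the Aitchison geometry).
    return float(np.sum(x[parts]))

def balance(x, group1, group2):
    # Balance between two groups of parts: sqrt(r*s/(r+s)) * ln(g1/g2),
    # where g1 and g2 are the geometric means of the r and s parts in each group.
    g1 = np.exp(np.mean(np.log(x[group1])))
    g2 = np.exp(np.mean(np.log(x[group2])))
    r, s = len(group1), len(group2)
    return float(np.sqrt(r * s / (r + s)) * np.log(g1 / g2))

x = np.array([0.40, 0.25, 0.20, 0.15])   # illustrative 4-part composition
print(amalgamate(x, [0, 1]))             # one amalgamated part
print(balance(x, [0, 1], [2, 3]))        # balance between the same two groups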
Abstract:
We study the damage-enhanced creep rupture of disordered materials by means of a fiber bundle model. Broken fibers undergo a slow stress relaxation modeled by a Maxwell element whose stress exponent m can vary in a broad range. Under global load sharing we show that, due to the strength disorder of fibers, the lifetime t_f of the bundle has sample-to-sample fluctuations characterized by a log-normal distribution independent of the type of disorder. We determine the Monkman-Grant relation of the model and establish a relation between the rupture life t_f and the characteristic time t_m of the intermediate creep regime of the bundle where the minimum strain rate is reached, making possible reliable estimates of t_f from short-term measurements. Approaching macroscopic failure, the deformation rate has a finite-time power-law singularity whose exponent is a decreasing function of m. On the microlevel the distribution of waiting times is found to have a power-law behavior with m-dependent exponents, different below and above the critical load of the bundle. Approaching the critical load from above, the cutoff value of the distributions has a power-law divergence whose exponent coincides with the stress exponent of the Maxwell elements.
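For context, the classical form of the Monkman-Grant relation connecting rupture life and minimum creep strain rate is shown below; the model-specific exponents and the t_f versus t_m relation derived in the paper are not reproduced here.

\[
  t_f \, \dot{\varepsilon}_{\min}^{\;\beta} = C_{\mathrm{MG}}, \qquad \beta \approx 1,
\]
% so that the rupture life t_f can be estimated from the minimum strain rate reached
% in the intermediate creep regime (around the characteristic time t_m).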
Abstract:
Topological indices have been applied to build QSAR models for a set of 20 antimalarial cyclic peroxy ketals. In order to evaluate the reliability of the proposed linear models, leave-n-out and Internal Test Sets (ITS) approaches have been considered. The proposed procedure resulted in a robust and consensus prediction equation, and it is shown here why it is superior to the standard cross-validation algorithms employed with multilinear regression models.
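A hedged sketch of the two validation schemes named above applied to a multilinear regression model; the descriptor matrix and activities here are random placeholders, not the 20-compound data set:

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeavePOut, ShuffleSplit, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))   # placeholder: 20 compounds x 3 topological indices
y = rng.normal(size=20)        # placeholder activities

model = LinearRegression()

# Leave-n-out (n = 2 here): every pair of compounds is left out once as a test set.
lno_scores = cross_val_score(model, X, y, cv=LeavePOut(p=2),
                             scoring="neg_mean_absolute_error")

# Internal Test Sets: repeated random splits drawn from the training data itself.
its_scores = cross_val_score(model, X, y,
                             cv=ShuffleSplit(n_splits=50, test_size=0.25, random_state=0),
                             scoring="neg_mean_absolute_error")

print(lno_scores.mean(), its_scores.mean())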
Abstract:
Different procedures to obtain atom-condensed Fukui functions are described. It is shown how the resulting values may differ depending on the exact approach to atom-condensed Fukui functions. The condensed Fukui function can be computed using either the fragment of molecular response approach or the response of molecular fragment approach. The two approaches are nonequivalent; only the latter corresponds in general with a population difference expression. The Mulliken approach does not depend on the approach taken but has some computational drawbacks. The different resulting expressions are tested for a wide set of molecules. In practice one must make seemingly arbitrary choices about how to compute condensed Fukui functions, which suggests questioning the role of these indicators in conceptual density-functional theory.
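A minimal sketch of the population-difference ("response of molecular fragment") expression mentioned above; the three population vectors would come from a population analysis of the (N-1)-, N- and (N+1)-electron systems at fixed geometry, and the numbers below are purely illustrative:

import numpy as np

def condensed_fukui(pop_Nm1, pop_N, pop_Np1):
    # Atom-condensed Fukui functions from electron populations p_k:
    #   f_k(+) = p_k(N+1) - p_k(N)   (response to nucleophilic attack)
    #   f_k(-) = p_k(N)   - p_k(N-1) (response to electrophilic attack)
    pop_Nm1, pop_N, pop_Np1 = map(np.asarray, (pop_Nm1, pop_N, pop_Np1))
    return pop_Np1 - pop_N, pop_N - pop_Nm1

# Illustrative populations for a three-atom fragment (not computed data).
f_plus, f_minus = condensed_fukui([6.90, 1.05, 1.05],
                                  [7.20, 0.90, 0.90],
                                  [7.95, 0.53, 0.52])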
Abstract:
The computational approach to the Hirshfeld [Theor. Chim. Acta 44, 129 (1977)] atom in a molecule is critically investigated, and several difficulties are highlighted. It is shown that these difficulties are mitigated by an alternative, iterative version of the Hirshfeld partitioning procedure. The iterative scheme ensures that the Hirshfeld definition represents a mathematically proper information entropy, allows the Hirshfeld approach to be used for charged molecules, eliminates arbitrariness in the choice of the promolecule, and increases the magnitudes of the charges. The resulting "Hirshfeld-I charges" correlate well with electrostatic-potential-derived atomic charges.
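A structural sketch, under our own assumptions about the inputs, of the self-consistent loop that defines such an iterative scheme: pro-atom densities are rebuilt from the current atomic charges until the charges stop changing. This is not the authors' implementation.

import numpy as np

def hirshfeld_i(rho_mol, grid_w, proatom_density, nuclear_charges,
                max_iter=200, tol=1e-6):
    # rho_mol:         molecular electron density sampled on an integration grid
    # grid_w:          quadrature weights of that grid
    # proatom_density: callable (atom_index, charge) -> pro-atom density on the grid,
    #                  interpolated between integer-charge isolated-atom densities
    # nuclear_charges: array of nuclear charges Z_A
    nuclear_charges = np.asarray(nuclear_charges, dtype=float)
    charges = np.zeros_like(nuclear_charges)        # start from the neutral promolecule
    for _ in range(max_iter):
        rho_pro = np.array([proatom_density(a, q) for a, q in enumerate(charges)])
        weights = rho_pro / np.maximum(rho_pro.sum(axis=0), 1e-300)   # Hirshfeld w_A(r)
        populations = np.array([np.sum(grid_w * w_a * rho_mol) for w_a in weights])
        new_charges = nuclear_charges - populations
        if np.max(np.abs(new_charges - charges)) < tol:               # self-consistency
            return new_charges
        charges = new_charges
    return charges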
Abstract:
The effect of basis set superposition error (BSSE) on molecular complexes is analyzed. The BSSE causes artificial delocalizations which modify the first-order electron density. The mechanism of this effect is assessed for the hydrogen fluoride dimer with several basis sets. The BSSE-corrected first-order electron density is obtained using the chemical Hamiltonian approach versions of the Roothaan and Kohn-Sham equations. The corrected densities are compared to uncorrected densities based on the charge density critical points. Contour difference maps between BSSE-corrected and uncorrected densities on the molecular plane are also plotted to gain insight into the effects of the BSSE correction on the electron density.
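For comparison only (this is the Boys-Bernardi counterpoise estimate, a different and more common correction scheme, not the chemical Hamiltonian approach used in the paper), the BSSE and the corrected interaction energy can be obtained from monomer energies recomputed in the full dimer (ghost-augmented) basis at the dimer geometry:

def counterpoise_bsse(e_A_in_A, e_B_in_B, e_A_in_AB, e_B_in_AB):
    # Artificial stabilisation each monomer gains from the partner's (ghost) basis.
    return (e_A_in_A - e_A_in_AB) + (e_B_in_B - e_B_in_AB)

def cp_interaction_energy(e_AB, e_A_in_AB, e_B_in_AB):
    # Counterpoise-corrected interaction energy of the complex AB
    # (monomer deformation terms neglected in this sketch).
    return e_AB - e_A_in_AB - e_B_in_AB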
Abstract:
Arising from either retrotransposition or genomic duplication of functional genes, pseudogenes are “genomic fossils” valuable for exploring the dynamics and evolution of genes and genomes. Pseudogene identification is an important problem in computational genomics, and is also critical for obtaining an accurate picture of a genome’s structure and function. However, no consensus computational scheme for defining and detecting pseudogenes has been developed thus far. As part of the ENCyclopedia Of DNA Elements (ENCODE) project, we have compared several distinct pseudogene annotation strategies and found that different approaches and parameters often resulted in rather distinct sets of pseudogenes. We subsequently developed a consensus approach for annotating pseudogenes (derived from protein-coding genes) in the ENCODE regions, resulting in 201 pseudogenes, two-thirds of which originated from retrotransposition. A survey of orthologs for these pseudogenes in 28 vertebrate genomes showed that a significant fraction (∼80%) of the processed pseudogenes are primate-specific sequences, highlighting the increasing retrotransposition activity in primates. Analysis of sequence conservation and variation also demonstrated that most pseudogenes evolve neutrally, and processed pseudogenes appear to have lost their coding potential immediately or soon after their emergence. In order to explore the functional implications of pseudogene prevalence, we have extensively examined the transcriptional activity of the ENCODE pseudogenes. We performed a systematic series of pseudogene-specific RACE analyses. These, together with complementary evidence derived from tiling microarrays and high-throughput sequencing, demonstrated that at least a fifth of the 201 pseudogenes are transcribed in one or more cell lines or tissues.
Abstract:
This paper presents several algorithms for joint estimation of the target number and state in a time-varying scenario. Building on the results presented in [1], which considers estimation of the target number only, we assume that not only the number of targets but also their state evolution must be estimated. In this context, we extend the Rao-Blackwellization procedure of [1] to this new scenario to compute the Bayes recursions, thus defining reduced-complexity solutions for the multi-target set estimator. A performance assessment is finally given both in terms of Circular Position Error Probability, aimed at evaluating the accuracy of the estimated tracks, and in terms of Cardinality Error Probability, aimed at evaluating the reliability of the target number estimates.
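As a generic illustration only (not the Rao-Blackwellized multi-target estimator of the paper or of [1]), one predict/update step of a discrete Bayes recursion over the number of targets might look as follows; the transition matrix and likelihood vector are assumptions supplied by the user:

import numpy as np

def cardinality_bayes_step(prior, transition, likelihood):
    # prior[k]:          P(k targets) before the new measurement set arrives
    # transition[i, j]:  P(k_t = j | k_{t-1} = i), modelling target births and deaths
    # likelihood[j]:     p(current measurement set | k_t = j)
    predicted = prior @ transition            # Chapman-Kolmogorov prediction
    posterior = predicted * likelihood        # measurement update
    return posterior / posterior.sum()        # normalise to a probability vector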
Abstract:
Although both are fundamental terms in the humanities and social sciences, discourse and knowledge have seldom been explicitly related, and even less so in critical discourse studies. After a brief summary of what we know about these relationships in linguistics, psychology, epistemology and the social sciences, with special emphasis on the role of knowledge in the formation of mental models as a basis for discourse, I examine in more detail how a critical study of discourse and knowledge may be articulated in critical discourse studies. Thus, several areas of critical epistemic discourse analysis are identified and then applied in a study of Tony Blair’s Iraq speech of March 18, 2003, in which he sought to legitimatize his decision to go to war in Iraq alongside George Bush. The analysis shows the various ways in which knowledge is managed and manipulated at all levels of discourse in this speech.
Abstract:
We consider the application of normal-theory methods to the estimation and testing of a general type of multivariate regression models with errors-in-variables, in the case where various data sets are merged into a single analysis and the observable variables deviate possibly from normality. The various samples to be merged can differ in the set of observable variables available. We show that there is a convenient way to parameterize the model so that, despite the possible non-normality of the data, normal-theory methods yield correct inferences for the parameters of interest and for the goodness-of-fit test. The theory described encompasses both the functional and structural model cases, and can be implemented using standard software for structural equations models, such as LISREL, EQS, and LISCOMP, among others. An illustration with Monte Carlo data is presented.
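For orientation, a generic linear errors-in-variables regression can be written as a structural equation model in the spirit of the LISREL-type software mentioned above; this is a textbook form, not the paper's specific parameterization:

\begin{align*}
  \text{structural part:}  \quad & y^{*} = \alpha + \beta^{\top}\xi + \zeta, \\
  \text{measurement part:} \quad & x = \xi + \delta, \qquad y = y^{*} + \varepsilon,
\end{align*}
% with latent true regressors \xi, measurement errors \delta and \varepsilon,
% and equation disturbance \zeta, assumed mutually uncorrelated.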
Abstract:
In the homogeneous case of one-dimensional objects, we show that any preference relation that is positive and homothetic can be represented by a quantitative utility function and a unique bias. This bias may favor or disfavor the preference for an object. In the first case, preferences are complete but not transitive, and an object may be preferred even when its utility is lower. In the second case, preferences are asymmetric and transitive but not negatively transitive, and it may not be sufficient for an object to have a greater utility to be preferred. In this manner, the bias reflects the extent to which preferences depart from the maximization of a utility function.
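One plausible way to write such a representation (an illustrative assumption on our part, not necessarily the exact form in the paper) is with a multiplicative bias sigma > 0:

\[
  x \succ y \iff u(x) > \sigma\, u(y).
\]
% sigma < 1: the bias favors preference; the relation is complete but need not be
%            transitive, and x can be preferred even though u(x) < u(y).
% sigma > 1: the bias disfavors preference; the relation is asymmetric and transitive
%            but not negatively transitive, and u(x) > u(y) alone need not suffice.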
Abstract:
Recent decisions by the Spanish national competition authority (TDC) mandate payment systems to include only two costs when setting their domestic multilateral interchange fees (MIF): a fixed processing cost and a variable cost for the risk of fraud. This artificial lowering of MIFs will not lower consumer prices, because of uncompetitive retailing; it will however lead to higher cardholder fees and, likely, new prices for point-of-sale terminals, delaying the development of the immature Spanish card market. Also, to the extent that increased cardholder fees do not offset the fall in MIF revenue, the task of issuing new cards will be underpaid relative to the task of acquiring new merchants, causing an imbalance between the two sides of the networks. Moreover, the pricing scheme arising from the decisions will cause unbundling and underprovision of those services whose costs are excluded. Indeed, the payment guarantee and the free funding period will tend to be removed from the package of services currently provided, to be either provided by third parties, provided by issuers for a separate fee, or not provided at all, especially to small and medium-sized merchants. Transaction services will also suffer the consequences, since the TDC precludes pricing them in variable terms.