223 results for Algorithmic Graph Theory
Abstract:
The first discussion of compositional data analysis is attributable to Karl Pearson, in 1897. However, notwithstanding the recent developments on the algebraic structure of the simplex, more than twenty years after Aitchison's idea of log-transformations of closed data, the scientific literature is again full of statistical treatments of this type of data using traditional methodologies. This is particularly true in environmental geochemistry where, besides the problem of closure, the spatial structure (dependence) of the data has to be considered. In this work we propose the use of log-contrast values, obtained by a simplicial principal component analysis, as indicators of given environmental conditions. The investigation of the log-contrast frequency distributions allows pointing out the statistical laws able to generate the values and to govern their variability. The changes, if compared, for example, with the mean values of the random variables assumed as models, or with other reference parameters, allow defining monitors to be used to assess the extent of possible environmental contamination. A case study on running and ground waters from Chiavenna Valley (Northern Italy), using Na+, K+, Ca2+, Mg2+, HCO3-, SO42- and Cl- concentrations, is illustrated.
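As a minimal sketch of the closure problem and the log-ratio approach this abstract builds on, the centred log-ratio (clr) transform can be written as follows. The data here are illustrative, not taken from the study:

```python
import math

def closure(parts):
    """Normalize a composition so its parts sum to 1 (the closure operation)."""
    total = sum(parts)
    return [p / total for p in parts]

def clr(parts):
    """Centred log-ratio transform: log of each part over the geometric mean.

    Maps closed (compositional) data from the simplex to ordinary real space,
    where standard statistical tools can be applied."""
    comp = closure(parts)
    g = math.exp(sum(math.log(p) for p in comp) / len(comp))
    return [math.log(p / g) for p in comp]

# Hypothetical ion concentrations (mg/L), e.g. Na+, Ca2+, Cl-
sample = [12.0, 45.0, 8.0]
print(clr(sample))
```

By construction, clr values always sum to zero, which is the constraint that simplicial principal component analysis works within.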
Abstract:
This document contains a report and summary of the field research activities in a rural community of rice farmers in Kampot province, Cambodia in 2011, which I conducted within the context of my PhD research at ICTA-UAB (Institute of Environmental Science and Technology, Autonomous University of Barcelona, Spain). The purpose of the field research was to gather data for a MuSIASEM analysis (Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism) at the village and household level, in order to analyze the multidimensional challenges that small farmers may face nowadays within the context of global rural change and declining access to land. While the literature on MuSIASEM offers a great variety of theoretical explanations and practical applications, there is little information available for students regarding the practical steps required for doing a MuSIASEM analysis at the local level. Within this context, this report offers not only documentation of the field research design and data collection methods, but also a general overview of some organizational and preparatory aspects, including some personal reflections, that one may face when preparing and conducting field research for a MuSIASEM analysis. In summary, this document serves three objectives: (i) to ensure methodological transparency for future work based on the data collected during the field research, (ii) to share my personal experience of the preparatory and practical steps required for field research and data collection for a MuSIASEM analysis at the local level, and (iii) to make more detailed background information on the case study village available to interested readers.
Abstract:
The classical wave-of-advance model of the neolithic transition (i.e., the shift from hunter-gatherer to agricultural economies) is based on Fisher's reaction-diffusion equation. Here we present an extension of Einstein's approach to Fickian diffusion, incorporating reaction terms. On this basis we show that second-order terms in the reaction-diffusion equation, which have been neglected up to now, are not in fact negligible but can lead to important corrections. The resulting time-delayed model agrees quite well with observations.
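For reference, Fisher's reaction-diffusion equation underlying the classical wave-of-advance model can be written in its standard form (quoted here generically, not from the paper) as

```latex
\frac{\partial p}{\partial t} = D\,\nabla^2 p + a\,p\left(1 - \frac{p}{p_{\max}}\right),
```

where \(p\) is the population density, \(D\) the diffusion coefficient, \(a\) the initial growth rate, and \(p_{\max}\) the carrying capacity; the second-order (time-delay) corrections discussed in the abstract modify the left-hand side of this equation.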
Abstract:
The performance of the SAOP potential for the calculation of NMR chemical shifts was evaluated. SAOP results show considerable improvement with respect to previous potentials, like VWN or BP86, at least for the carbon, nitrogen, oxygen, and fluorine chemical shifts. Furthermore, a few NMR calculations carried out on third-period atoms (S, P, and Cl) improved when using the SAOP potential.
Abstract:
The basis set superposition error-free second-order Møller-Plesset perturbation theory of intermolecular interactions was studied. The difficulties of the counterpoise (CP) correction in open-shell systems were also discussed. The calculations were performed by a program which was used for testing the new variants of the theory. It was shown that the CP correction for the diabatic surfaces should be preferred to the adiabatic ones.
Abstract:
A comparative systematic study of the CrO2F2 compound has been performed using different conventional ab initio methodologies and density functional procedures. Two points have been analyzed: first, the accuracy of the results yielded by each method under study, and second, the computational cost required to reach such results. Weighing up both aspects, density functional theory has been found to be more appropriate than the Hartree-Fock (HF) and the analyzed post-HF methods. Hence, the structural characterization and spectroscopic elucidation of the full CrO2X2 series (X=F,Cl,Br,I) has been done at this level of theory. Emphasis has been given to the unknown CrO2I2 species, and especially to the UV/visible spectra of all four compounds. Furthermore, a topological analysis in terms of charge density distributions has revealed why the valence shell electron pair repulsion model fails in predicting the molecular shape of such CrO2X2 complexes.
Abstract:
A conceptually new approach is introduced for the decomposition of the molecular energy calculated at the density functional level of theory into a sum of one- and two-atomic energy components, and is realized in the "fuzzy atoms" framework. (Fuzzy atoms mean that the three-dimensional physical space is divided into atomic regions having no sharp boundaries but exhibiting a continuous transition from one to another.) The new scheme uses the new concept of "bond order density" to calculate the diatomic exchange energy components and gives values unexpectedly close to those calculated by the exact (Hartree-Fock) exchange for the same Kohn-Sham orbitals.
Abstract:
The empirical evidence testing the validity of the rational partisan theory (RPT) has been mixed. In this article, we argue that the inclusion of other macroeconomic policies and the presence of an independent central bank can partly explain this inconclusiveness. This article expands Alesina's (1987) RPT model to include an extra policy and an independent central bank. With these extensions, the implications of RPT are altered significantly. In particular, when the central bank is more concerned about output than public spending (an assumption made by many papers in this literature), then the direct relationship between inflation and output derived in Alesina (1987) never holds. Keywords: central bank, conservativeness, political uncertainty. JEL Classification: E58, E63.
Abstract:
We present building blocks for algorithms for the efficient reduction of square factors, i.e., direct repetitions in strings. So the basic problem is this: given a string, compute all strings that can be obtained by reducing factors of the form zz to z. Two types of algorithms are treated: an offline algorithm is one that can compute a data structure on the given string in advance, before the actual search for the square begins; in contrast, online algorithms receive all input only at the time when a request is made. For offline algorithms we treat the following problem: Let u and w be two strings such that w is obtained from u by reducing a square factor zz to only z. If we are further given the suffix table of u, how can we derive the suffix table for w without computing it from scratch? As the suffix table plays a key role in online algorithms for the detection of squares in a string, this derivation can make the iterated reduction of squares more efficient. On the other hand, we also show how a suffix array, used for the offline detection of squares, can be adapted to the new string resulting from the deletion of a square. Because the deletion is a very local change, this adaptation is more efficient than the computation of the new suffix array from scratch.
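To make the basic problem concrete, a brute-force sketch (not the paper's suffix-table method, which avoids this recomputation) enumerates every square factor zz and reduces it to z:

```python
def reduce_squares(s):
    """Return all strings obtainable by reducing one square factor zz to z.

    Naive O(n^3) sketch: for every start position i and length L, test
    whether s[i:i+L] equals s[i+L:i+2L]; if so, the square zz starting at i
    can be reduced to a single copy of z.
    """
    results = set()
    n = len(s)
    for i in range(n):
        for L in range(1, (n - i) // 2 + 1):
            if s[i:i + L] == s[i + L:i + 2 * L]:
                results.add(s[:i + L] + s[i + 2 * L:])
    return results

print(sorted(reduce_squares("aabb")))  # → ['aab', 'abb']
```

Iterating this reduction is where the suffix-table derivation pays off: each deletion is a local change, so rebuilding the index from scratch after every step would dominate the running time.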
Abstract:
A new graph-based construction of generalized low density codes (GLD-Tanner) with binary BCH constituents is described. The proposed family of GLD codes is optimal on block erasure channels and quasi-optimal on block fading channels. Optimality is considered in the outage probability sense. A classical GLD code for ergodic channels (e.g., the AWGN channel, the i.i.d. Rayleigh fading channel, and the i.i.d. binary erasure channel) is built by connecting bit nodes and subcode nodes via a unique random edge permutation. In the proposed construction of full-diversity GLD codes (referred to as root GLD), bit nodes are divided into 4 classes, subcodes are divided into 2 classes, and finally both sides of the Tanner graph are linked via 4 random edge permutations. The study focuses on non-ergodic channels with two states and can be easily extended to channels with 3 states or more.
Abstract:
The paper presents a competence-based instructional design system and a way to provide a personalization of navigation in the course content. The navigation aid tool builds on the competence graph and the student model, which includes the elements of uncertainty in the assessment of students. An individualized navigation graph is constructed for each student, suggesting the competences the student is more prepared to study. We use fuzzy set theory for dealing with uncertainty. The marks of the assessment tests are transformed into linguistic terms and used for assigning values to linguistic variables. For each competence, the level of difficulty and the level of knowing its prerequisites are calculated based on the assessment marks. Using these linguistic variables and approximate reasoning (fuzzy IF-THEN rules), a crisp category is assigned to each competence regarding its level of recommendation.
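The rule mechanism described above can be sketched with one toy fuzzy IF-THEN rule. All membership shapes, thresholds, and the 0-10 mark scale below are illustrative assumptions, not values from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def recommend(prereq_mark, difficulty):
    """Toy rule: IF prerequisites are well known AND difficulty is low,
    THEN the competence is recommended.

    Marks are on a hypothetical 0-10 scale; fuzzy AND is taken as min,
    and the crisp category is obtained by thresholding the rule strength."""
    knows_prereq = tri(prereq_mark, 4, 10, 16)    # membership in 'high knowledge'
    low_difficulty = tri(difficulty, -6, 0, 6)    # membership in 'low difficulty'
    strength = min(knows_prereq, low_difficulty)  # fuzzy AND
    return "recommended" if strength > 0.6 else "not yet"

print(recommend(9, 1))   # strong prerequisites, easy competence
print(recommend(5, 5))   # weak prerequisites, harder competence
```

A real system, as in the paper, would aggregate several such rules per competence and map the result to a graded level of recommendation rather than a binary one.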
Abstract:
This work develops a proposed translation for the dubbing of the pilot episode of The Big Bang Theory, which combines colloquial and scientific language. The aim is twofold: to craft colloquial language that is credible yet genuine, and to use appropriate Catalan equivalents for the original scientific terms.
Abstract:
Returns to scale to capital and the strength of capital externalities play a key role in the empirical predictions and policy implications of different growth theories. We show that both can be identified with individual wage data, and we implement our approach at the city level using US Census data on individuals in 173 cities for 1970, 1980, and 1990. Estimation takes into account fixed effects, endogeneity of capital accumulation, and measurement error. We find no evidence for human or physical capital externalities, and decreasing aggregate returns to capital. Returns to scale to physical and human capital are around 80 percent. We also find strong complementarities between human capital and labor, and substantial total employment externalities.