Abstract:
The Hardy-Weinberg law, formulated about 100 years ago, states that under certain assumptions the three genotypes AA, AB and BB at a bi-allelic locus are expected to occur in the proportions p², 2pq, and q² respectively, where p is the allele frequency of A, and q = 1 − p. Many statistical tests are used to check whether empirical marker data obey the Hardy-Weinberg principle. Among these are the classical chi-square test (with or without continuity correction), the likelihood ratio test, Fisher's exact test, and exact tests in combination with Monte Carlo and Markov chain algorithms. Tests for Hardy-Weinberg equilibrium (HWE) are numerical in nature, requiring the computation of a test statistic and a p-value. There is, however, ample space for the use of graphics in HWE tests, in particular for the ternary plot. Nowadays, many genetic studies use genetic markers known as Single Nucleotide Polymorphisms (SNPs). SNP data come in the form of counts, but from the counts one typically computes genotype frequencies and allele frequencies. These frequencies satisfy the unit-sum constraint, and their analysis therefore falls within the realm of compositional data analysis (Aitchison, 1986). SNPs are usually bi-allelic, which implies that the genotype frequencies can be adequately represented in a ternary plot. Compositions that are in exact HWE describe a parabola in the ternary plot. Compositions for which HWE cannot be rejected in a statistical test are typically "close" to the parabola, whereas compositions that differ significantly from HWE are "far". By rewriting the statistics used to test for HWE in terms of heterozygote frequencies, acceptance regions for HWE can be obtained that can be depicted in the ternary plot. This way, compositions can be tested for HWE purely on the basis of their position in the ternary plot (Graffelman & Morales, 2008). This leads to attractive graphical representations in which large numbers of SNPs can be tested for HWE in a single graph. Several examples of graphical tests for HWE (implemented in R software) will be shown, using SNP data from different human populations.
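The abstract's graphical tests are implemented in R; the sketch below only illustrates the classical numerical chi-square test it starts from, in Python, with hypothetical genotype counts for a single bi-allelic SNP (the counts and function name are invented for illustration):

```python
# Classical chi-square test for Hardy-Weinberg equilibrium (no continuity
# correction), sketched for a single bi-allelic SNP. Counts are hypothetical.
from scipy.stats import chi2

def hwe_chisq(n_aa, n_ab, n_bb):
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)           # allele frequency of A
    q = 1 - p                                 # allele frequency of B
    expected = (n * p**2, n * 2 * p * q, n * q**2)
    observed = (n_aa, n_ab, n_bb)
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    # 3 genotype classes - 1 constraint - 1 estimated allele frequency = 1 df
    return stat, chi2.sf(stat, df=1)

stat, pval = hwe_chisq(298, 489, 213)
print(f"chi-square = {stat:.3f}, p = {pval:.4f}")
```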
Abstract:
It is shown that, thanks to an extension of the definition of Topological Molecular Indices, one arrives at the formulation of indices related to the theory of Quantum Molecular Similarity. The connection between the two methodologies is made explicit: a theoretical framework solidly grounded in the theory of Quantum Mechanics can be connected with one of the oldest techniques used in QSPR studies. Results are shown for two example cases of application of both methodologies.
Abstract:
Developments in the statistical analysis of compositional data over the last two decades have made possible a much deeper exploration of the nature of variability, and of the possible processes associated with compositional data sets from many disciplines. In this paper we concentrate on geochemical data sets. First we explain how hypotheses of compositional variability may be formulated within the natural sample space, the unit simplex, including useful hypotheses of subcompositional discrimination and specific perturbational change. Then we develop, through standard methodology such as generalised likelihood ratio tests, statistical tools to allow the systematic investigation of a complete lattice of such hypotheses. Some of these tests are simple adaptations of existing multivariate tests, but others require special construction. We comment on the use of graphical methods in compositional data analysis and on the ordination of specimens. The recent development of the concept of compositional processes is then explained, together with the necessary tools for a staying-in-the-simplex approach, namely compositional singular value decompositions. All these statistical techniques are illustrated for a substantial compositional data set, consisting of 209 major-oxide and rare-element compositions of metamorphosed limestones from the Northeast and Central Highlands of Scotland. Finally we point out a number of unresolved problems in the statistical analysis of compositional processes.
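As a small illustration of the staying-in-the-simplex toolkit mentioned above, this sketch applies a centred log-ratio transform followed by a singular value decomposition to a tiny hypothetical 3-part compositional data set (the compositions are invented, not from the paper's limestone data):

```python
import numpy as np

# Hypothetical 3-part compositions; each row lies on the unit simplex.
X = np.array([[0.60, 0.30, 0.10],
              [0.55, 0.35, 0.10],
              [0.20, 0.30, 0.50],
              [0.25, 0.25, 0.50]])

# Centred log-ratio (clr) transform: log parts minus their row mean.
clr = np.log(X) - np.log(X).mean(axis=1, keepdims=True)

# Centre the clr data and decompose.
U, s, Vt = np.linalg.svd(clr - clr.mean(axis=0), full_matrices=False)
print(s)  # singular values: dominant directions of compositional variability
```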
Abstract:
This paper presents an application of the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) approach to the estimation of quantities of Gross Value Added (GVA) referring to economic entities defined at different scales of study. The method first estimates benchmark values of the pace of GVA generation per hour of labour across economic sectors. These values are estimated as intensive variables (e.g. €/hour) by dividing the sectorial GVA of the country (expressed in € per year) by the hours of paid work in that same sector per year. This assessment is obtained using data from national statistics (top-down information referring to the national level). Then, the approach uses bottom-up information (the number of hours of paid work in the various economic sectors of an economic entity, e.g. a city or a province, operating within the country) to estimate the amount of GVA produced by that entity. This estimate is obtained by multiplying the number of hours of work in each sector of the economic entity by the benchmark value of GVA generation per hour of work of that particular sector (national average). This method is applied and tested on two different socio-economic systems: (i) Catalonia (considered level n) and Barcelona (considered level n-1); and (ii) the region of Lima (considered level n) and Lima Metropolitan Area (considered level n-1). In both cases, the GVA per year of the local economic entity (Barcelona and Lima Metropolitan Area) is estimated and the resulting value is compared with GVA data provided by statistical offices. The empirical analysis seems to validate the approach, even though the case of Lima Metropolitan Area indicates a need for additional care when dealing with the estimate of GVA in primary sectors (agriculture and mining).
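A minimal sketch of the two-step estimate described above; all sector names and figures are hypothetical placeholders, not taken from the Catalonia or Lima case studies:

```python
# Top-down: national GVA (million EUR/year) and paid work (million hours/year).
national_gva   = {"agriculture": 30_000, "industry": 180_000, "services": 700_000}
national_hours = {"agriculture": 1_500,  "industry": 6_000,   "services": 25_000}

# Benchmark pace of GVA generation per hour of labour (EUR/hour), by sector.
benchmark = {s: national_gva[s] / national_hours[s] for s in national_gva}

# Bottom-up: hours of paid work in the local entity (million hours/year).
local_hours = {"agriculture": 10, "industry": 400, "services": 2_500}

# Estimated local GVA: hours in each sector times the national benchmark.
local_gva = sum(local_hours[s] * benchmark[s] for s in local_hours)
print(f"Estimated local GVA: {local_gva:,.0f} million EUR/year")
```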
Abstract:
A method to estimate an extreme quantile that requires no distributional assumptions is presented. The approach is based on transformed kernel estimation of the cumulative distribution function (cdf). The proposed method consists of a double transformation kernel estimation. We derive optimal bandwidth selection methods that have a direct expression for the smoothing parameter. The bandwidth can adapt to the given quantile level. The procedure is useful for large data sets and improves quantile estimation compared to other methods for heavy-tailed distributions. Implementation is straightforward and R programs are available.
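The paper's double transformation and optimal bandwidth selector are not reproduced here; the sketch below substitutes a single log transformation and a rule-of-thumb bandwidth to illustrate the general transformed-kernel idea: transform to a lighter tail, kernel-estimate the cdf, invert, and back-transform.

```python
import numpy as np
from scipy.stats import norm

def transformed_kernel_quantile(x, level, grid_size=2000):
    """Estimate an extreme quantile of heavy-tailed positive data `x`."""
    z = np.log(x)                            # transform to a lighter tail
    h = 1.06 * z.std() * len(z) ** (-0.2)    # rule-of-thumb bandwidth
    grid = np.linspace(z.min() - 4 * h, z.max() + 4 * h, grid_size)
    # Kernel cdf estimate: average of Gaussian cdfs centred at the data.
    F = norm.cdf((grid[:, None] - z[None, :]) / h).mean(axis=1)
    return np.exp(np.interp(level, F, grid))  # invert cdf, back-transform

rng = np.random.default_rng(1)
sample = rng.pareto(2.0, size=2000) + 1.0    # heavy-tailed test sample
print(transformed_kernel_quantile(sample, 0.999))
```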
Abstract:
Much remains to be done to make a strategic transformation of Europe's energy system possible, but just as important as long-term binding targets for renewable energy sources (RES) and GHG reductions are strong, binding energy efficiency targets, not only for 2020 but also for 2030, 2040 and 2050, as these would help lock in the growth of renewables in total energy consumption and reduce Europe's total GHG emissions in general, and those of the energy sector in particular, which is still one of the largest GHG emitters of all sectors. The recast Directive, planned for 2011/12, should be a good window of opportunity to finally establish binding energy efficiency targets, the only pillar still missing in the strongly interdependent EU energy and climate strategy based on GHG reduction, renewables and energy efficiency.
Abstract:
This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data generating process. We assume that all agents employ the data that they observe (which may be distinct for different sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
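As one concrete instance of the kind of statistical learning algorithm such agents could employ, this sketch runs textbook recursive least squares on a simulated linear perceived law of motion (the model and all numbers are stand-ins, not the paper's New Keynesian setup):

```python
import numpy as np

rng = np.random.default_rng(0)
b_true = 0.8         # coefficient of the (simulated) true process
b_hat, R = 0.0, 1.0  # agent's belief and regressor second-moment estimate

for t in range(1, 501):
    x = rng.normal()
    y = b_true * x + 0.1 * rng.normal()        # new observation arrives
    gain = 1.0 / t                             # decreasing gain (least squares)
    R += gain * (x * x - R)                    # update regressor moment
    b_hat += gain * (x / R) * (y - b_hat * x)  # revise belief from forecast error

print(b_hat)  # converges toward 0.8 as data accumulate
```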
Abstract:
Within the framework of research on identities and new citizenship, the aim of this work is to identify key elements of the processes of national identification in Catalonia among the newly arrived population, and to identify the factors that favour their community ties. The first block of the research analyses the sociological perspective on the idea of identity, as well as the evolution of identity discourse in Catalonia, in order to develop in greater depth the debate on identity and nation in relation to migration and existing social diversity. In a second block of analysis, the research examines a dozen cases studied through life histories drawn from the various migratory realities of our country, corresponding to the different waves of the twentieth century, covering both Spanish immigration and immigration from outside the Spanish state. The aim of this second block is to extract, from individual interpretations, social experiences that indicate which elements enable or hinder social mobility, which experiences carry identity value and of what kind, and which phenomena become relevant in shaping spaces of reference and national identification at the individual level. Finally, a section of in-depth interviews with various relevant social and political actors is included in order to identify the elements at the centre of the discourse on the "new common public culture". The objective is to interrelate current political and philosophical discourse with people's biographical experience, pointing out the elements that have been significant for their identity as Catalans or, on the contrary, have not favoured such an outcome. In this sense, the question linking this space of reflection is: which elements favour a national identification that embraces the existing social diversity, and which mechanisms of adhesion could work?
Abstract:
When one wishes to implement public policies, there is a prior need to compare different actions and to value and evaluate them in order to assess their social attractiveness. Recently the concept of well-being has been proposed as a multidimensional proxy for measuring societal prosperity and progress; a key research topic is then how we can measure and evaluate this plurality of dimensions for policy decisions. This paper defends a thesis articulated in the following points: 1. Different metrics are linked to different objectives and values. Using only one measurement unit (on the grounds of the so-called commensurability principle) to incorporate a plurality of dimensions, objectives and values necessarily implies reductionism. 2. Point 1 can be proven as a matter of formal logic by drawing on the work of Geach on moral philosophy. This theoretical demonstration is an original contribution of this article. Here the distinction between predicative and attributive adjectives is formalised and definitions are provided. Predicative adjectives are further distinguished into absolute and relative ones. The new concepts of set commensurability and rod commensurability are also introduced. 3. The existence of a plurality of social actors with an interest in the policy being assessed means that social decisions involve multiple types of values, of which economic efficiency is only one. It is therefore misleading to make social decisions based on that one value alone. 4. Weak comparability of values, which is grounded on incommensurability, is shown to be the main methodological foundation of policy evaluation in the framework of well-being economics. Incommensurability does not imply incomparability; on the contrary, incommensurability is the only rational way to compare societal options under a plurality of policy objectives. 5. Weak comparability can be implemented by using multi-criteria evaluation, which is a formal framework for applied consequentialism under incommensurability. Social Multi-Criteria Evaluation, in particular, allows considering both technical and social incommensurabilities simultaneously.
Abstract:
One of the strategies of Universitat Pompeu Fabra to support quality learning has been the creation of Units for the Support of Teaching Quality and Innovation within each faculty. In the seminar we will present the role and activities of the Polytechnic School Unit in charge of coordinating the efforts towards quality learning in the Information and Communication Technologies (ICT) Engineering Studies. We will also discuss how these activities are reported to relevant academic stakeholders.
Abstract:
The spectral efficiency achievable with joint processing of pilot and data symbol observations is compared with that achievable through the conventional (separate) approach of first estimating the channel on the basis of the pilot symbols alone, and subsequently detecting the data symbols. Studied on the basis of a mutual information lower bound, joint processing is found to provide a non-negligible advantage relative to separate processing, particularly for fast fading. It is shown that, regardless of the fading rate, only a very small number of pilot symbols (at most one per transmit antenna and per channel coherence interval) should be transmitted if joint processing is allowed.
Abstract:
The aim of this paper is to examine the pros and cons of book and fair value accounting from the perspective of the theory of banking. We consider the implications of the two accounting methods in an overlapping generations environment. As observed by Allen and Gale (1997), in an overlapping generations model banks have a role as intergenerational connectors, as they allow for intertemporal smoothing. Our main result is that when dividends depend on profits, book value ex ante dominates fair value, as it provides better intertemporal smoothing. This contrasts with the standard view that fair value yields a better allocation because it reflects the real opportunity cost of assets. Banking regulation plays an important role by providing the right incentives for banks to smooth intertemporal consumption, whereas market discipline improves intratemporal efficiency.
Abstract:
We present a polyhedral framework for establishing general structural properties of optimal solutions of stochastic scheduling problems in which multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that, if performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (which extends Klimov's), taking as input the linear objective coefficients, which (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.
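The simplest instance of a priority policy characterized by class-ranking indices is the classical c-mu rule for multiclass scheduling; the sketch below computes such indices for hypothetical holding costs and service rates (this is not the paper's adaptive-greedy algorithm, only the index-policy idea it generalizes):

```python
# Hypothetical job classes: (holding cost c, service rate mu).
classes = {"A": (3.0, 0.5), "B": (1.0, 2.0), "C": (4.0, 1.0)}

# Class-ranking index c * mu; serve classes in decreasing index order.
indices = {k: c * mu for k, (c, mu) in classes.items()}
priority = sorted(indices, key=indices.get, reverse=True)
print(indices)   # {'A': 1.5, 'B': 2.0, 'C': 4.0}
print(priority)  # ['C', 'B', 'A']
```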
Abstract:
Our task in this paper is to analyze the organization of trading in the era of quantitative finance. To do so, we conduct an ethnography of arbitrage, the trading strategy that best exemplifies finance in the wake of the quantitative revolution. In contrast to value and momentum investing, we argue, arbitrage involves an art of association: the construction of equivalence (comparability) of properties across different assets. In place of essential or relational characteristics, the peculiar valuation that takes place in arbitrage is based on an operation that makes something the measure of something else, associating securities to each other. The process of recognizing opportunities and the practices of making novel associations are shaped by the specific socio-spatial and socio-technical configurations of the trading room. Calculation is distributed across persons and instruments as the trading room organizes interaction among diverse principles of valuation.
Weak and Strong Altruism in Trait Groups: Reproductive Suicide, Personal Fitness, and Expected Value
Abstract:
A simple variant of trait group selection, employing predators as the mechanism underlying group selection, supports contingent reproductive suicide as altruism (i.e., behavior lowering personal fitness while augmenting that of another) without kin assortment. The contingent suicidal type may either saturate the population or be polymorphic with a type avoiding suicide, depending on parameters. In addition to contingent suicide, this randomly assorting morph may also exhibit continuously expressed strong altruism (sensu Wilson 1979), usually thought to be restricted to kin selection. The model will not, however, support a sterile worker caste as such, where sterility occurs before life-history events associated with effective altruism; reproductive suicide must remain fundamentally contingent (facultative sensu West-Eberhard 1987; Myles 1988) under random assortment. The continuously expressed strong altruism supported by the model may be reinterpreted as the probability of arbitrarily committing reproductive suicide, without benefit to another; such arbitrary suicide (a "load" on "adaptive" suicide) is viable only under a more restricted parameter space relative to the necessarily concomitant adaptive contingent suicide.