964 results for statistical framework
Abstract:
When matter is heated to high temperatures, its behavior changes dramatically. The Standard Model fields undergo phase transitions, in which the strongly interacting quarks and gluons are liberated from their confinement within hadrons and the Higgs field condensate melts, restoring the electroweak symmetry. The theoretical framework for describing matter under these extreme conditions is thermal field theory, which combines relativistic field theory and quantum statistical mechanics. For static observables the physics simplifies at very high temperatures, and an effective three-dimensional theory can be used instead of the full four-dimensional one via a method called dimensional reduction. In this thesis dimensional reduction is applied to two distinct problems: the pressure of electroweak theory and the screening masses of mesonic operators in quantum chromodynamics (QCD). The introductory part contains a brief review of finite-temperature field theory, dimensional reduction and the central results, while the details of the computations are contained in the original research papers. The electroweak pressure is shown to converge well to a value slightly below the ideal-gas result, whereas the pressure of the full Standard Model is dominated by the QCD pressure, which has worse convergence properties. For the mesonic screening masses a small positive perturbative correction is found, and the interpretation of dimensional reduction in the fermionic sector is discussed.
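As a hedged aside, the mechanism behind dimensional reduction can be stated compactly; the relations below are standard textbook material, not formulas taken from this thesis:

```latex
% At temperature T, Euclidean time \tau is periodic with period \beta = 1/T,
% so every field decomposes into a sum over Matsubara modes; bosons have a
% static zero mode, fermions do not:
\phi(\tau,\mathbf{x}) \;=\; T \sum_{n=-\infty}^{\infty} \phi_n(\mathbf{x})\, e^{i\omega_n \tau},
\qquad
\omega_n \;=\;
\begin{cases}
  2\pi n T & \text{(bosons)} \\[2pt]
  (2n+1)\pi T & \text{(fermions)}
\end{cases}
```

All modes with \(\omega_n \neq 0\), including every fermionic mode since fermions have no zero mode, acquire effective masses of order \(\pi T\) and can be integrated out, leaving a three-dimensional theory for the static bosonic zero modes. The ideal-gas reference point quoted above is the standard \(p_{\text{ideal}} = \frac{\pi^2}{90}\, g_* T^4\), with \(g_*\) the effective number of relativistic degrees of freedom.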
Abstract:
This article presents and evaluates Quantum Inspired models of Target Activation using Cued-Target Recall Memory Modelling over multiple sources of Free Association data. Two questions were evaluated: whether Quantum Inspired models of Target Activation provide a better framework than their classical psychological counterparts, and how robust these models are across different sources of Free Association data. In previous work, no formal model of cued-target recall existed, so Target Activation could not be assessed directly. In addition, the data source used was suspected of suffering from temporal and geographical bias. As a consequence, Target Activation was measured against cued-target recall data as an approximation of performance. Since then, a formal model of cued-target recall (PIER3) has been developed [10], and alternative sources of data have become available. This allowed us to model Target Activation in cued-target recall directly with human cued-target recall pairs and to use multiple sources of Free Association data. Featural characteristics known to be important to Target Activation were measured for each of the data sources to identify any major differences that might explain variations in performance across the models. Each of the activation models was used in the PIER3 memory model for each of the data sources and benchmarked against cued-target recall pairs provided by the University of South Florida (USF). Two methods were used to evaluate performance: the first measured the divergence between the sets of results using the Kullback-Leibler (KL) divergence, while the second used a previous statistical analysis of the errors [9]. Of the three sources of data, two were sourced from human subjects: the USF Free Association Norms and the University of Leuven (UL) Free Association Networks. The third was produced by a new method put forward by Galea and Bruza (2015), in which pseudo Free Association Networks (Corpus Based Association Networks - CANs) are built using co-occurrence statistics on a large text corpus. It was found that the Quantum Inspired models of Target Activation not only outperformed the classical psychological model but were also more robust across a variety of data sources.
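As an illustrative aside (not code from the article), the KL-divergence comparison described above reduces to something like the following Python sketch; the variable names and numbers are placeholders, not data from the study:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(P || Q) between two discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Normalise to proper distributions and clip to avoid log(0).
    p = np.clip(p / p.sum(), eps, None)
    q = np.clip(q / q.sum(), eps, None)
    return float(np.sum(p * np.log(p / q)))

# Hypothetical usage: compare model-predicted recall probabilities against
# a human cued-target recall benchmark over the same cue-target pairs.
observed_recall = [0.42, 0.18, 0.25, 0.15]   # placeholder human data
predicted_recall = [0.40, 0.20, 0.22, 0.18]  # placeholder model output
print(kl_divergence(observed_recall, predicted_recall))
```

A lower divergence would indicate a model whose predicted recall distribution sits closer to the human benchmark, which is the sense in which the models are compared above.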
Abstract:
REDEFINE is a reconfigurable SoC architecture that provides a unique platform for high-performance, low-power computing by exploiting the synergistic interaction between a coarse-grained dynamic dataflow model of computation (to expose abundant parallelism in applications) and runtime composition of efficient compute structures (on the reconfigurable computation resources). We propose and study throttling of execution in REDEFINE to maximize architecture efficiency. A feature-specific, fast hybrid (mixed-level) simulation framework is developed and implemented for early design-phase studies, making exploration of the huge design space practical. We perform performance modeling by selecting the important performance criteria and ranking the explored throttling schemes, and we investigate the effectiveness of the design space exploration using statistical hypothesis testing. We find throttling schemes that simultaneously give an appreciable overall performance gain (24.8%) in the architecture and a 37% resource-usage gain in the throttling unit.
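As an illustrative aside (not the authors' code), comparing explored throttling schemes with statistical hypothesis testing can be sketched in Python roughly as follows; scheme names and scores are placeholders:

```python
import numpy as np
from scipy import stats

# Hypothetical benchmark scores (e.g., normalised throughput) for two
# throttling schemes measured over the same set of workloads.
scheme_a = np.array([1.10, 1.22, 0.98, 1.30, 1.15, 1.05])
scheme_b = np.array([1.02, 1.18, 0.95, 1.20, 1.10, 1.01])

# Paired test: do the schemes differ significantly on the same workloads?
t_stat, p_value = stats.ttest_rel(scheme_a, scheme_b)
print(f"paired t-test: t = {t_stat:.3f}, p = {p_value:.3f}")

# Non-parametric alternative when normality of the differences is doubtful.
w_stat, w_p = stats.wilcoxon(scheme_a, scheme_b)
print(f"Wilcoxon signed-rank: W = {w_stat:.3f}, p = {w_p:.3f}")
```

A small p-value would support ranking one scheme above the other rather than attributing the observed gap to workload noise, which is the role hypothesis testing plays in the exploration described above.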
Abstract:
The aim of the study was to find out how the consumption of the population in Finland became a target of social interest and of statistical data production in the early 20th century, and what efforts have been made to influence consumption with social policy measures at different times. Questions concerning consumption are examined through the practices employed in the compilation of statistics on it. The interpretive framework of the study is Michel Foucault's perspective of modern liberal government. This mode of government is typified by the pursuit of efficiency and a search for equilibrium between economic government and a government of the processes of life. It shows aspirations towards both integration and individualisation, it is based on practices of freedom, and it implies knowledge-based ways of conceptualising reality. Statistical data are of specific significance in this context. The connection between the government of consumption and the compilation of statistics on it is studied through the theoretical, socio-political and statistical conceptualisation of consumption. The research material consisted of Finnish and international documentation on the compilation of statistics on consumption, publications of social programmes, and reports of studies on consumption. The analysis of the material focused especially on the problematisations related to consumption found in these documents and on changes in them over history. There have been both clearly observable changes and historical stratification and diversity in the rationalities and practices of consumption government during the 20th century. Consumption has been influenced by pluralistic government, based at different times and in varying ways on the logics of solidarity and markets. The difference between these is that in the former risks are prepared for collectively, while in the latter risks are individualised. Despite the differences, these logics share a certain kind of contractuality. Both are permeated by the household logic, which differs from them in being based on the normative and ethical demands imposed on the individual. There has been a clear interactive connection between statistical data and the government of consumption. Statistical practices have followed changes in the way consumption has been conceptualised in society, reflected in the statistical phenomena of interest, concepts, classifications and indicators. New ways of compiling statistics have in turn shaped perceptions of reality. Statistical data have also enabled a variety of rational calculations with which the consequences of the population's consumption habits have been evaluated at the level of the economy at large and of individuals.