952 results for Conformal Field Models in String Theory


Abstract:

In his paper on quasirandom groups, Gowers studies the question, posed by Babai and Sós, of whether there exists a constant $c>0$ such that every finite group $G$ has a product-free subset of size at least $c|G|$. He answers it in the negative by proving that, for every sufficiently large prime $p$, the group $PSL_2(\mathbb{F}_p)$ (whose order is denoted $n$) contains no product-free subset of size $cn^{8/9}$. We consider the problem for compact groups, and more particularly for the profinite groups $SL_k(\mathbb{Z}_p)$ and $Sp_{2k}(\mathbb{Z}_p)$. The first part of this thesis is devoted to obtaining exponential lower and upper bounds for the supremal measure of product-free sets. The proof first requires establishing a lower bound on the dimension of the nontrivial representations of the finite groups $SL_k(\mathbb{Z}/(p^n\mathbb{Z}))$ and $Sp_{2k}(\mathbb{Z}/(p^n\mathbb{Z}))$. Our theorem extends the work of Landazuri and Seitz, who consider the minimal degree of representations of Chevalley groups over finite fields, while offering a simpler proof than theirs. The second part of the thesis concerns algebraic number theory. A monogenic polynomial $f$ is a monic irreducible polynomial with integer coefficients that generates a monogenic number field. For a given prime $q$, we show, using the Chebotarev density theorem, that the density of primes $p$ for which $t^q - p$ is monogenic is at least $(q-1)/q$. We also prove that, for $q=3$, the density of primes $p$ for which $\mathbb{Q}(\sqrt[3]{p})$ is not monogenic is at least $1/9$.
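
To fix the central definition (this illustration is ours, not the thesis's): a subset $S$ of a group is product-free if there are no $a, b, c \in S$ with $ab = c$. The Python sketch below checks this in the cyclic group $\mathbb{Z}_{10}$ written additively, where "product-free" means sum-free; the Gowers-scale asymptotics are of course invisible at this size.

```python
from itertools import combinations

def is_product_free(elements, op):
    """True if no a, b in the set (a = b allowed) satisfy op(a, b) in the set."""
    s = set(elements)
    return all(op(a, b) not in s for a in s for b in s)

# Z_10 under addition mod 10:
n = 10
add = lambda a, b: (a + b) % n
print(is_product_free({4, 5, 6}, add))   # True: the classical "middle third"
print(is_product_free({1, 2, 3}, add))   # False: 1 + 2 = 3

# brute-force the largest sum-free subset of Z_10
largest = max((c for r in range(n + 1) for c in combinations(range(n), r)
               if is_product_free(c, add)), key=len)
print(len(largest), largest)
```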

Abstract:

This thesis consists mainly of three papers dealing with Markov additive processes, Lévy processes, and applications in finance and insurance. The first chapter is an introduction to Markov additive processes (MAPs) and a presentation of the ruin problem and of fundamental notions of mathematical finance. The second chapter is essentially the paper "Lévy Systems and the Time Value of Ruin for Markov Additive Processes", written in collaboration with Manuel Morales and published in the European Actuarial Journal. It studies the ruin problem for a Markov additive risk process. An identification of Lévy systems is obtained and used to derive an expression for the expected discounted penalty function when the MAP is a Lévy process with regime switching. This generalizes existing results in the literature for Lévy risk processes and for Markov additive risk processes with phase-type jumps. The third chapter contains the paper "On a Generalization of the Expected Discounted Penalty Function to Include Deficits at and Beyond Ruin", which has been submitted for publication. It presents an extension of the expected discounted penalty function for a subordinator risk process perturbed by a Brownian motion. This extension involves a series of expected discounted functions of the successive minima reached, through jumps of the risk process, after ruin. It has important applications in risk management and is used to determine the expected value of the discounted capital injection. Finally, the fourth chapter contains the paper "The Minimal Entropy Martingale Measure (MEMM) for a Markov-modulated Exponential Lévy Model", written in collaboration with Romuald Hervé Momeya and published in Asia-Pacific Financial Markets. It presents new results on the incompleteness problem in a financial market where the price process of the risky asset is described by an exponential Markov additive model; these results characterize the martingale measure satisfying the entropy criterion. This measure is then used to compute option prices and hedging portfolios in a regime-switching exponential Lévy model.
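
For readers unfamiliar with the ruin problem, here is a minimal Monte Carlo sketch for the classical Cramér–Lundberg risk process, a much simpler baseline than the Markov additive processes treated in the thesis; all parameter values are hypothetical.

```python
import random

def ruin_probability(u=10.0, c=1.5, lam=1.0, claim_mean=1.0,
                     horizon=100.0, n_paths=20_000, seed=7):
    """Estimate P(ruin before `horizon`) for the classical risk process
    R(t) = u + c*t - S(t), where claims arrive as a Poisson(lam) process
    and claim sizes are exponential with mean `claim_mean`."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, reserve = 0.0, u
        while True:
            w = rng.expovariate(lam)                      # time to next claim
            if t + w > horizon:
                break                                     # survived the horizon
            t += w
            reserve += c * w                              # premiums earned
            reserve -= rng.expovariate(1.0 / claim_mean)  # claim paid
            if reserve < 0:
                ruined += 1
                break
    return ruined / n_paths

print(f"estimated ruin probability: {ruin_probability():.4f}")
```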

Abstract:

This thesis, entitled Analysis of Some Stochastic Models in Inventories and Queues, is devoted to the study of stochastic models in inventories and queues that are physically realizable, though complex, and contains a detailed analysis of the basic stochastic processes underlying these models. It studies (s,S) inventory systems with non-identically distributed interarrival demand times and random lead times, state-dependent demands, varying ordering levels, and perishable commodities with exponential lifetimes. Also analysed are the $E_k/G^{a,b}/1$ queueing system with server vacations, service systems with single and batch services, queueing systems with phase-type arrival and service processes, and the finite-capacity M/G/1 queue in which the server goes on vacation after serving a random number of customers. The analogy between queueing systems and inventory systems can be exploited in solving certain models. In vacation models, one important result is the stochastic decomposition property of the system size or waiting time, which one can think of extending to the transient case. In inventory theory, the present study could be extended to multi-item, multi-echelon problems, and the perishable inventory problem in which the commodities have a general lifetime distribution would be quite an interesting problem.
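
As a concrete illustration of the continuous-review (s,S) policy the thesis studies, here is a generic simulation sketch, not one of the thesis's models: unit Poisson demands, exponential lead times, lost sales, and illustrative parameters throughout.

```python
import random

def simulate_sS(s=5, S=20, horizon=10_000.0, demand_rate=1.0, lead_rate=0.5, seed=42):
    """Continuous-review (s,S) inventory with unit Poisson demands,
    exponential replenishment lead times, and lost sales.
    Returns the time-average on-hand inventory level."""
    rng = random.Random(seed)
    t, level, area = 0.0, S, 0.0
    order = None                        # (arrival_time, quantity) of outstanding order
    next_demand = rng.expovariate(demand_rate)
    while True:
        t_next = next_demand if order is None else min(next_demand, order[0])
        if t_next >= horizon:
            area += level * (horizon - t)
            break
        area += level * (t_next - t)
        t = t_next
        if order is not None and order[0] <= next_demand:
            level += order[1]           # replenishment arrives
            order = None
        else:
            level = max(level - 1, 0)   # a demand occurs; unmet demand is lost
            next_demand = t + rng.expovariate(demand_rate)
            if level <= s and order is None:
                # place an order that raises the level back to S on arrival
                order = (t + rng.expovariate(lead_rate), S - level)
    return area / horizon

print(f"time-average on-hand inventory: {simulate_sS():.2f}")
```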

Abstract:

The reliability of an equipment or device usually means the probability that it carries out its expected functions adequately, without failure and within specified performance limits, at a given age and for a desired mission time, under the designated application and operating environmental stress. Approaches to reliability studies can be broadly classified as probabilistic and deterministic: the former devises tools and methods to identify, within a proper statistical framework, the random mechanism governing the failure process, while the latter addresses the causes of failure and the steps needed to reduce individual failures, thereby enhancing reliability. In the probabilistic approach, to which the present study subscribes, the concept of a life distribution, a mathematical idealisation that describes the failure times, is fundamental, and a basic question the reliability analyst has to settle is the form of the life distribution. It is for this reason that a major share of the literature on the mathematical theory of reliability is focussed on methods of arriving at reasonable models of failure times and on exhibiting the failure patterns that induce such models. The methodology of lifetime distributions is not confined to assessing the endurance of equipment and systems; it ranges over a wide variety of scientific investigations in which "lifetime" may not refer to the length of life in the literal sense but can be conceived, in its most general form, as a non-negative random variable. Thus the tools developed in connection with modelling lifetime data have found applications in other areas of research such as actuarial science, engineering, the biomedical sciences, economics, and extreme value theory.
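
To illustrate what fixing "the form of the life distribution" gives the analyst (a minimal illustration of ours, not drawn from the study), here are the reliability function and hazard rate of a Weibull lifetime, a standard parametric model whose shape parameter encodes the failure pattern.

```python
import math

def weibull_reliability(t, shape, scale):
    """R(t) = P(T > t) = exp(-(t/scale)**shape) for a Weibull lifetime T."""
    return math.exp(-((t / scale) ** shape))

def weibull_hazard(t, shape, scale):
    """h(t) = (shape/scale) * (t/scale)**(shape - 1): increasing for
    shape > 1 (wear-out), decreasing for shape < 1 (early failures),
    constant for shape = 1 (the memoryless exponential case)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

for t in (0.5, 1.0, 2.0):
    print(f"t={t}: R={weibull_reliability(t, 2.0, 1.0):.4f} "
          f"h={weibull_hazard(t, 2.0, 1.0):.4f}")
```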

Abstract:

We investigate thin films of cylinder-forming diblock copolymer confined between electrically charged parallel plates, using self-consistent-field theory (SCFT) combined with an exact treatment for linear dielectric materials. Our study focuses on the competition between the surface interactions, which tend to orient cylinder domains parallel to the plates, and the electric field, which favors a perpendicular orientation. The effect of the electric field on the relative stability of the competing morphologies is demonstrated with equilibrium phase diagrams, calculated with the aid of a weak-field approximation. As hoped, modest electric fields are shown to have a significant stabilizing effect on perpendicular cylinders, particularly for thicker films. Our improved SCFT-based treatment removes most of the approximations implemented by previous approaches, thereby managing to resolve outstanding qualitative inconsistencies among different approximation schemes.

Abstract:

Formal and analytical models that contractors can use to assess and price project risk at the tender stage have proliferated in recent years. However, they are rarely used in practice, so introducing more models would not necessarily help. A better understanding is needed of how contractors arrive at a bid price in practice, and of how, and in what circumstances, risk apportionment actually influences pricing levels. More than 60 contractor risk models published in journals were examined and classified. Exploratory interviews with five UK contractors and documentary analyses of how contractors price work generally, and risk specifically, were then carried out to compare the propositions from the literature with what contractors actually do. No comprehensive literature on the real bidding processes used in practice was found, and there is no evidence that pricing is systematic; hence, systematic risk and pricing models for contractors may have no justifiable basis. Contractors process their bids through certain tendering gateways, and they acknowledge the risk that they should price; the final settlement, however, depends on a set of complex micro-economic factors, so the risk actually accounted for in the price may be smaller than its true cost to the contractor. Risk apportionment occurs at three stages of the whole bid-pricing process, but analytical approaches tend not to incorporate this, although they could.

Abstract:

As part of the DAPPLE programme, two large-scale urban tracer experiments using multiple simultaneous releases of cyclic perfluoroalkanes from fixed-location point sources were performed. The receptor concentrations, along with the relevant meteorological parameters measured, are compared with three screening dispersion models in order to assess how well each predicts the decay of pollutant concentration with distance from the source. It is shown that the simple dispersion models tested can provide a reasonable upper-bound estimate of the maximum concentrations measured, with an empirical model derived from field observations and wind-tunnel studies providing the best estimate. An indoor receptor was also used to assess indoor concentrations and their pertinence to commonly used evacuation procedures.
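
As a rough indication of what a screening dispersion model looks like, the sketch below uses a generic power-law decay of the kind employed in near-field urban screening; it is not one of the three models evaluated in the paper, and the constants A and n are placeholders.

```python
def screening_concentration(x, Q, U, H, A=10.0, n=2.0):
    """Generic near-field urban screening estimate:
    C(x) = A * (Q / (U * H**2)) * (x / H)**(-n),
    where Q is the release rate, U the wind speed, H a building-height
    scale, and A, n empirical constants (placeholder values here)."""
    return A * Q / (U * H ** 2) * (x / H) ** (-n)

# hypothetical release: 1 g/s source, 5 m/s wind, 10 m buildings
for x in (50, 100, 200, 400):        # receptor distance (m)
    print(f"x = {x:3d} m  ->  C ~ {screening_concentration(x, 1.0, 5.0, 10.0):.2e} g/m^3")
```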

Abstract:

We review and structure some of the mathematical and statistical models that have been developed over the past half century to grapple with theoretical and experimental questions about the stochastic development of aging over the life course. We suggest that the mathematical models are in large part addressing the problem of partitioning the randomness in aging: How does aging vary between individuals, and within an individual over the life course? How much of the variation is inherently related to some qualities of the individual, and how much is entirely random? How much of the randomness is cumulative, and how much is merely short-term flutter? We propose that recent lines of statistical inquiry in survival analysis could usefully grapple with these questions, all the more so if they were more explicitly linked to the relevant mathematical and biological models of aging. To this end, we describe points of contact among the various lines of mathematical and statistical research. We suggest some directions for future work, including the exploration of information-theoretic measures for evaluating components of stochastic models as the basis for analyzing experiments and anchoring theoretical discussions of aging.
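
As a toy version of "partitioning the randomness" (our illustration, not one of the reviewed models), the sketch below simulates lifetimes under a gamma-frailty model and splits the variance into a between-individual component and pure chance via the law of total variance; all parameters are hypothetical.

```python
import random
import statistics

BASE_RATE = 0.05       # baseline mortality hazard (hypothetical)
FRAILTY_SHAPE = 4.0    # Gamma shape k; frailty Z ~ Gamma(k, 1/k) has mean 1

def simulate(n=100_000, seed=1):
    """Toy frailty model: individual i draws a fixed frailty Z_i once,
    then dies at an exponential time with hazard BASE_RATE * Z_i."""
    rng = random.Random(seed)
    frailties = [rng.gammavariate(FRAILTY_SHAPE, 1.0 / FRAILTY_SHAPE) for _ in range(n)]
    lifetimes = [rng.expovariate(BASE_RATE * z) for z in frailties]
    return lifetimes, frailties

lifetimes, frailties = simulate()
# Law of total variance: Var(T) = Var(E[T|Z]) + E[Var(T|Z)];
# here E[T|Z] = 1/(BASE_RATE*Z) and Var(T|Z) = E[T|Z]**2.
cond_means = [1.0 / (BASE_RATE * z) for z in frailties]
between = statistics.variance(cond_means)             # differs across individuals
within = statistics.mean(m * m for m in cond_means)   # chance, given the individual
print(f"total Var {statistics.variance(lifetimes):.0f} "
      f"= between {between:.0f} + within {within:.0f} (approximately)")
```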

Abstract:

Progress in functional neuroimaging of the brain increasingly relies on the integration of data from complementary imaging modalities in order to improve spatiotemporal resolution and interpretability. However, the usefulness of merely statistical combinations is limited, since neural signal sources differ between modalities and are related non-trivially. We demonstrate here that a mean field model of brain activity can simultaneously predict EEG and fMRI BOLD with proper signal generation and expression. Simulations are shown using a realistic head model based on structural MRI, which includes both dense short-range background connectivity and long-range specific connectivity between brain regions. The distribution of modeled neural masses is comparable to the spatial resolution of fMRI BOLD, and the temporal resolution of the modeled dynamics, importantly including activity conduction, matches the fastest known EEG phenomena. The creation of a cortical mean field model with anatomically sound geometry, extensive connectivity, and proper signal expression is an important first step towards the model-based integration of multimodal neuroimages.
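
On a far smaller scale than the anatomically detailed model described, the following sketch conveys the general idea of predicting both signal types from one neural model: a two-population neural mass supplies fast "EEG-like" activity, and a sluggish low-pass filter of that activity stands in for the haemodynamic BOLD response. All constants are illustrative, not taken from the study.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def simulate(T=10.0, dt=1e-3):
    """Two-population (excitatory/inhibitory) neural mass; the excitatory
    trace e(t) is a crude stand-in for the fast activity seen by EEG."""
    e, i, trace = 0.1, 0.1, []
    for _ in range(int(T / dt)):
        de = (-e + sigmoid(12 * e - 10 * i + 2.0)) / 0.01   # 10 ms time constant
        di = (-i + sigmoid(10 * e - 2 * i)) / 0.02          # 20 ms time constant
        e, i = e + dt * de, i + dt * di
        trace.append(e)
    return trace

def bold_like(neural, dt=1e-3, tau=2.0):
    """Crude BOLD proxy: leaky integration (low-pass) of neural activity,
    mimicking the slow haemodynamic response."""
    b, out = 0.0, []
    for x in neural:
        b += dt * (x - b) / tau
        out.append(b)
    return out

eeg = simulate()
bold = bold_like(eeg)
print(f"EEG-like range: {min(eeg):.3f}..{max(eeg):.3f}; BOLD-like final: {bold[-1]:.3f}")
```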

Abstract:

Second language acquisition researchers often face particular challenges when attempting to generalize study findings to the wider learner population. For example, language learners constitute a heterogeneous group, and it is not always clear how a study's findings may generalize to other individuals who differ in language background and proficiency, among many other factors. In this paper, we provide an overview of how mixed-effects models can be used to help overcome these and other issues in the field of second language acquisition. We outline the benefits of mixed-effects models and give a practical example of how mixed-effects analyses can be conducted. Mixed-effects models provide second language researchers with a powerful statistical tool for the analysis of a variety of different types of data.
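
The paper walks through its own practical example; as a generic stand-in (not the authors' analysis), this sketch fits crossed random intercepts for learners and items on simulated reaction-time data using Python's statsmodels. All variable names and values are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_item = 30, 20
subj_eff = rng.normal(0, 50, n_subj)   # by-learner intercept shifts (ms)
item_eff = rng.normal(0, 30, n_item)   # by-item intercept shifts (ms)

rows = []
for s in range(n_subj):
    prof = rng.uniform()               # learner proficiency, 0..1
    for i in range(n_item):
        rt = 800 - 150 * prof + subj_eff[s] + item_eff[i] + rng.normal(0, 60)
        rows.append({"subject": s, "item": i, "proficiency": prof, "rt": rt})
df = pd.DataFrame(rows)
df["one"] = 1   # single group, so both random effects can be crossed

# crossed random intercepts for subject and item, fixed effect of proficiency
model = smf.mixedlm("rt ~ proficiency", df, groups="one",
                    vc_formula={"subject": "0 + C(subject)",
                                "item": "0 + C(item)"})
print(model.fit().summary())
```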

Abstract:

The issue of information sharing and exchange is one of the most important issues in artificial intelligence and knowledge-based systems (KBSs), and indeed in the broader areas of computer and information technology. This paper deals with a special case of this issue through a case study of information sharing between two well-known heterogeneous uncertain reasoning models: the certainty factor model and the subjective Bayesian method. More precisely, the paper discovers a family of exactly isomorphic transformations between these two models. Interestingly, different transformation functions in this family can accommodate different degrees to which a domain expert is positive or negative when performing such a transformation task. The direct motivation for the investigation is practical: in the past, expert systems relied mainly on these two models to deal with uncertainty, so many stand-alone expert systems using them are available. Given a reasonable transformation mechanism between the two models, the Internet can be used to couple such pre-existing expert systems so that the integrated systems can exchange and share useful information, thereby improving their performance through cooperation. The issue of transformation between heterogeneous uncertain reasoning models is also significant for multi-agent systems, because different agents in a multi-agent system may employ different expert systems, with heterogeneous uncertain reasoning, for their action selection, and information sharing and exchange between agents is unavoidable. In addition, we clarify the relationship between the certainty factor model and probability theory.
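
For orientation, here are the textbook formulations of the two models, not the specific family of isomorphic transformations the paper constructs: a certainty factor can be defined from prior and posterior probabilities (MYCIN-style), while the subjective Bayesian method (PROSPECTOR-style) updates odds with a likelihood ratio.

```python
def cf_from_probs(prior, posterior):
    """Classical certainty-factor definition: positive evidence scales the
    remaining disbelief, negative evidence scales the existing belief."""
    if posterior >= prior:
        return (posterior - prior) / (1.0 - prior) if prior < 1.0 else 0.0
    return (posterior - prior) / prior

def posterior_from_lambda(prior, lam):
    """Subjective Bayesian update via the likelihood ratio
    lam = P(e|h) / P(e|not h), applied in odds form."""
    odds = prior / (1.0 - prior)
    post_odds = lam * odds
    return post_odds / (1.0 + post_odds)

prior = 0.3
post = posterior_from_lambda(prior, lam=4.0)   # strongly confirming evidence
print(f"posterior = {post:.3f}, CF = {cf_from_probs(prior, post):.3f}")
```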

Abstract:

The attainment of high grades on the Victorian Certificate of Education (VCE) is critical to the future study and employment prospects of many Australian adolescents, so it is important to understand the factors that contribute to performance in the VCE. The aims of this study were twofold: the main aim was to test competing models of academic performance, subsuming a range of situational and dispositional variables based on (a) self-efficacy theory, (b) target and purpose goals, (c) cognitive skills and self-regulatory strategies, and (d) positive psychology. These models were each tested in terms of English performance and mathematics performance, as these units contribute proportionally the most to overall VCE scores. In order to study whether pressures peculiar to the VCE impact on performance, the competing models were tested in a sample of Victorian students prior to the VCE (year 10) and then during the VCE (year 11). A preliminary study was conducted in order to develop and test four scales required for use in the major study, using an independent sample of 302 year nine students; the results indicated that these new scales were psychometrically reliable and valid. Three hundred and seven Australian students participated in the year 10 and 11 study. These students were successively asked to provide their final years 9, 10 and 11 English and mathematics grades at times one, three and five, and to complete a series of questionnaires at times two and four. Results of the year 10 study indicated that models based on self-efficacy theory were the best predictors of both English and mathematics performance, with high past grades, high self-efficacy and low anxiety contributing most to performance. While the year 10 self-efficacy models, target goal models, positive psychology models, self-regulatory models and cognitive-skill-based models were each robust in the sample in year 11, a substantial increase in explained variance was observed from year 10 to year 11 in the purpose goal models. Students' mastery goals and their performance-approach goals became substantially more predictive in the VCE than they were prior to the VCE, which suggests that these students responded in very instrumental ways to the pressures, and importance, of their VCE. An integrated model based on a combination of the variables from the competing models was also tested in the VCE. These integrated models were comparable, both in English and mathematics, to the self-efficacy models, but explained less variance than the purpose goal models; in terms of parsimony, therefore, the integrated models were not preferred. The implications of these results for teaching practices and school counseling practices are discussed. It is recommended that students be encouraged to maintain a positive outlook in relation to their schoolwork and to set their VCE goals in terms of a combination of self-referenced (mastery) and other-referenced (performance-approach) goals.

Abstract:

This paper reports on a study into pre-service teachers’ perceptions about their professional development during practicum. The study examined to what extent, and how effectively, one group of pre-service teachers was able to integrate theory and practice during a three-week practicum in the first year of their degree. Data for this mixed methods study were drawn from one cohort of first-year students undertaking the Master of Teaching (MTeach), a graduate-level entry program in the Faculty of Education at an urban Australian university. Although there is a strong field of literature around the practicum in pre-service teacher education, there has been a limited focus on how pre-service teachers themselves perceive their development during this learning period. Further, despite widespread and longstanding acknowledgement of the “gap” between theory and practice in teacher education, there is still more to learn about how well the practicum enables an integration of these two dimensions of teacher preparation. In presenting three major findings of the study, this paper goes some way in addressing these shortcomings in the literature. First, opportunities to integrate theory and practice were varied, with many participants reporting supervision and scheduling issues as impacting on their capacity to effectively enact theory in practice. Second, participants’ privileging of theory over practice, identified previously in the literature as commonly characteristic of the pre-service teacher, was found in this study to be particularly prevalent during practicum. Third, participants overwhelmingly supported the notion of linking university coursework assessment to the practicum as a means of bridging the gap between, on the one hand, the university and the school and, on the other hand, theory and practice. The discussion and consideration of findings such as those reported in this paper are pertinent and timely, given the ratification of both the National Professional Standards for Teachers and the Initial Teacher Education Program Standards by the Australian Federal Government earlier this year. Within a number of the seven Professional Standards, graduate teachers are required to demonstrate knowledge and skills associated with both the theory and practice of teaching and with their effective integration in the classroom. To be nationally accredited, pre-service teacher education programs must provide evidence of enabling pre-service teachers to acquire such knowledge and skills.