7 results for Minimal Supersymmetric Standard Model (MSSM)
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
The properties and cosmological importance of a class of non-topological solitons, Q-balls, are studied. Aspects of Q-ball solutions and Q-ball cosmology discussed in the literature are reviewed. Q-balls are considered in particular in the Minimal Supersymmetric Standard Model with supersymmetry broken by a hidden-sector mechanism mediated by either gravity or gauge interactions. Q-ball profiles, charge-energy relations and evaporation rates for realistic Q-ball profiles are calculated for general polynomial potentials and for the gravity-mediated scenario. In all cases, the evaporation rates are found to increase with decreasing charge. Q-ball collisions are studied numerically in the two supersymmetry-breaking scenarios. It is noted that the collision processes can be divided into three types: fusion, charge transfer and elastic scattering. Cross-sections are calculated for the different types of processes in the different scenarios. The formation of Q-balls from the fragmentation of the Affleck-Dine condensate is studied by numerical and analytical means. The charge distribution is found to depend strongly on the initial energy-to-charge ratio of the condensate. The final state is typically noted to consist of Q-balls and anti-Q-balls in a state of maximum entropy. By studying the relaxation of excited Q-balls, the rate at which excess energy can be emitted is calculated in the gravity-mediated scenario. The Q-ball is also found to withstand excess energy well without significant charge loss. The possible cosmological consequences of these Q-ball properties are discussed.
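For orientation, the generic Q-ball construction can be sketched as follows (standard results from the Q-ball literature, not the thesis's specific MSSM potentials; the normalization of the charge depends on convention):

```latex
% Q-ball ansatz for a complex scalar field \phi with potential U(|\phi|)
% carrying a global U(1) charge:
\phi(\mathbf{x}, t) = e^{i\omega t}\,\phi(r),
\qquad
Q \propto \omega \int \mathrm{d}^3x \,\phi(r)^2 ,
% and the radial profile solves (primes denote d/dr)
\phi'' + \frac{2}{r}\,\phi'
  = \frac{\partial U}{\partial \phi} - \omega^2 \phi ,
% with boundary conditions \phi'(0) = 0 and \phi(r) \to 0 as r \to \infty.
```

Solving the profile equation for a given potential yields the profiles and charge-energy relations referred to in the abstract.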
Abstract:
The Standard Model of particle physics is currently the best description of fundamental particles and their interactions. All of its particles save the Higgs boson have been observed in particle accelerator experiments over the years. Despite the predictive power of the Standard Model, there are many phenomena that it does not predict or explain. Among the most prominent dilemmas is the matter-antimatter asymmetry, and much effort has been made in formulating scenarios that accurately predict the correct amount of matter-antimatter asymmetry in the universe. One of the most appealing explanations is baryogenesis via leptogenesis, which not only serves as a mechanism for producing an excess of matter over antimatter but can also explain why neutrinos have very small non-zero masses. Interesting leptogenesis scenarios arise when other candidate theories beyond the Standard Model are brought into the picture. In this thesis, we have studied leptogenesis in an extra-dimensional framework and in a modified version of the supersymmetric Standard Model. The first chapters of this thesis introduce the standard cosmological model, observations of the photon-to-baryon ratio, and the necessary preconditions for successful baryogenesis. Baryogenesis via leptogenesis is then introduced and its connection to neutrino physics is illuminated. The final chapters concentrate on extra-dimensional theories and supersymmetric models and their ability to accommodate leptogenesis. There, the results of our research are also presented.
Abstract:
With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field; digital filters are typically described with boxes and arrows also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle, any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of in a conventional programming language makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
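The firing discipline described above can be sketched in a few lines (a minimal illustration with hypothetical names, not RVC-CAL syntax): a node may fire only when every incoming queue holds a token, and firing consumes inputs and produces outputs independently of all other nodes.

```python
from collections import deque


class Node:
    """A dataflow node: fires when every input queue holds a token."""

    def __init__(self, func, n_inputs):
        self.func = func
        # Edges are FIFO queues, the only communication channel between nodes.
        self.inputs = [deque() for _ in range(n_inputs)]
        self.output = deque()

    def can_fire(self):
        # Ready once sufficient inputs are available on all incoming edges.
        return all(q for q in self.inputs)

    def fire(self):
        # Consume one token per input, compute, produce one output token.
        tokens = [q.popleft() for q in self.inputs]
        self.output.append(self.func(*tokens))


# Example: an adder node with two input edges.
add = Node(lambda a, b: a + b, n_inputs=2)
add.inputs[0].append(3)
add.inputs[1].append(4)
if add.can_fire():
    add.fire()
print(add.output[0])  # 7
```

Because the queues make every dependency explicit, any two nodes whose firing rules are simultaneously satisfied may run in parallel without further analysis.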
The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independence of the nodes also implies that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic models where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run time, while most decisions are pre-calculated. The result is then an as-small-as-possible set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications.
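The quasi-static idea can be illustrated with a toy sketch (hypothetical actor and mode names, not the thesis's actual compiler infrastructure): the pre-calculated result is, in effect, a table of static firing sequences, and the only decision left at run time is which sequence to execute, replacing per-firing rule evaluation with a single branch.

```python
# Hypothetical quasi-static scheduler: most scheduling decisions are
# pre-computed as fixed firing sequences; only the mode test is dynamic.
STATIC_SCHEDULES = {
    # mode -> static sequence of actor firings, found ahead of time
    "intra": ["parse", "predict", "transform", "write"],
    "inter": ["parse", "motion", "predict", "transform", "write"],
}


def run_quasi_static(mode, fire):
    """One dynamic decision (the mode), then a static schedule with no
    firing-rule checks inside the loop."""
    for actor in STATIC_SCHEDULES[mode]:
        fire(actor)


trace = []
run_quasi_static("intra", trace.append)
print(trace)  # ['parse', 'predict', 'transform', 'write']
```

In this picture, the model checker's job is to discover which modes exist and which static sequence is valid for each mode.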
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
Abstract:
This Master's thesis examined the SCOR model (Supply Chain Operations Reference model), a tool used in supply chain management. SCOR is a standard model from which each organization should pick the elements relevant to developing and improving its own operations; it is a good instrument for managing growth. The theoretical part of the thesis covers logistics, supply chain management, purchasing, and the SCOR model. The applied part uses the SCOR model to examine the processes of a company operating in the indoor air quality sector, focusing on the company's purchasing and manufacturing processes. Applying the SCOR model to the analysis of purchasing and production brought up several development ideas: introducing new methods to improve the manageability of inventory and production, clarifying the purchasing organization, and developing the enterprise resource planning system.
Abstract:
The work originated from a need of the Assistive Device Centre of the South Karelia Social and Health Care District (Eksote). The establishment of Eksote, the assistive device centre and the South Karelia procurement services, together with other organizational changes, altered the assistive device procurement process so much that a clarifying description of its course was needed. Under the Act on Public Procurement, procurements subject to competitive tendering were transferred to the procurement services, and a request-for-tender template was needed for them so that tenders could be compared. The theoretical part covers the procurement process, the special characteristics of public procurement, and the Act on Public Procurement. In addition, the assistive device service process and recommendations on procuring assistive devices are discussed. The organization was studied through personal visits as well as interviews and e-mail questionnaires. Using this information, process descriptions were created both for the assistive device service process and for the procurement of assistive devices under seasonal contracts; both descriptions include explanatory supplements. As a basis for developing the request-for-tender template, 10 regional assistive device centres were asked by e-mail questionnaire about the requests for tenders they use. Four responses were received. The information was summarized and handed over to the assistive device centre's procurement working group.
Abstract:
This case study explored value proposition and relationship marketing determinants in the HVAC (Heating, Ventilation and Air Conditioning) industry. Concretely, the case involved Purmo, a prominent brand and market-leading radiator manufacturer, its relationship marketing practices with the retailers of its product (radiator installers), and the value proposition used to reach the end user. In the field work, five heating experts/entrepreneurs in the installation business were interviewed and asked about their opinions on Purmo and the end user's needs. The findings suggest that while installers appreciate Purmo as a supplier and respect it as a company, the loyalty they have towards it has no repercussions on their product advocacy to ultimate consumers. Installers proved to be attracted to standard-model radiators and to be apathetic to the benefits that more advanced models can provide. The reasons for this behavior were found to be their preference for products with better availability and their reluctance to interfere with the customers' decision-making processes.
Abstract:
Simplifying the Einstein field equation by assuming the cosmological principle yields a set of differential equations which governs the dynamics of the universe as described in the cosmological standard model. The cosmological principle assumes that space appears the same everywhere and in every direction; moreover, the principle has earned its position as a fundamental assumption in cosmology by being compatible with the observations of the 20th century. It was not until the current century that observations on cosmological scales showed significant deviations from isotropy and homogeneity, implying a violation of the principle. Among these observations are the inconsistency between local and non-local Hubble parameter evaluations, the baryon acoustic features of the Lyman-α forest, and the anomalies of the cosmic microwave background radiation. As a consequence, cosmological models beyond the cosmological principle have been studied extensively; after all, the principle is a hypothesis and as such should be tested as frequently as any other assumption in physics. In this thesis, the effects of inhomogeneity and anisotropy, arising as a consequence of discarding the cosmological principle, are investigated. The geometry and matter content of the universe become more cumbersome, and the resulting effects on the Einstein field equation are introduced. The cosmological standard model and its issues, both fundamental and observational, are presented. Particular interest is given to the local Hubble parameter, supernova explosion, baryon acoustic oscillation, and cosmic microwave background observations, and to the cosmological constant problems. Explored and proposed resolutions emerging from violating the cosmological principle are reviewed. The thesis is concluded by a summary and outlook of the included research papers.
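For reference, under the cosmological principle the metric takes the FLRW form, and the Einstein field equation reduces to the Friedmann equations (standard textbook results, with scale factor a(t), energy density ρ, pressure p, curvature parameter k and cosmological constant Λ):

```latex
H^2 \equiv \left(\frac{\dot{a}}{a}\right)^2
  = \frac{8\pi G}{3}\,\rho - \frac{k}{a^2} + \frac{\Lambda}{3},
\qquad
\frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\,(\rho + 3p) + \frac{\Lambda}{3}.
```

These are the differential equations referred to in the opening sentence; discarding the cosmological principle replaces them with the considerably more cumbersome inhomogeneous or anisotropic field equations the abstract mentions.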