Abstract:
The main aim of this study was to find out how immigrants understand and define successful co-operation and professional partnership in early childhood education. A further aim was to consider how the parents see professional partnership from their viewpoint, and how willing and ready they are to engage in a professional partnership with the day care personnel. The theoretical part of this research is based on theories of immigration and its different forms, theories of cultural variety, and the theory of modernizing co-operation through professional partnership. Guidelines and policies for day care and early childhood education also play a part in the theory section, which is written to support the research problems. The research method used in this study is the peer interview. The interviewees are both immigrants and customers of day care services. The data comprise material from peer interviews and personal background information. The interviewees belonged to Somali and Russian ethnic groups, and the interviews were carried out in each group in the participants' own mother tongue. The peer interviews showed that parents were interested in and willing to discuss professional partnership. From this research one can conclude that professional partnership is seen as a complex term that is difficult to understand. The results show that quite often the principles of professional partnership are not carried out in practice. According to the material gathered, the parents feel that the lack of a common language and prejudice against immigrants effectively prevent a professional partnership from being formed. Cultural differences can also become challenging in a professional partnership.
Based on this research, one can conclude that when different cultures meet, there has to be a mutual will to understand and to be understood in order to ensure that the children's development, both educational and physical, is supported in the best possible way.
Abstract:
Shear flows of inelastic spheres in three dimensions in the volume fraction range 0.4-0.64 are analysed using event-driven simulations. Particle interactions are considered to be due to instantaneous binary collisions, and the collision model has a normal coefficient of restitution e(n) (negative of the ratio of the post- and pre-collisional relative velocities of the particles along the line joining the centres) and a tangential coefficient of restitution e(t) (negative of the ratio of the post- and pre-collisional relative velocities perpendicular to the line joining the centres). Here, we have considered both e(t) = +1 and e(t) = e(n) (rough particles) and e(t) = -1 (smooth particles), and the normal coefficient of restitution e(n) was varied in the range 0.6-0.98. Care was taken to avoid inelastic collapse and to ensure there were no particle overlaps during the simulation. First, we studied the ordering in the system by examining the icosahedral order parameter Q(6) in three dimensions and the planar order parameter q(6) in the plane perpendicular to the gradient direction. It was found that for shear flows of sufficiently large size, the system continues to be in the random state, with Q(6) and q(6) close to 0, even for volume fractions between phi = 0.5 and phi = 0.6; in contrast, a system of elastic particles in the absence of shear orders (crystallizes) at phi = 0.49. This indicates that the shear flow prevents ordering in a system of sufficiently large size. In a shear flow of inelastic particles, the strain rate and the temperature are related through the energy balance equation, and all time scales can be non-dimensionalized by the inverse of the strain rate. Therefore, the dynamics of the system are determined only by the volume fraction and the coefficients of restitution. The variation of the collision frequency with volume fraction and coefficient of restitution was examined.
It was found, by plotting the inverse of the collision frequency as a function of volume fraction, that the collision frequency at constant strain rate diverges at a volume fraction phi(ad) (the volume fraction for arrested dynamics) which is lower than the random close-packing volume fraction of 0.64 in the absence of shear. The volume fraction phi(ad) decreases as the coefficient of restitution is decreased from e(n) = 1; phi(ad) has a minimum of about 0.585 for coefficients of restitution e(n) in the range 0.6-0.8 for rough particles and is slightly larger for smooth particles. It is found that the dissipation rate and all components of the stress diverge proportional to the collision frequency in the close-packing limit. The qualitative behaviour of the increase in the stress and dissipation rate is well captured by results derived from kinetic theory, but quantitative agreement is lacking even if the collision frequency obtained from simulations is used to calculate the pair correlation function used in the theory.
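The collision rule described above (post-collisional relative velocity scaled by -e(n) along, and by -e(t) perpendicular to, the line joining the centres) can be sketched for equal-mass spheres as follows. This is an illustrative sketch, not the authors' code: particle spin and the event-driven bookkeeping of the actual simulations are omitted.

```python
def collide(v1, v2, n, e_n, e_t):
    """Binary collision of two equal-mass spheres.

    v1, v2 are 3-component velocity lists; n is the unit vector along the
    line joining the centres (pointing from particle 2 to particle 1).
    Following the sign convention in the text, the post-collisional relative
    velocity is -e_n times the pre-collisional one along n and -e_t times it
    perpendicular to n.  Particle spin is neglected in this sketch.
    """
    g = [a - b for a, b in zip(v1, v2)]              # relative velocity
    gn = sum(a * b for a, b in zip(g, n))            # normal component
    g_n = [gn * c for c in n]                        # normal part of g
    g_t = [a - b for a, b in zip(g, g_n)]            # tangential part of g
    g_post = [-e_n * a - e_t * b for a, b in zip(g_n, g_t)]
    dv = [0.5 * (a - b) for a, b in zip(g_post, g)]  # equal-mass impulse split
    return ([a + d for a, d in zip(v1, dv)],
            [b - d for b, d in zip(v2, dv)])

# Head-on collision at unit speeds with e_n = 0.9, smooth particles (e_t = -1):
# the spheres rebound at speed 0.9, dissipating 19% of the kinetic energy.
print(collide([1.0, 0.0, 0.0], [-1.0, 0.0, 0.0], [1.0, 0.0, 0.0], 0.9, -1.0))
```

Note that e(t) = -1 leaves the tangential component of the relative velocity unchanged (smooth particles), while e(t) = +1 reverses it (perfectly rough particles), matching the definitions above.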
Abstract:
Since the first investigation 25 years ago, the application of genetic tools to address ecological and evolutionary questions in elasmobranch studies has greatly expanded. Major developments in genetic theory as well as in the availability, cost effectiveness and resolution of genetic markers were instrumental for particularly rapid progress over the last 10 years. Genetic studies of elasmobranchs are of direct importance and have application to fisheries management and conservation issues such as the definition of management units and identification of species from fins. In the future, increased application of the most recent and emerging technologies will enable accelerated genetic data production and the development of new markers at reduced costs, paving the way for a paradigm shift from gene to genome-scale research, and more focus on adaptive rather than just neutral variation. Current literature is reviewed in six fields of elasmobranch molecular genetics relevant to fisheries and conservation management (species identification, phylogeography, philopatry, genetic effective population size, molecular evolutionary rate and emerging methods). Where possible, examples from the Indo-Pacific region, which has been underrepresented in previous reviews, are emphasized within a global perspective. (C) 2012 The Authors Journal of Fish Biology (C) 2012 The Fisheries Society of the British Isles
Abstract:
Climate change is the single biggest environmental problem in the world at the moment. Although its effects are still not fully understood and there is a considerable amount of uncertainty, many nations have decided to mitigate the change. On the societal level, a planner who tries to find an economically optimal solution to an environmental pollution problem seeks to reduce pollution from the sources where reductions are most cost-effective. This study aims to find out how effective the instruments of agricultural policy are in the case of climate change mitigation in Finland. The theoretical base of this study is neoclassical economic theory, which rests on the assumption of a rational economic agent who maximizes his own utility. This base has been widened in the direction clearly essential to the matter: the theory of environmental economics. Deeply relevant to this problem, and central in the theory of environmental economics, are the concepts of externalities and public goods, along with the problems of global pollution and non-point-source pollution. Econometric modelling was the method applied in this study. The Finnish part of the AGMEMOD model, which covers the whole EU, was used to estimate the development of pollution. This model is a seemingly recursive, partially dynamic partial-equilibrium model constructed to predict the development of Finnish agricultural production of the most important products. For the study, I updated the model and widened its scope in some relevant matters. I also devised a table that calculates greenhouse gas emissions according to the rules set by the IPCC. With the model I investigated five alternative scenarios in comparison to the baseline scenario of the Agenda 2000 agricultural policy.
The alternative scenarios were: 1) the CAP reform of 2003, 2) free trade in agricultural commodities, 3) technological change, 4) banning the cultivation of organic soils, and 5) the combination of the last three scenarios as the maximal achievable reduction. The maximal achievement in alternative scenario 5 was one third of the level reached in the baseline scenario. The CAP reform caused only a minor reduction compared to the baseline scenario. In contrast, the free trade scenario and the scenario of technological change each caused a significant reduction on their own. The biggest single reduction was achieved by banning the cultivation of organic land. However, this was also the most questionable scenario to realize; the reasons for this are elaborated further in the paper. The maximal reduction that can be achieved in the Finnish agricultural sector is about 11% of the emission reduction needed to comply with the Kyoto protocol.
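The greenhouse-gas accounting described above follows the generic IPCC Tier 1 pattern: emissions are estimated as activity data times an emission factor, and non-CO2 gases are converted to CO2 equivalents via global warming potentials. A minimal sketch of that pattern, not the study's actual table: the activity level and emission factor below are invented for illustration, while the GWPs are the SAR 100-year values used for Kyoto accounting.

```python
# SAR 100-year global warming potentials used for Kyoto-period accounting.
GWP = {"CO2": 1, "CH4": 21, "N2O": 310}

def tier1_emission(activity, emission_factor):
    """Generic IPCC Tier 1 estimate: activity data times emission factor."""
    return activity * emission_factor

def co2_equivalent(emissions_t):
    """Convert a dict of per-gas emissions (tonnes) to tonnes of CO2e."""
    return sum(GWP[gas] * amount for gas, amount in emissions_t.items())

# Illustrative example (both numbers are made up): enteric CH4 from a herd of
# 300,000 animals at 0.12 t CH4 per head per year.
ch4 = tier1_emission(activity=300_000, emission_factor=0.12)
print(co2_equivalent({"CH4": ch4}))  # tonnes CO2e
```

The same two-step structure (gas-by-gas Tier 1 estimates, then GWP aggregation) is what allows scenario emissions to be compared on a single CO2-equivalent scale.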
Abstract:
One of the unanswered questions of modern cosmology is the issue of baryogenesis. Why does the universe contain a huge amount of baryons but no antibaryons? What kind of mechanism can produce such an asymmetry? One theory proposed to explain this problem is leptogenesis. In this theory, right-handed neutrinos with heavy Majorana masses are added to the standard model, which introduces explicit lepton number violation into the theory. Instead of producing the baryon asymmetry directly, these heavy neutrinos decay in the early universe. If these decays are CP-violating, they produce a net lepton number, which is then partially converted to baryon number by the electroweak sphaleron process. In this work we start by reviewing the current observational data on the amount of baryons in the universe. We also introduce Sakharov's conditions, the necessary criteria for any theory of baryogenesis. We review the current data on neutrino oscillations and explain why they require the existence of neutrino mass. We introduce the different kinds of mass terms that can be added for neutrinos, and explain how the see-saw mechanism naturally explains the observed neutrino mass scales, motivating the addition of the Majorana mass term. After introducing leptogenesis qualitatively, we derive the Boltzmann equations governing leptogenesis and give analytical approximations for them. Finally, we review the numerical solutions of these equations, demonstrating the capability of leptogenesis to explain the observed baryon asymmetry. In the appendix, simple Feynman rules are given for theories with interactions between both Dirac and Majorana fermions, and these are applied at tree level to calculate the parameters relevant to the theory.
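In their simplest widely used form, the Boltzmann equations referred to above track the comoving number density of the lightest heavy neutrino, N_{N_1}, and the B-L asymmetry, as functions of z = M_1/T. The decomposition below into decay, scattering and washout terms follows the standard leptogenesis literature and is not necessarily the thesis's exact notation:

```latex
\frac{dN_{N_1}}{dz} = -(D + S)\left(N_{N_1} - N_{N_1}^{\mathrm{eq}}\right),
\qquad
\frac{dN_{B-L}}{dz} = -\varepsilon_1 D \left(N_{N_1} - N_{N_1}^{\mathrm{eq}}\right) - W\, N_{B-L},
```

where D, S and W are the decay, scattering and washout rates and \varepsilon_1 is the CP asymmetry in N_1 decays. The first equation drives N_{N_1} towards equilibrium; the out-of-equilibrium part sources the asymmetry in the second, which sphalerons then partially convert into baryon number.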
Abstract:
There is limited understanding of how insect movement patterns are influenced by landscape features, and of how landscapes can be managed to suppress pest phytophage populations in crops. Theory suggests that the relative timing of pest and natural enemy arrival in crops may influence pest suppression, but there is a lack of data to substantiate this claim. We investigate the movement patterns of insects from native vegetation (NV) and discuss the implications of these patterns for pest control services. Using bi-directional interception traps, we quantified the number of insects crossing an NV/crop ecotone relative to a control crop/crop interface in two agricultural regions early in the growing season, and used these data to infer patterns of movement and net flux. At the community level, insect movement patterns were influenced by ecotone in two out of three year-by-region combinations. At the functional-group level, pests and parasitoids showed similar movement patterns from NV very soon after crop emergence; however, movement across the control interface increased towards the end of the early-season sampling period. Predators consistently moved more often from NV into crops than vice versa, even after crop emergence. Not all species showed a significant response to ecotone; however, when a response was detected, these species showed similar patterns between the two regions. Our results highlight the importance of NV for the recruitment of natural enemies for early-season crop immigration, which may be important for pest suppression. However, NV was also associated with crop immigration by some pest species. Hence, NV offers both opportunities and risks for pest management. The development of targeted NV management may reduce the risk of crop immigration by pests, but not by natural enemies.
Abstract:
The topic of this dissertation lies in the intersection of harmonic analysis and fractal geometry. We particularly consider singular integrals in Euclidean spaces with respect to general measures, and we study how the geometric structure of the measures affects certain analytic properties of the operators. The thesis consists of three research articles and an overview. In the first article we construct singular integral operators on lower-dimensional Sierpinski gaskets associated with homogeneous Calderón-Zygmund kernels. While these operators are bounded, their principal values fail to exist almost everywhere. Conformal iterated function systems generate a broad range of fractal sets. In the second article we prove that many of these limit sets are porous in a very strong sense, by showing that they contain holes spread in every direction. We then connect these results with singular integrals, exploiting the fractal structure of the limit sets to establish that singular integrals associated with very general kernels converge weakly. Boundedness questions constitute a central topic of investigation in the theory of singular integrals. In the third article we study singular integrals of different measures. We prove a very general boundedness result in the case where the two underlying measures are separated by a Lipschitz graph. As a consequence we show that a certain weak convergence holds for a large class of singular integrals.
Abstract:
The research in model theory has extended from the study of elementary classes to non-elementary classes, i.e. to classes which are not completely axiomatizable in elementary logic. The main theme has been the attempt to generalize tools from elementary stability theory to cover more applications arising in other branches of mathematics. In this doctoral thesis we introduce finitary abstract elementary classes, a non-elementary framework of model theory. These classes are a special case of abstract elementary classes (AECs), introduced by Saharon Shelah in the 1980s. We have collected a set of properties for classes of structures which enable us to develop a 'geometric' approach to stability theory, including an independence calculus, in a very general framework. The thesis studies AECs with amalgamation, joint embedding, arbitrarily large models, countable Löwenheim-Skolem number and finite character. The novel idea is the property of finite character, which enables the use of a notion of a weak type instead of the usual Galois type. Notions of simplicity, superstability, Lascar strong type, primary model and U-rank are introduced for finitary classes. A categoricity transfer result is proved for simple, tame finitary classes: categoricity in any uncountable cardinal transfers upwards and to all cardinals above the Hanf number. Unlike previous categoricity transfer results of equal generality, the theorem does not assume that the categoricity cardinal is a successor. The thesis consists of three independent papers, all of which are joint work with Tapani Hyttinen.
Abstract:
In this thesis we study a few games related to non-wellfounded and stationary sets. Games have turned out to be an important tool in mathematical logic, ranging from semantic games defining the truth of a sentence in a given logic to, for example, games on real numbers whose determinacy has important consequences for the consistency of certain large cardinal assumptions. The equality of non-wellfounded sets can be determined by a so-called bisimulation game, already used to identify processes in theoretical computer science and possible-world models for modal logic. Here we present a game to classify non-wellfounded sets according to their branching structure. Moving back to classical wellfounded set theory, we also study games on stationary sets. We also describe a way to approximate non-wellfounded sets with hereditarily finite wellfounded sets; the framework used to do this is domain theory. In the Banach-Mazur game, also called the ideal game, the players play a descending sequence of stationary sets and the second player tries to keep their intersection stationary. The game is connected to the precipitousness of the corresponding ideal. In the pressing-down game the first player plays regressive functions defined on stationary sets and the second player responds with a stationary set on which the function is constant, trying to keep the intersection stationary. This game has applications in model theory to the determinacy of the Ehrenfeucht-Fraïssé game. We show that it is consistent that these games are not equivalent.
Abstract:
A central tenet in the theory of reliability modelling is the quantification of the probability of asset failure. In general, reliability depends on asset age and the maintenance policy applied. Usually, failure and maintenance times are the primary inputs to reliability models. However, for many organisations, different aspects of these data are often recorded in different databases (e.g. work order notifications, event logs, condition monitoring data and process control data). These recorded data cannot be interpreted individually, since they typically do not contain all the information necessary to ascertain failure and preventive maintenance times. This paper presents a methodology for the extraction of failure and preventive maintenance times using commonly available, real-world data sources. A text-mining approach is employed to extract keywords indicative of the source of each maintenance event. Using these keywords, a naïve Bayes classifier is then applied to attribute each machine stoppage to one of two classes: failure or preventive maintenance. The accuracy of the algorithm is assessed and the classified failure time data are then presented. The applicability of the methodology is demonstrated on a maintenance data set from an Australian electricity company.
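The classification step described above can be sketched with a minimal multinomial naïve Bayes classifier over extracted keywords. The stoppage descriptions and keywords below are hypothetical stand-ins for the work-order text the paper mines; the real study's features and training data are not reproduced here.

```python
import math
from collections import Counter, defaultdict

def train(docs, labels):
    """Multinomial naive Bayes with Laplace smoothing, in log space."""
    words_by_class = defaultdict(list)
    for doc, label in zip(docs, labels):
        words_by_class[label].extend(doc.lower().split())
    vocab = {w for ws in words_by_class.values() for w in ws}
    prior = Counter(labels)
    model = {}
    for label, words in words_by_class.items():
        counts = Counter(words)
        denom = len(words) + len(vocab)       # Laplace-smoothed denominator
        model[label] = (
            math.log(prior[label] / len(labels)),
            {w: math.log((counts[w] + 1) / denom) for w in vocab},
            math.log(1 / denom),              # score for unseen words
        )
    return model

def classify(model, doc):
    """Assign a stoppage description to the most probable class."""
    def score(params):
        log_prior, log_like, unseen = params
        return log_prior + sum(log_like.get(w, unseen)
                               for w in doc.lower().split())
    return max(model, key=lambda label: score(model[label]))

# Hypothetical stoppage descriptions (the real study mines work orders).
docs = ["bearing seized unexpected trip", "motor winding burned breakdown",
        "pump tripped on fault alarm", "scheduled lubrication service",
        "planned inspection and overhaul", "routine filter replacement"]
labels = ["failure", "failure", "failure",
          "preventive", "preventive", "preventive"]
model = train(docs, labels)
print(classify(model, "unexpected bearing fault"))  # prints "failure"
```

Keywords such as "unexpected" or "scheduled" carry most of the signal, which is why keyword extraction before classification is effective for this two-class problem.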
Abstract:
E-government provides a platform for governments to implement web-enabled services that facilitate communication between citizens and the government. However, a technology-driven design approach and a limited understanding of citizens' requirements have led to a number of critical usability problems on government websites. Hitherto, there has been no systematic attempt to analyse the way in which the theory of User Centred Design (UCD) can contribute to addressing the usability issues of government websites. This research seeks to fill this gap by synthesising perspectives drawn from the study of User Centred Design and examining them against empirical data derived from a case study of the Scottish Executive website. The research employs a qualitative approach in the collection and analysis of data. The triangulated analysis of the findings reveals that e-government web designers take a commercial development approach and focus only on technical implementation, which leads to websites that do not meet citizens' expectations. The research identifies that e-government practitioners can overcome web usability issues by transferring the theory of UCD into practice.
Abstract:
The intention of this note is to motivate researchers to study Hadwiger's conjecture for circular arc graphs. Let η(G) denote the largest clique minor of a graph G, and let χ(G) denote its chromatic number. Hadwiger's conjecture states that η(G) ≥ χ(G) and is one of the most important and difficult open problems in graph theory. From the point of view of researchers who are sceptical of the validity of the conjecture, it is interesting to study the conjecture for graph classes where η(G) is guaranteed not to grow too fast with respect to χ(G), since such classes of graphs are indeed a reasonable place to look for possible counterexamples. We show that in any circular arc graph G, η(G) ≤ 2χ(G) − 1, and that there is a family attaining equality. So it makes sense to study Hadwiger's conjecture for this family.
Abstract:
- Objectives: To explore whether active learning principles can be applied to nursing bioscience assessments and whether this influences students' perceived confidence in applying theory to practice.
- Design and Data Sources: A review of the literature utilising searches of various databases including CINAHL, PUBMED, Google Scholar and Mosby's Journal Index.
- Methods: The literature search identified research from twenty-six original articles, two electronic books, one published book and one conference proceedings paper.
- Results: Bioscience has been identified as an area that nurses struggle to learn in tertiary institutions and then apply to clinical practice. A number of problems that may contribute to this poor understanding and retention have been identified and explored. University academics need to be knowledgeable about innovative teaching and assessment modalities that focus on enhancing student learning and address the integration issues associated with the theory-practice gap. Increased bioscience education is associated with improved patient outcomes; therefore, addressing this "bioscience problem" and improving the integration of bioscience into clinical practice should subsequently improve health care outcomes.
- Conclusion: Several themes were identified from the literature. First, there are many problems with teaching nursing students bioscience. These include class sizes, motivation, concentration, delivery mode, lecturer perspectives, students' previous knowledge, anxiety, and a lack of confidence. Among these influences, the type of assessment employed by the educator has not been explored or identified as a contributor to student learning specifically in nursing bioscience instruction. Second, education could be delivered more effectively if active learning principles were applied and the needs and expectations of the student were met.
Lastly, assessment influences student retention and the student experience; as such, assessment should be congruent with the subject content, align with the learning objectives, and be used as a stimulus for learning.
Abstract:
The electroweak theory is the part of the standard model of particle physics that describes the weak and electromagnetic interactions between elementary particles. Since its formulation almost 40 years ago, it has been experimentally verified to high accuracy, and today it has the status of one of the cornerstones of particle physics. The thermodynamics of electroweak physics has been studied ever since the theory was written down, and the features the theory exhibits under extreme conditions remain an interesting research topic even today. In this thesis, we consider some aspects of electroweak thermodynamics. Specifically, we compute the pressure of the standard model to high precision and study the structure of the electroweak phase diagram when finite chemical potentials are introduced for all the conserved particle numbers in the theory. In the first part of the thesis, the theory, methods and essential results of the computations are introduced. The original research publications are reprinted at the end.
Abstract:
This study offers a reconstruction and critical evaluation of globalization theory, a perspective that has been central to sociology and cultural studies in recent decades, from the viewpoint of media and communications. As the study shows, sociological and cultural globalization theorists rely heavily on arguments concerning media and communications, especially the so-called new information and communication technologies, in the construction of their frameworks. Together with deepening the understanding of globalization theory, the study provides new critical knowledge of the problematic consequences that follow from such strong investment in media and communications in contemporary theory. The book is divided into four parts. The first part presents the research problem, the approach and the theoretical contexts of the study. Following the introduction in Chapter 1, I identify the core elements of globalization theory in Chapter 2. At the heart of globalization theory is the claim that recent decades have witnessed massive changes in the spatio-temporal constitution of society, caused by new media and communications in particular, and that these changes necessitate the rethinking of the foundations of social theory as a whole. Chapter 3 introduces three paradigms of media research (the political economy of media, cultural studies and medium theory), the discussion of which will make it easier to understand the key issues and controversies that emerge in academic globalization theorists' treatment of media and communications. The next two parts offer a close reading of four theorists whose works I use as entry points into academic debates on globalization. I argue that we can make sense of mainstream positions on globalization by dividing them into two paradigms: on the one hand, media-technological explanations of globalization and, on the other, cultural globalization theory.
As examples of the former, I discuss the works of Manuel Castells (Chapter 4) and Scott Lash (Chapter 5). I maintain that their analyses of globalization processes are overtly media-centric and result in an unhistorical and uncritical understanding of social power in an era of capitalist globalization. A related evaluation of the second paradigm (cultural globalization theory), as exemplified by Arjun Appadurai and John Tomlinson, is presented in Chapter 6. I argue that, owing to their rejection of the importance of nation states and of the notion of cultural imperialism for cultural analysis, and their replacement with a framework of media-generated deterritorializations and flows, these theorists underplay the importance of the neoliberalization of cultures throughout the world. The fourth part (Chapter 7) presents a central research finding of this study, namely that the media-centrism of globalization theory can be understood in the context of the emergence of neoliberalism. I find it problematic that at the same time as capitalist dynamics have been strengthened in social and cultural life, advocates of globalization theory have directed attention to media-technological changes and their sweeping socio-cultural consequences, instead of analyzing the powerful material forces that shape society and culture. I further argue that this shift serves not only analytical but also utopian functions, that is, the longing for a better world in times when such longing is otherwise considered impracticable.