13 results for 230110 Calculus of Variations and Control Theory
in Helda - Digital Repository of University of Helsinki
Abstract:
This work investigates the role of narrative literature in late-20th century and contemporary Anglo-American moral philosophy. It aims to show the trend of reading narrative literature for purposes of moral philosophy from the 1970s and early 80s to the present day as a part of a larger movement in Anglo-American moral philosophy, and to present a view of its significance for moral philosophy overall. Chapter 1 provides some preliminaries concerning the view of narrative literature which my discussion builds on. In chapter 2 I give an outline of how narrative literature is considered in contemporary Anglo-American moral philosophy, and connect this use to the broad trend of neo-Aristotelian ethics in this context. In chapter 3 I connect the use of literature to the idea of the non-generalizability of moral perception and judgment, which is central to the neo-Aristotelian trend, as well as to a range of moral particularisms and anti-theoretical positions of late 20th century and contemporary ethics. The joint task of chapters 2 and 3 is to situate the trend of reading narrative literature for the purposes of moral philosophy in the present context of moral philosophy. In the following two chapters, 4 and 5, I move on from the particularizing power of narrative literature, which is emphasized by neo-Aristotelians and particularists alike, to a broader understanding of the intellectual potential of narrative literature. In chapter 4 I argue that narrative literature has its own forms of generalization which are enriching for our understanding of the workings of ethical generalizations in philosophy. In chapter 5 I discuss Iris Murdoch's and Martha Nussbaum's respective ways of combining ethical generality and particularity in a philosophical framework where both systematic moral theory and narrative literature are taken seriously. In chapter 6 I analyse the controversy between contemporary anti-theoretical conceptions of ethics and Nussbaum's refutation of these.
I present my suggestion for how the significance of the ethics/literature discussion for moral philosophy can be understood if one wants to overcome the limitations of both Nussbaum's theory-centred, equilibrium-seeking perspective and the anti-theorists' repudiation of theory. I call my position the "inclusive approach".
Abstract:
Bertrand Russell (1872-1970) introduced the English-speaking philosophical world to modern, mathematical logic and the foundational study of mathematics. The present study concerns the conception of logic that underlies his early logicist philosophy of mathematics, formulated in The Principles of Mathematics (1903). In 1967, Jean van Heijenoort published a paper, "Logic as Language and Logic as Calculus", in which he argued that the early development of modern logic (roughly the period 1879-1930) can be understood when considered in the light of a distinction between two essentially different perspectives on logic. According to the view of logic as language, logic constitutes the general framework for all rational discourse, or meaningful use of language, whereas the conception of logic as calculus regards logic more as a symbolism which is subject to reinterpretation. The calculus-view paves the way for systematic metatheory, where logic itself becomes a subject of mathematical study (model theory). Several scholars have interpreted Russell's views on logic with the help of the interpretative tool introduced by van Heijenoort. They have commonly argued that Russell's is a clear-cut case of the view of logic as language. In the present study a detailed reconstruction of the view and its implications is provided, and it is argued that this interpretation is seriously misleading as to what he really thought about logic. I argue that Russell's conception is best understood by setting it in its proper philosophical context, constituted by Immanuel Kant's theory of mathematics. Kant had argued that purely conceptual thought (basically, the logical forms recognised in Aristotelian logic) cannot capture the content of mathematical judgments and reasonings. Mathematical cognition is not grounded in logic but in space and time as the pure forms of intuition.
As against this view, Russell argued that once logic is developed into a proper tool which can be applied to mathematical theories, Kant's views turn out to be completely wrong. In the present work the view is defended that Russell's logicist philosophy of mathematics, or the view that mathematics is really only logic, is based on what I term the "Bolzanian account of logic". According to this conception, (i) the distinction between form and content is not explanatory in logic; (ii) the propositions of logic have genuine content; (iii) this content is conferred upon them by special entities, "logical constants". The Bolzanian account, it is argued, is both historically important and throws genuine light on Russell's conception of logic.
Abstract:
In the future the number of disabled drivers requiring a special evaluation of their driving ability will increase due to the ageing population and the progress of adaptive technology. This places pressure on the development of the driving evaluation system. Despite quite intensive research, there is still no consensus concerning what the factual situation in a driver evaluation is (methodology), which measures should be included in an evaluation (methods), and how an evaluation has to be carried out (practice). In order to find answers to these questions we carried out empirical studies, and simultaneously elaborated a conceptual model of driving and driving evaluation. The findings of the empirical studies can be condensed into the following points: 1) Driving ability as defined by the on-road driving test is associated with different laboratory measures depending on the study group. Faults in the laboratory tests predicted faults in the on-road driving test in the novice group, whereas slowness in the laboratory predicted driving faults in the experienced drivers' group. 2) The Parkinson study clearly showed that even an experienced clinician cannot reliably accomplish an evaluation of a disabled person's driving ability without collaboration with other specialists. 3) The main finding of the stroke study was that the use of a multidisciplinary team as a source of information harmonises the specialists' evaluations. 4) The patient studies demonstrated that disabled persons themselves, as well as their spouses, are as a rule not reliable evaluators. 5) From the safety point of view, perceptible operations with the control devices are not crucial; rather, the correct mental actions which the driver carries out with the help of the control devices are of greatest importance.
6) Personality factors, including higher-order needs and motives, attitudes and a degree of self-awareness, particularly a sense of illness, are decisive when evaluating a disabled person's driving ability. Personality is also the main source of resources for compensating for lower-order physical deficiencies and restrictions. From work with the conceptual model we drew the following methodological conclusions: First, the driver has to be considered as a holistic subject of the activity, a multilevel, hierarchically organised system of an organism, a temperament, an individuality and a personality, where the personality is the leading subsystem from the standpoint of safety. Second, driving, as a human form of sociopractical activity, is also a hierarchically organised dynamic system. Third, an evaluation of driving ability is a question of matching these two hierarchically organised structures: a subject of an activity and a proper activity. Fourth, an evaluation has to be person-centred, not disease-, function- or method-centred. On the basis of our study, a multidisciplinary team (practitioner, driving school teacher, psychologist, occupational therapist) is recommended for use in demanding driver evaluations. Primary in a driver evaluation is a coherent conceptual model, while the concrete methods of evaluation may vary. However, the on-road test must always be performed if possible.
Detection of major mite pests of Apis mellifera and development of non-chemical control of varroasis
Abstract:
Despite improving levels of hygiene, the incidence of registered foodborne disease has remained at the same level for many years: there were 40 to 90 epidemics in which 1000-9000 persons contracted food poisoning through food or drinking water in Finland. Until 2004, salmonella and campylobacter were the most common bacterial causes of foodborne disease, but in 2005-2006 Bacillus cereus was the most common. A similar development was reported, e.g. in Germany, already in the 1990s. One reason for this can be Bacillus cereus and its emetic toxin, cereulide. Bacillus cereus is a common environmental bacterium that contaminates raw materials of food. Unlike salmonella and campylobacter, Bacillus cereus is a heat-resistant bacterium, capable of surviving most cooking procedures due to its production of highly thermoresistant spores. The food involved has usually been heat treated, and surviving spores are the source of the food poisoning: the heat treatment induces germination of the spores, and the vegetative cells then produce toxins. This doctoral thesis research focuses on developing methods for assessing and eliminating risks to food safety posed by cereulide-producing Bacillus cereus. The biochemistry and physiology of cereulide production were investigated, and the results were targeted to offer tools for minimizing the toxin risk in food during production. I developed methods for the extraction and quantitative analysis of cereulide directly from food. A prerequisite for that is knowledge of the chemical and physical properties of the toxin. Because cereulide is practically insoluble in water, I used organic solvents (methanol, ethanol and pentane) for the extraction. For extraction of bakery products I used high temperature (100 °C) and pressure (103.4 bar). An alternative for effective extraction is to flood the plain food with ethanol, followed by stationary equilibration at room temperature.
I used this protocol for extracting cereulide from potato puree and penne. Using this extraction method it is also possible to extract cereulide from liquid food, such as milk. These extraction methods are important improvements for the study of Bacillus cereus emetic food poisonings: prior to my work, cereulide extraction was done using water, and as a result the yield was poor and variable. To investigate suspected food poisonings, it is important to show the actual toxicity of the incriminated food. Many toxins, but not cereulide, are inactivated during food processing such as heating. The next step is to identify the toxin by chemical methods. With my colleague Maria Andesson I developed a rapid assay for the detection of cereulide toxicity within 5 to 15 minutes. By applying this test it is possible to rapidly detect which food caused the food poisoning. The chemical identification of cereulide was achieved using mass spectrometry. I used cereulide-specific molecular ions, m/z (+/-0.3) 1153.8 (M+H+), 1171.0 (M+NH4+), 1176.0 (M+Na+) and 1191.7 (M+K+), for reliable identification. I investigated foods to find out their amenability to accumulate cereulide. Cereulide was formed in high amounts (0.3 to 5.5 microg/g wet wt) when cereulide-producing B. cereus strains were present in beans, rice, rice pastry and meat pastry stored at non-refrigerated temperatures (21-23 °C). Rice and meat pastries are frequently consumed under conditions where no cooled storage is available, e.g. at picnics and outdoor events. Bacillus cereus is a ubiquitous spore former and is therefore difficult to eliminate from foods. It is therefore important to know which conditions affect the formation of cereulide in foods. My research showed that the cereulide content was strongly affected (10- to 1000-fold differences in toxin content) by the growth environment of the bacterium. Storage of foods under a nitrogen atmosphere (> 99.5 %) prevented the production of cereulide.
However, when carbon dioxide was also present, minimizing the oxygen content (< 1%) did not protect the food from the formation of cereulide in preliminary experiments. Food supplements also affected cereulide production, at least in the laboratory. Adding the free amino acids leucine and valine stimulated cereulide production 10- to 20-fold. In peptide-bonded form these amino acids are natural constituents of all proteins. Interestingly, adding peptide-bonded leucine and valine had no significant effect on cereulide production. The free amino acids leucine and valine are approved food supplements and widely used as flavour modifiers in food technology. My research showed that these food supplements may increase the food poisoning risk even though they are not toxic themselves.
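The mass-spectrometric identification described in this abstract lends itself to a simple screening check. The sketch below (Python; the function name and interface are mine, not from the thesis) matches an observed m/z value against the four cereulide adduct ions listed, using the stated +/- 0.3 tolerance:

```python
# Reference ions for cereulide taken from the abstract; the helper
# itself is an illustrative sketch, not part of the thesis methods.
CEREULIDE_ADDUCTS = {
    "M+H+": 1153.8,
    "M+NH4+": 1171.0,
    "M+Na+": 1176.0,
    "M+K+": 1191.7,
}

def match_cereulide_ion(mz, tolerance=0.3):
    """Return the adduct label whose reference m/z lies within
    `tolerance` of the observed value, or None if no ion matches."""
    for label, ref in CEREULIDE_ADDUCTS.items():
        if abs(mz - ref) <= tolerance:
            return label
    return None
```

For example, an observed peak at m/z 1171.2 would be reported as the ammonium adduct M+NH4+, while a peak at 1160.0 matches none of the listed ions.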
Abstract:
Atomic Layer Deposition (ALD) is a chemical, gas-phase thin-film deposition method. It is known for accurate and precise thickness control, and for uniform and conformal film growth. One area where ALD has not yet excelled is film deposition at low temperatures. The deposition of metals other than the noble metals has also proven quite challenging. To alleviate these limitations, more aggressive reactants are required. One such group of reactants are radicals, which may be formed by dissociating gases; dissociation is most conveniently done with a plasma source. For example, by dissociating molecular oxygen or hydrogen, oxygen or hydrogen radicals are generated. The use of radicals in ALD may surmount some of the above limitations: oxide film deposition at low temperatures may become feasible if oxygen radicals are used, as they are highly reactive, and since hydrogen radicals are very effective reducing agents, they may be used to deposit metals. In this work, a plasma source was incorporated into an existing ALD reactor for radical generation, and the reactor was used to study five different Radical Enhanced ALD (REALD) processes. The modifications to the existing reactor and the different possibilities during the modification process are discussed. The studied materials include two metals, copper and silver, and three oxides: aluminium oxide, titanium dioxide and tantalum oxide. The materials were characterized and their properties were compared to other variations of the same process utilizing the same metal precursor, to understand what kind of effect the non-metal precursor has on the film properties and growth characteristics. Both metals were deposited successfully, silver for the first time by ALD. The films had low resistivity and grew conformally in the ALD mode, demonstrating that REALD of metals is true ALD.
The oxide films had exceptionally high growth rates, and aluminium oxide grew at room temperature with low cycle times, resulting in good-quality films. Both aluminium oxide and titanium dioxide were deposited on natural fibres without damaging the fibre. Tantalum oxide was also deposited successfully, with good electrical properties, but at a slightly higher temperature than the other two oxides, due to the evaporation temperature required by the metal precursor. Overall, the ability of REALD to deposit metallic and oxide films of high quality at low temperatures was demonstrated.
Abstract:
This PhD thesis is about certain infinite-dimensional Grassmannian manifolds that arise naturally in geometry, representation theory and mathematical physics. From the physics point of view one encounters these infinite-dimensional manifolds when trying to understand the second quantization of fermions. The many-particle Hilbert space of the second-quantized fermions is called the fermionic Fock space. A typical element of the fermionic Fock space can be thought of as a linear combination of configurations of "m particles and n anti-particles". Geometrically, the fermionic Fock space can be constructed as the holomorphic sections of a certain (dual) determinant line bundle lying over the so-called restricted Grassmannian manifold, which is a typical example of an infinite-dimensional Grassmannian manifold one encounters in QFT. The construction should be compared with its well-known finite-dimensional analogue, where one realizes an exterior power of a finite-dimensional vector space as the space of holomorphic sections of a determinant line bundle lying over a finite-dimensional Grassmannian manifold. The connection with infinite-dimensional representation theory stems from the fact that the restricted Grassmannian manifold is an infinite-dimensional homogeneous (Kähler) manifold, i.e. it is of the form G/H where G is a certain infinite-dimensional Lie group and H its subgroup. A central extension of G acts on the total space of the dual determinant line bundle and also on the space of its holomorphic sections; thus G admits a (projective) representation on the fermionic Fock space. This construction also induces the so-called basic representation for loop groups (of compact groups), which in turn are vitally important in string theory / conformal field theory. The thesis consists of three chapters: the first chapter is an introduction to the background material, and the other two chapters are individually written research articles.
The first article deals in a new way with a well-known question in Yang-Mills theory: when can one lift the action of the gauge transformation group on the space of connection one-forms to the total space of the Fock bundle in a way compatible with the second-quantized Dirac operator? In general there is an obstruction to this (the Faddeev-Mickelsson anomaly), and various geometric interpretations of this anomaly, using such things as group extensions and bundle gerbes, have been given earlier. In this work we give a new geometric interpretation of the Faddeev-Mickelsson anomaly in terms of differentiable gerbes (certain sheaves of categories) and central extensions of Lie groupoids. The second research article deals with the question of how to define a Dirac-like operator on the restricted Grassmannian manifold, which is an infinite-dimensional space and hence outside the landscape of standard Dirac operator theory. The construction relies heavily on infinite-dimensional representation theory, and one of the most technically demanding challenges is to introduce proper normal orderings for certain infinite sums of operators in such a way that all divergences disappear and the infinite sum makes sense as a well-defined operator acting on a suitable Hilbert space of spinors. This research article was motivated by a more extensive ongoing project to construct twisted K-theory classes in Yang-Mills theory via a Dirac-like operator on the restricted Grassmannian manifold.
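The finite-dimensional analogue mentioned in this abstract can be written out explicitly; this is a standard Borel-Weil-type identification (the notation Det* for the dual determinant line bundle over the Grassmannian Gr(k,n) of k-planes in C^n is mine):

```latex
% The k-th exterior power of the dual space, realized as the space of
% holomorphic sections of the dual determinant line bundle Det* on Gr(k,n):
\Lambda^{k}(\mathbb{C}^{n})^{*} \;\cong\; H^{0}\!\left(\mathrm{Gr}(k,n),\,\mathrm{Det}^{*}\right)
```

The infinite-dimensional construction described in the abstract replaces Gr(k,n) by the restricted Grassmannian and the left-hand side by the fermionic Fock space.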
Abstract:
The book presents a reconstruction, interpretation and critical evaluation of the Schumpeterian theoretical approach to socio-economic change. The analysis focuses on the problem of social evolution, on the interpretation of the innovation process and business cycles and, finally, on Schumpeter's optimistic neglect of ecological-environmental conditions as possible factors influencing social-economic change. The author investigates how the Schumpeterian approach describes the process of social and economic evolution, and how the logic of transformations is described, explained and understood in the Schumpeterian theory. The material of the study includes Schumpeter's works written after 1925, a related part of the commentary literature on these works, and a selected part of the related literature on the innovation process, technological transformations and the problem of long waves. Concerning the period after 1925, the Schumpeterian oeuvre is conceived and analysed as a more or less homogeneous corpus of texts. The book is divided into 9 chapters. Chapters 1-2 describe the research problems and methods. Chapter 3 is an effort to provide a systematic reconstruction of Schumpeter's ideas concerning social and economic evolution. Chapters 4 and 5 focus their analysis on the innovation process. In Chapters 6 and 7 Schumpeter's theory of business cycles is examined. Chapter 8 evaluates Schumpeter's views concerning his relative neglect of ecological-environmental conditions as possible factors influencing social-economic change. Finally, Chapter 9 draws the main conclusions.
Abstract:
Agriculture's contribution to climate change is controversial, as agriculture is a significant source of greenhouse gases but also a sink of carbon. Hence its economic and technological potential to mitigate climate change has been argued to be noteworthy. However, the social profitability of emission mitigation depends on factors beyond the emission reductions themselves, such as impacts on surface water quality and profits from production. Consequently, to value the overall results of agricultural climate emission mitigation practices, these environmental and economic co-effects should be taken into account. The objective of this thesis was to develop an integrated economic and ecological model to analyse the social welfare of crop cultivation in Finland under two distinct cultivation technologies, conventional tillage and conservation tillage (no-till). Further, we ask whether it would be privately or socially profitable to allocate some of the barley cultivation to alternative land uses, such as green set-aside or afforestation, when production costs, greenhouse gas emissions and water quality impacts are taken into account. In the theoretical framework we depict the optimal input use and land allocation choices in terms of environmental impacts and profit from production, and derive the optimal tax and payment policies for climate- and water-quality-friendly land allocation. The empirical application of the model uses Finnish data on production costs, profit structure and environmental impacts. According to our results, the given emission mitigation practices are not self-evidently beneficial for farmers or society. On the contrary, in some cases alternative land allocation could even reduce social welfare, favouring conventional crop cultivation. This is the case for mineral soils such as clay and silt soils. On organic agricultural soils, the climate mitigation practices, in this case afforestation and green fallow, give more promising results, decreasing climate emissions and nutrient runoff to water systems.
No-till technology does not seem to benefit climate mitigation, although it does decrease other environmental impacts. Nevertheless, the data on the impact of climate emission mitigation practices on production and climate are limited and partly contradictory. More specific experimental studies on the interaction of emission mitigation practices and the environment would be needed. Further study would be important; in particular, area-specific production and environmental factors, as well as food security, food safety and socio-economic impacts, should be taken into account.
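The kind of land-allocation problem described in this abstract can be sketched schematically (my notation, not the thesis's model): land shares L_j are allocated across uses j (crop under either tillage technology, green set-aside, afforestation), each with private profit pi_j, climate emissions e_j and nutrient runoff r_j, valued at assumed marginal social costs tau and mu:

```latex
% Schematic social-welfare maximization over land shares L_j (my sketch):
\max_{\{L_j\}} \; \sum_{j} L_j \left( \pi_j - \tau\, e_j - \mu\, r_j \right)
\quad \text{s.t.} \quad \sum_{j} L_j = 1, \qquad L_j \ge 0
```

In a formulation of this shape, an alternative land use is socially preferable only when its profit net of valued climate and water-quality damages exceeds that of conventional cultivation, which is consistent with the mixed results reported for mineral versus organic soils.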
Abstract:
In this thesis the current status and some open problems of noncommutative quantum field theory are reviewed. The introduction aims to put these theories in their proper context as a part of the larger program to model the properties of quantized space-time. Throughout the thesis, special focus is put on the role of noncommutative time and how its nonlocal nature presents us with problems. Applications in scalar field theories as well as in gauge field theories are presented. The infinite nonlocality of space-time introduced by the noncommutative coordinate operators leads to interesting structure and new physics. High energy and low energy scales are mixed, causality and unitarity are threatened, and in gauge theory the tools for model building are drastically reduced. As a case study in noncommutative gauge theory, the Dirac quantization condition of magnetic monopoles is examined, with the conclusion that, at least in perturbation theory, it cannot be fulfilled in noncommutative space.
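For reference, the Dirac quantization condition examined in this abstract can be stated in one common convention (Gaussian units with hbar = c = 1; the choice of convention is mine, not the thesis's):

```latex
% Dirac quantization condition relating electric charge e and
% magnetic charge g; consistency of the monopole's gauge field
% requires the product to be a half-integer:
e\,g = \frac{n}{2}, \qquad n \in \mathbb{Z}
```

The abstract's conclusion is that, at least perturbatively, no such integer n can satisfy the condition once space is made noncommutative.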
Abstract:
Listeria monocytogenes is the causative agent of the severe foodborne infection listeriosis. The number of listeriosis cases has increased in recent years in many European countries, including Finland. Contamination by the pathogen needs to be minimized, and its growth to high numbers in foods prevented, in order to reduce the incidence of human cases. The aim of this study was to evaluate the contamination routes of L. monocytogenes in the food chain and to investigate methods for control of the pathogen in food processing. L. monocytogenes was commonly found in wild birds, in the pig production chain and in pork production plants. It was found most frequently in birds feeding at a landfill site, on organic farms, in tonsil samples, and at sites associated with brining. L. monocytogenes isolates from birds, farms, food processing plants and foods did not form distinct genetic groups; the populations overlapped. The majority of genotypes recovered from birds were also detected in foods, food processing environments and other animal species, and birds may disseminate L. monocytogenes into the food chain. Similar genotypes were found in different pigs on the same farm, as well as in pigs on farms and later in the slaughterhouse. L. monocytogenes contamination thus spreads at the farm level and may be a source of contamination into slaughterhouses and further into meat. Incoming raw pork in the processing plant was frequently contaminated with L. monocytogenes, and genotypes found in raw meat were also found in the processing environment and in RTE products. Thus, raw material seems to be a considerable source of contamination into processing facilities. In the pork processing plant, the prevalence of L. monocytogenes increased in the brining area, showing that brining was an important contamination site. Recovery of the inoculated L. monocytogenes strains showed that there were strain-specific differences in the ability to survive in lettuce and dry sausage.
The ability of some L. monocytogenes strains to survive well in food production raises a challenge for industry, because these strains can be especially difficult to remove from products, and creates a need for an appropriate hurdle concept to control the most resistant strains. Control of L. monocytogenes can be implemented throughout the food chain. Farm-specific factors affected the prevalence of L. monocytogenes, and good farm-level practices can therefore be utilized to reduce the prevalence of this pathogen on the farm and possibly further along the food chain. Well-separated areas in a pork production plant had low prevalences of L. monocytogenes, showing that compartmentalization controls the pathogen in the processing line. The food processing plant, especially the brining area, should be subjected to disassembly, extensive cleaning and disinfection to eliminate persistent contamination by L. monocytogenes, and replacing brining with dry salting should be considered. All of the evaluated washing solutions decreased the populations of L. monocytogenes on precut lettuce, but did not eliminate the pathogen. Thus, the safety of fresh-cut produce cannot rely on washing with disinfectants, and high-quality raw material and good manufacturing practices remain important. L. monocytogenes was detected at higher levels in sausages without the protective culture than in sausages with this protective strain, although the numbers of L. monocytogenes decreased to < 100 MPN/g in all sausages by the end of ripening. Protective starter cultures provide an appealing hurdle in dry sausage processing and assist in the control of L. monocytogenes.