11 results for CONSTANTS

in Helda - Digital Repository of the University of Helsinki


Relevance:

10.00%

Publisher:

Abstract:

Bertrand Russell (1872-1970) introduced the English-speaking philosophical world to modern, mathematical logic and the foundational study of mathematics. The present study concerns the conception of logic that underlies his early logicist philosophy of mathematics, formulated in The Principles of Mathematics (1903). In 1967, Jean van Heijenoort published a paper, Logic as Language and Logic as Calculus, in which he argued that the early development of modern logic (roughly the period 1879-1930) can be understood when considered in the light of a distinction between two essentially different perspectives on logic. According to the view of logic as language, logic constitutes the general framework for all rational discourse, or meaningful use of language, whereas the conception of logic as calculus regards logic more as a symbolism which is subject to reinterpretation. The calculus-view paves the way for systematic metatheory, where logic itself becomes a subject of mathematical study (model theory). Several scholars have interpreted Russell's views on logic with the help of the interpretative tool introduced by van Heijenoort. They have commonly argued that Russell's is a clear-cut case of the view of logic as language. In the present study a detailed reconstruction of the view and its implications is provided, and it is argued that the interpretation is seriously misleading as to what he really thought about logic. I argue that Russell's conception is best understood by setting it in its proper philosophical context, which is constituted by Immanuel Kant's theory of mathematics. Kant had argued that purely conceptual thought (basically, the logical forms recognised in Aristotelian logic) cannot capture the content of mathematical judgments and reasonings. Mathematical cognition is grounded not in logic but in space and time as the pure forms of intuition.
Against this view, Russell argued that once logic is developed into a proper tool which can be applied to mathematical theories, Kant's views turn out to be completely wrong. In the present work the view is defended that Russell's logicist philosophy of mathematics, or the view that mathematics is really only logic, is based on what I term the 'Bolzanian account of logic'. According to this conception, (i) the distinction between form and content is not explanatory in logic; (ii) the propositions of logic have genuine content; (iii) this content is conferred upon them by special entities, 'logical constants'. The Bolzanian account, it is argued, is both historically important and throws genuine light on Russell's conception of logic.

Relevance:

10.00%

Publisher:

Abstract:

This dissertation explores the role of the German minister to Helsinki, Wipert von Blücher (1883-1963), within the German-Finnish relations of the late 1930s and the Second World War. Blücher was a key figure – and certainly one of the constants – within German Finland policy and the complex international diplomacy surrounding Finland. Despite representing Hitler’s Germany, he was not a National Socialist in the narrower sense of the term, but a conservative civil servant in the Wilhelmine tradition of the German foreign service. Along with a significant number of career diplomats, Blücher attempted to restrict National Socialist influence on the exercise of German foreign policy, whilst successfully negotiating a modus vivendi with the new regime. The study of his political biography in the Third Reich hence provides a highly representative example of how the traditional élites of Germany were caught in a cycle of conformity and, albeit tacit, opposition. Above all, however, the biographical study of Blücher and his behaviour offers a hitherto unexplored approach to the history of German-Finnish relations. His unusually long tenure in Helsinki covered the period leading up to the so-called Winter War, which left Blücher severely distraught by Berlin’s effectively pro-Soviet neutrality and brought him close to resigning his post. It further extended to the German-Finnish rapprochement of 1940/41 and the military cooperation of both countries from mid-1941 to 1944. Throughout, Blücher developed a diverse and ambitious set of policy schemes, largely rooted in the tradition of Wilhelmine foreign policy. In their moderation and commonsensical realism, his designs – indeed his entire conception of foreign policy – clashed with the foreign political and ideological premises of the National Socialist regime. In its theoretical grounding, the analysis of Blücher’s political schemes is built on the concept of alternative policy and indebted to A.J.P.
Taylor’s definition of dissent in foreign policy. It furthermore rests upon the assumption, introduced by Wolfgang Michalka, that National Socialist foreign policy was dominated by a plurality of rival conceptions, players, and institutions competing for Hitler’s favour (‘Konzeptionen-Pluralismus’). Although primarily a study in the history of international relations, my research has substantially benefited from more recent developments within cultural history, particularly research on nobility and élites, and the renewed focus on autobiography and conceptions of the self. On an abstract level, the thesis touches upon some of the basic components of German politics, political culture, and foreign policy in the first half of the 20th century: national belonging and conflicting loyalties, self-perception and representation, élites and their management of power, the modern history of German conservatism, the nature and practice of diplomacy, and, finally, the intricate relationship between the ethics of the professional civil service and absolute moral principles. Against this backdrop, the examination of Blücher’s role both within Finnish politics and the foreign policy of the Third Reich highlights the biographical dimension of the German-Finnish relationships, while fathoming the determinants of individual human agency in the process.

Relevance:

10.00%

Publisher:

Abstract:

Le naturalisme finlandais. Une conception entropique du quotidien. Finnish Naturalism. An Entropic Conception of Everyday Life. Nineteenth-century naturalism was a strikingly international literary movement. After emerging in France in the 1870s, it spread all over Europe, including young, small nations with a relatively recent literary tradition, such as Finland. This thesis surveys the role and influence of French naturalism on the Finnish literature of the 1880s and 1890s. On the basis of a selection of works by six Finnish authors (Juhani Aho, Minna Canth, Kauppis-Heikki, Teuvo Pakkala, Ina Lange and Karl August Tavaststjerna), the study establishes a view of the main features of Finnish naturalism in comparison with that of French authors such as Zola, Maupassant and Flaubert. The study's methodological framework is genre theory: even though naturalist writers insisted on a transparent description of reality, naturalist texts are firmly rooted in general generic categories with definable relations and constants on which European novels impose variations. By means of two key concepts, 'entropy' and 'everyday life', this thesis establishes the parameters of the naturalist genre. At the heart of the naturalist novel is a movement in the direction of disintegration and confusion, from order to disorder, from illusion to disillusion. This entropic vision is merged into the representation of everyday life, focusing on socially mediocre characters and discovering their miseries in all their banality and daily grayness. By using Mikhail Bakhtin's idea of literary genres as a means of understanding experience, this thesis suggests that everyday life is an ideological core of naturalist literature that determines not only its thematic but also its generic distinctions: in relation to other genres, such as Balzac's realism, naturalism appears primarily to be a banalization of everyday life.
In idyllic genres, everyday life can be represented by means of sublimation, but a naturalist novel establishes a distressing, negative everyday life and thus strives to take a critical view of modern society. Besides the central themes, the study surveys the generic blends in naturalism. The thesis analyzes how the coalition of naturalism and the melodramatic mode in the work of Minna Canth serves naturalism's ambition to discover the unconscious instincts underlying daily realities, and how the symbolic mode in the work of Juhani Aho duplicates the semantic level of the apparently insignificant, everyday naturalist details. The study compares the naturalist novel to the ideological novel (roman à thèse) and surveys the central dilemma of naturalism, the confrontation between the optimistic belief in social reform and the pessimistic theory of determinism. The thesis proposes that the naturalist novel's contribution to social reform lies in its shock effect. By means of representing the unpleasant truth, the entropy of everyday life, it aims to scandalize the reader and make him aware of the harsh realities that might also apply to him.

Relevance:

10.00%

Publisher:

Abstract:

Determination of testosterone and related compounds in body fluids is of utmost importance in doping control and the diagnosis of many diseases. Capillary electromigration techniques are a relatively new approach for steroid research. Owing to their electrical neutrality, however, separation of steroids by capillary electromigration techniques requires the use of charged electrolyte additives that interact with the steroids either specifically or non-specifically. The analysis of testosterone and related steroids by non-specific micellar electrokinetic chromatography (MEKC) was investigated in this study. The partial filling (PF) technique was employed, being suitable for detection by both ultraviolet spectrophotometry (UV) and electrospray ionization mass spectrometry (ESI-MS). Efficient, quantitative PF-MEKC UV methods for steroid standards were developed through the use of optimized pseudostationary phases comprising surfactants and cyclodextrins. PF-MEKC UV proved to be a more sensitive, efficient and repeatable method for the steroids than PF-MEKC ESI-MS. It was discovered that in PF-MEKC analyses of electrically neutral steroids, ESI-MS interfacing sets significant limitations not only on the chemistry affecting the ionization and detection processes, but also on the separation. The new PF-MEKC UV method was successfully employed in the determination of testosterone in male urine samples after microscale immunoaffinity solid-phase extraction (IA-SPE). The IA-SPE method, relying on specific interactions between testosterone and a recombinant anti-testosterone Fab fragment, is the first such method described for testosterone. Finally, new data for interactions between steroids and human and bovine serum albumins were obtained through the use of affinity capillary electrophoresis. A new algorithm for the calculation of association constants between proteins and neutral ligands is introduced.
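The abstract does not spell out the new algorithm, but a common way to obtain a 1:1 association constant from affinity capillary electrophoresis is to fit the observed ligand mobility against protein concentration. The sketch below illustrates that standard approach only; it is not the thesis's algorithm, and all names and values are made up for demonstration.

```python
# Standard 1:1 binding model for affinity CE (illustrative, not the
# thesis's method): mu_obs = (mu_f + mu_c*K*[P]) / (1 + K*[P]),
# where mu_f / mu_c are the free / complexed ligand mobilities.
def mu_obs(P, K, mu_f, mu_c):
    return (mu_f + mu_c * K * P) / (1.0 + K * P)

def fit_Ka(P_vals, mu_vals, mu_f, mu_c, K_grid):
    """Least-squares grid search for the association constant K_a."""
    best_K, best_sse = None, float("inf")
    for K in K_grid:
        sse = sum((mu_obs(P, K, mu_f, mu_c) - m) ** 2
                  for P, m in zip(P_vals, mu_vals))
        if sse < best_sse:
            best_K, best_sse = K, sse
    return best_K

# Synthetic, noise-free demonstration data generated with K = 5.0e3 M^-1.
K_true, mu_f, mu_c = 5.0e3, 2.0e-4, 0.5e-4
P_vals = [1e-5, 5e-5, 1e-4, 5e-4, 1e-3]
mu_vals = [mu_obs(P, K_true, mu_f, mu_c) for P in P_vals]
K_est = fit_Ka(P_vals, mu_vals, mu_f, mu_c,
               K_grid=[i * 100.0 for i in range(1, 101)])
```

In practice a proper nonlinear least-squares fit (and noisy data) would replace the grid search, but the binding isotherm itself is the same.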

Relevance:

10.00%

Publisher:

Abstract:

The Standard Model of particle physics consists of quantum electrodynamics (QED) and the weak and strong nuclear interactions. QED is the basis for molecular properties, and thus it defines much of the world we see. The weak nuclear interaction is responsible for decays of nuclei, among other things, and in principle it should also have effects at the molecular scale. The strong nuclear interaction is hidden in interactions inside nuclei. From high-energy and atomic experiments it is known that the weak interaction does not conserve parity. Consequently, the weak interaction, and specifically the exchange of the Z^0 boson between a nucleon and an electron, induces small energy shifts of different sign for mirror-image molecules. This in turn makes one enantiomer of a molecule energetically more favorable than the other, and also shifts the spectral lines of the mirror-image pair of molecules in different directions, creating a split. Parity violation (PV) in molecules, however, has not been observed. The topic of this thesis is how the weak interaction affects certain molecular magnetic properties, namely certain parameters of nuclear magnetic resonance (NMR) and electron spin resonance (ESR) spectroscopies. The thesis consists of numerical estimates of NMR and ESR spectral parameters and investigations of how different aspects of quantum chemical computation affect them. PV contributions to the NMR shielding and spin-spin coupling constants are investigated from the computational point of view. All aspects of quantum chemical electronic structure computations are found to be very important, which makes accurate computations challenging. Effects of molecular geometry are also investigated using a model system of polysilyene chains. The PV contribution to the NMR shielding constant is found to saturate after the chain reaches a certain length, but the effects of local geometry can be large.
Rigorous vibrational averaging is also performed for a relatively small and rigid molecule. Vibrational corrections to the PV contribution are found to be only a couple of per cent. PV contributions to the ESR g-tensor are also evaluated using a series of molecules. Unfortunately, all the estimates are below the experimental limits, but PV in some of the heavier molecules comes close to present-day experimental resolution.
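The energy-shift mechanism described above is conventionally summarized as follows (a standard textbook relation, not quoted from the thesis):

```latex
% The two enantiomers acquire parity-violating shifts of opposite sign,
% so an energy level E_0 splits between the mirror images:
E_{\pm} = E_{0} \pm E_{\mathrm{PV}}, \qquad
\Delta E = E_{+} - E_{-} = 2\,E_{\mathrm{PV}}
```

Because the shift enters levels of both enantiomers with opposite sign, a spectral line of the mirror-image pair is displaced in opposite directions, which is the split the abstract refers to.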

Relevance:

10.00%

Publisher:

Abstract:

Aminopolykarboksyylaatteja, kuten etyleenidiamiinitetraetikkahappoa (EDTA), on käytetty useiden vuosikymmenien ajan erinomaisen metalli-ionien sitomiskyvyn vuoksi kelatointiaineena lukuisissa sovelluksissa sekä analytiikassa että monilla teollisisuuden aloilla. Näiden yhdisteiden biohajoamattomuus on kuitenkin herättänyt huolta viime aikoina, sillä niiden on havaittu olevan hyvin pysyviä luonnossa. Tämä työ on osa laajempaa tutkimushanketta, jossa on tavoitteena löytää korvaavia kelatointiaineita EDTA:lle. Tutkimuksen aiheena on kuuden kelatointiaineen metalli-ionien sitomiskyvyn kartoitus. EDTA:a paremmin luonnossa hajoavina nämä ovat ympäristöystävällisiä ehdokkaita korvaaviksi kelatointiaineiksi useisiin sovelluksiin. Työssä tutkittiin niiden kompleksinmuodostusta useiden metalli-ionien kanssa potentiometrisella titrauksella. Metalli-ionivalikoima vaihteli hieman kelatointiaineesta riippuen sisältäen magnesium-, kalsium-, mangaani-, rauta-, kupari-, sinkki-, kadmium-, elohopea-, lyijy- ja lantaani-ionit. Tutkittavat metallit oli valittu tähtäimessä olevien sovellusten, synteesissä ilmenneiden ongelmien tai ympäristönäkökohtien perusteella. Tulokset osoittavat näiden yhdisteiden metallinsitomiskyvyn olevan jonkin verran heikompi kuin EDTA:lla, mutta kuitenkin riittävän useisiin sovelluksiin kuten sellunvalkaisuprosessiin. Myrkyllisten raskasmetallien, kadmiumin, elohopen ja lyijyn kohdalla EDTA:a heikompi sitoutuminen on eduksikin, koska se yhdistettynä parempaan biohajoavuuteen saattaa alentaa tutkittujen yhdisteiden kykyä mobilisoida kyseisiä metalleja sedimenteistä. Useimmilla tutkituista yhdisteistä on ympäristönäkökulmasta etuna myös EDTA:a pienempi typpipitoisuus.

Relevance:

10.00%

Publisher:

Abstract:

Let X be a topological space and K the real algebra of the reals, the complex numbers, the quaternions, or the octonions. The functions from X to K form an algebra T(X,K) with pointwise addition and multiplication. We study first-order definability of the constant function set N' corresponding to the set of the naturals in certain subalgebras of T(X,K). In the vocabulary the symbols Constant, +, *, 0', and 1' are used, where Constant denotes the predicate defining the constants, and 0' and 1' denote the constant functions with values 0 and 1, respectively. The most important result is the following. Let X be a topological space, K the real algebra of the reals, the complex numbers, the quaternions, or the octonions, and R a subalgebra of the algebra of all functions from X to K containing all constants. Then N' is definable in the structure (R; Constant, +, *, 0', 1') if at least one of the following conditions is true. (1) The algebra R is a subalgebra of the algebra of all continuous functions containing a piecewise open mapping from X to K. (2) The space X is sigma-compact, and R is a subalgebra of the algebra of all continuous functions containing a function whose range contains a nonempty open set of K. (3) The algebra K is the set of the reals or the complex numbers, and R contains a piecewise open mapping from X to K and does not contain an everywhere unbounded function. (4) The algebra R contains a piecewise open mapping from X to the set of the reals and a function whose range contains a nonempty open subset of K; furthermore, R does not contain an everywhere unbounded function.

Relevance:

10.00%

Publisher:

Abstract:

The study seeks to find out whether the real burden of personal taxation has increased or decreased. In order to determine this, we investigate how the same real income has been taxed in different years. Whenever the taxes for the same real income for a given year are higher than for the base year, the real tax burden has increased. If they are lower, the real tax burden has decreased. The study thus seeks to estimate how changes in the tax regulations affect the real tax burden. It should be kept in mind that the progression in the central government income tax schedule ensures that a real change in income will bring about a change in the tax ratio. Inflation will also increase the real tax burden if the tax schedules are kept nominally the same. In the calculations of the study it is assumed that the real income remains constant, so that we can get an unbiased measure of the effects of governmental actions in real terms. The main factors influencing the amount of income taxes an individual must pay are as follows: - Gross income (income subject to central and local government taxes). - Deductions from gross income and taxes calculated according to tax schedules. - The central government income tax schedule (progressive income taxation). - The rates for the local taxes and for social security payments (proportional taxation). In the study we investigate how much a certain group of taxpayers would have paid in taxes according to the actual tax regulations prevailing in different years if the income were kept constant in real terms. Other factors affecting tax liability are kept strictly unchanged (as constants). The resulting taxes, expressed in fixed prices, are then compared to the taxes levied in the base year (hypothetical taxation).
The question we are addressing is thus how much taxes a certain group of taxpayers with the same socioeconomic characteristics would have paid on the same real income according to the actual tax regulations prevailing in different years. This has been suggested as the main way to measure real changes in taxation, although there are several alternative measures with essentially the same aim. Next, an aggregate indicator of changes in income tax rates is constructed. It is designed to show how much the taxation of income has increased or decreased from one year to the next on average. The main question remains: how should aggregation over all income levels be performed? In order to determine the average real changes in the tax scales, the difference functions (differences between the actual and hypothetical taxation functions) were aggregated using taxable income as weights. Besides the difference functions, the relative changes in real taxes can be used as indicators of change. In this case the ratio between the taxes computed according to the new and the old situation indicates whether the taxation has become heavier or lighter. The relative changes in tax scales can be described in a way similar to that used in describing the cost of living, or by means of price indices. For example, we can use Laspeyres' price index formula for computing the ratio between taxes determined by the new tax scales and the old tax scales. The formula answers the question: how much more or less will be paid in taxes according to the new tax scales than according to the old ones when the real income situation corresponds to the old situation? In real terms the central government tax burden experienced a steady decline from its high post-war level up until the mid-1950s. The real tax burden then drifted upwards until the mid-1970s. The real level of taxation in 1975 was twice that of 1961. In the 1980s there was a steady phase due to the inflation corrections of tax schedules.
In 1989 the tax schedule fell drastically, and from the mid-1990s tax schedules have decreased the real tax burden significantly. Local tax rates have risen continuously from 10 percent in 1948 to nearly 19 percent in 2008. Deductions have lowered the real tax burden especially in recent years. Aggregate figures indicate how the tax ratio for the same real income has changed over the years according to the prevailing tax regulations. We call the tax ratio calculated in this manner the real income tax ratio. A change in the real income tax ratio depicts an increase or decrease in the real tax burden. The real income tax ratio declined after the war for some years. From the beginning of the 1960s to the mid-1970s it nearly doubled. From the mid-1990s the real income tax ratio has fallen by about 35%.
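The Laspeyres-style aggregation described above can be sketched as follows. The tax schedules and incomes here are purely illustrative (not actual Finnish tax scales): both the new and the old schedule are evaluated on the same fixed real incomes, and the ratio of the totals indicates whether the real tax burden has grown or shrunk.

```python
# Laspeyres-style tax index: taxes under new vs. old tax scales,
# both computed on the same (old-situation) real income distribution.
def tax_index(incomes, old_tax, new_tax):
    """Ratio of total new-scale taxes to total old-scale taxes.
    A value below 1 means the real tax burden has decreased."""
    old_total = sum(old_tax(y) for y in incomes)
    new_total = sum(new_tax(y) for y in incomes)
    return new_total / old_total

old_tax = lambda y: 0.30 * y          # hypothetical old flat schedule
new_tax = lambda y: 0.25 * y          # hypothetical new, lighter schedule
incomes = [20_000, 35_000, 60_000]    # the same real incomes in both years

ratio = tax_index(incomes, old_tax, new_tax)
```

With these flat schedules the ratio is 0.25/0.30, i.e. the real tax burden fell by one sixth; with progressive schedules, weighting by taxable income matters and the ratio depends on the income distribution.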

Relevance:

10.00%

Publisher:

Abstract:

The protein kinases (PKs) belong to the largest single family of enzymes, the phosphotransferases, which catalyze the phosphorylation of other enzymes and proteins and function primarily in signal transduction. Consequently, PKs regulate cell mechanisms such as growth, differentiation, and proliferation. Dysfunction of these cellular mechanisms may lead to cancer, a major predicament in health care. Even though there is a range of clinically available cancer-fighting drugs, the increasing number of cancer cases and setbacks such as drug resistance keep cancer research active. At the commencement of this study, an isophthalic acid derivative had been suggested to bind to the regulatory domain of protein kinase C (PKC). In order to investigate the biological effects and structure-activity relationships (SARs) of this new chemical entity, a library of compounds was synthesized. The best compounds induced apoptosis in human leukemia HL-60 cells and were not cytotoxic in Swiss 3T3 fibroblasts. In addition, the best apoptosis inducers were neither cytotoxic nor mutagenic. Furthermore, results from binding affinity assays of PKC isoforms revealed the pharmacophores of these isophthalic acid derivatives. The best inhibition constants of the tested compounds were measured at 210 nM for PKCα and 530 nM for PKCδ. Among natural compounds targeting the regulatory domain of PKC, the target of bistramide A has been a matter of debate. It was initially found to activate PKCδ; however, actin was recently reported as the main target. In order to clarify and further study the biological effects of bistramide A, the total syntheses of the natural compound and two isomers were performed. Biological assays of the compounds revealed accumulation of 4n polyploid cells as the primary mode of action, and the compounds showed similar overall antiproliferative activities.
However, each compound showed a distinct distribution of antimitotic effect, presumably via actin binding, proapoptotic effect, presumably via PKCδ, and pro-differentiation effect, as evidenced by CD11b expression. Furthermore, it was shown that the antimitotic and proapoptotic effects of bistramide A were not secondary effects of actin binding but independent effects. The third aim of this study was to synthesize a library of a new class of urea-based type II inhibitors targeted at the kinase domain of anaplastic lymphoma kinase (ALK). The best compounds in this library showed IC50 values as low as 390 nM for ALK, while the initially low cellular activities were successfully increased more than 70-fold for NPM-ALK-positive BaF3 cells. More importantly, selective antiproliferative activity on ALK-positive cell lines was achieved: while the best compound affected the BaF3 and SU-DHL-1 cells with IC50 values of 0.5 and 0.8 μM, respectively, it was less toxic to the NPM-ALK-negative human leukemic U937 cells (IC50 = 3.2 μM) and BaF3 parental cells (IC50 = 5.4 μM). Furthermore, SAR studies of the synthesized compounds revealed functional groups and positions of the scaffold which enhanced the enzymatic and cellular activities.

Relevance:

10.00%

Publisher:

Abstract:

Litter quality and environmental effects on Scots pine (Pinus sylvestris L.) fine woody debris (FWD) decomposition were examined in three forestry-drained peatlands representing different site types along a climatic gradient from north boreal (Northern Finland) to south boreal (Southern Finland) and hemiboreal (Central Estonia) conditions. Decomposition (percent mass loss) of FWD with diameter <= 10 mm (twigs) and FWD with diameter > 10 mm (branches) was measured using the litter bag method over 1-4-year periods. Overall, decomposition rates increased from north to south, the rate constants (k values) varying from 0.128 to 0.188 yr^-1 and from 0.066 to 0.127 yr^-1 for twigs and branches, respectively. On average, twigs had lost 34%, 19% and 19%, and branches 25%, 17% and 11% of their initial mass after 2 years of decomposition at the hemiboreal, south boreal and north boreal sites, respectively. After 4 years at the south boreal site the values were 48% for twigs and 42% for branches. Based on earlier studies, we suggest that the decomposition rates we determined may be used for estimating Scots pine FWD decomposition in the boreal zone, also in upland forests. Explanatory models accounted for 50.4% and 71.2% of the total variation in FWD decomposition rates when the first two years and all years were considered, respectively. The variables most related to FWD decomposition included the initial ash, water extractives and Klason lignin content of the litter, and cumulative site precipitation minus potential evapotranspiration. Simulations of inputs and decomposition of Scots pine FWD and needle litter in south boreal conditions over a 60-year period showed that 72 g m^-2 of organic matter from FWD vs. 365 g m^-2 from needles accumulated in the forest floor. The annual inputs varied from 5.7 to 15.6 g m^-2 and from 92 to 152 g m^-2 for FWD and needles, respectively.
Each thinning caused an increase in FWD inputs, up to 510 g m^-2, while the needle inputs did not change dramatically. Because the annual FWD inputs were lowered following the thinnings, the overall effect of thinnings on C accumulation from FWD was slightly negative. The contribution of FWD to soil C accumulation, relative to needle litter, seems to be rather minor in boreal Scots pine forests.
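The reported rate constants imply the usual single-exponential litter decay model, mass_t = mass_0 * exp(-k * t). The sketch below applies that standard model to the twig k values quoted above; it is an illustration of how such constants are used, not code from the study.

```python
import math

# First-order (single-exponential) litter decomposition model,
# as implied by litter-bag rate constants k (units: yr^-1).
def fraction_remaining(k, years):
    """Fraction of initial litter mass left after `years` of decomposition."""
    return math.exp(-k * years)

# Twig rate constants from the study ranged from 0.128 (north boreal)
# to 0.188 (hemiboreal) yr^-1; compute the implied 2-year mass loss.
losses_2yr = {k: 1.0 - fraction_remaining(k, 2.0) for k in (0.128, 0.188)}
```

For k = 0.188 yr^-1 this predicts roughly 31% mass loss over two years, of the same order as the 34% observed for hemiboreal twigs; the gap reflects the faster early-stage decomposition that a single-exponential fit averages out.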

Relevance:

10.00%

Publisher:

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census in the 1940s had developed a sampling design for the Current Population Survey (CPS). A significant factor was also that digital computers became available to statisticians. In the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem. They were published in a memoir in 1774, which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly. This was depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. In 1894, at the International Statistical Institute meeting, the Norwegian Anders Kiaer presented the idea of the Representative Method to draw samples. Its idea was that the sample would be a miniature of the population. It is still prevailing. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and in the beginning of the 20th century carried out several surveys in the UK. He also developed the theory of statistical inference for finite populations. It was based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science. He revolutionized the theory of statistics.
In addition, he introduced a new statistical inference model which is still the prevailing paradigm. The essential idea is to draw samples repeatedly from the same population and to assume that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling. This gave the central idea for statisticians at the U.S. Census Bureau to develop the complex survey design for the CPS. An important criterion was to have a method in which the costs of data collection were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.