27 results for Averaging Theorem


Relevance: 10.00%

Abstract:

Volatile organic compounds (VOCs) are emitted into the atmosphere from natural and anthropogenic sources, vegetation being the dominant source on a global scale. Some of these reactive compounds are deemed major contributors to or inhibitors of aerosol particle formation and growth, thus making VOC measurements essential for current climate change research. This thesis discusses ecosystem scale VOC fluxes measured above a boreal Scots pine dominated forest in southern Finland. The flux measurements were performed using the micrometeorological disjunct eddy covariance (DEC) method combined with proton transfer reaction mass spectrometry (PTR-MS), an online technique for measuring VOC concentrations. The measurement, calibration, and calculation procedures developed in this work proved to be well suited to long-term VOC concentration and flux measurements with PTR-MS. A new averaging approach based on running averaged covariance functions improved the determination of the lag time between wind and concentration measurements, which is a common challenge in DEC when measuring fluxes near the detection limit. The ecosystem scale emissions of methanol, acetaldehyde, and acetone were substantial. These three oxygenated VOCs made up about half of the total emissions, with the rest consisting of monoterpenes. Contrary to the traditional assumption that monoterpene emissions from Scots pine originate mainly as evaporation from specialized storage pools, the DEC measurements indicated a significant contribution from de novo biosynthesis to the ecosystem scale monoterpene emissions. This thesis offers practical guidelines for long-term DEC measurements with PTR-MS. In particular, the new averaging approach to lag time determination seems useful in the automation of DEC flux calculations. Seasonal variation in monoterpene biosynthesis and the detailed structure of a revised hybrid algorithm, describing both de novo and pool emissions, should be determined in further studies to improve biological realism in the modelling of monoterpene emissions from Scots pine forests. The increasing number of DEC measurements of oxygenated VOCs will probably enable better estimates of the role of these compounds in plant physiology and tropospheric chemistry.

Keywords: disjunct eddy covariance, lag time determination, long-term flux measurements, proton transfer reaction mass spectrometry, Scots pine forests, volatile organic compounds
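The lag-time step described above lends itself to a short illustration. The sketch below is a minimal Python example with assumed variable names and window length (the thesis' actual implementation is not reproduced here): the lag is taken as the shift that maximizes the absolute cross-covariance between vertical wind speed and VOC concentration, and that maximum is stabilized by averaging the covariance functions of recent flux periods.

```python
import numpy as np

def cross_covariance(w, c, max_lag):
    """Cross-covariance between vertical wind speed w and concentration c
    for lags 0..max_lag (in samples); both series are mean-removed."""
    w = w - np.nanmean(w)
    c = c - np.nanmean(c)
    lags = np.arange(max_lag + 1)
    cov = np.array([np.nanmean(w[:len(w) - k] * c[k:]) for k in lags])
    return lags, cov

def lag_from_running_average(cov_functions, n_periods=10):
    """Pick the lag from a running average of the covariance functions of
    the most recent flux periods rather than from a single noisy function
    (the window of 10 periods is an assumption for illustration)."""
    avg_cov = np.mean(cov_functions[-n_periods:], axis=0)
    return int(np.argmax(np.abs(avg_cov)))
```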

Relevance: 10.00%

Abstract:

Nucleation is the first step in a phase transition where small nuclei of the new phase start appearing in the metastable old phase, such as the appearance of small liquid clusters in a supersaturated vapor. Nucleation is important in various industrial and natural processes, including atmospheric new particle formation: between 20 % and 80 % of the atmospheric particle concentration is due to nucleation. These atmospheric aerosol particles have a significant effect on both climate and human health. Different simulation methods are often applied when studying phenomena that are difficult or even impossible to measure, or when trying to distinguish between the merits of various theoretical approaches. Such simulation methods include, among others, molecular dynamics and Monte Carlo simulations. In this work molecular dynamics simulations of the homogeneous nucleation of Lennard-Jones argon have been performed. Homogeneous means that the nucleation does not occur on a pre-existing surface. The simulations include runs where the starting configuration is a supersaturated vapor and the nucleation event is observed during the simulation (direct simulations), as well as simulations of a cluster in equilibrium with a surrounding vapor (indirect simulations). The latter type is a necessity when the conditions prevent the occurrence of a nucleation event within a reasonable timeframe in the direct simulations. The effect of various temperature control schemes on the nucleation rate (the rate of appearance of clusters that are equally likely to grow to macroscopic sizes or to evaporate) was studied and found to be relatively small. The method used to extract the nucleation rate was also found to be of minor importance. The cluster sizes from direct and indirect simulations were used in conjunction with the nucleation theorem to calculate formation free energies for the clusters in the indirect simulations. The results agreed with density functional theory, but were higher than values from Monte Carlo simulations. The formation energies were also used to calculate the surface tension of the clusters. The sizes of the clusters in the direct and indirect simulations were compared, showing that the direct simulation clusters have more atoms between the liquid-like core of the cluster and the surrounding vapor. Finally, the performance of various nucleation theories in predicting simulated nucleation rates was investigated, and the results, among other things, once again highlighted the inadequacy of the classical nucleation theory that is commonly employed in nucleation studies.
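For reference, the nucleation theorem used above connects the supersaturation dependence of the critical cluster's formation free energy, and of the nucleation rate, to the critical cluster size. A common textbook formulation (the notation here is assumed, not taken from the thesis):

```latex
% First nucleation theorem (common textbook form; notation assumed)
\left(\frac{\partial \Delta G^{*}}{\partial \Delta\mu}\right)_{T} = -n^{*},
\qquad
\left(\frac{\partial \ln J}{\partial \ln S}\right)_{T} \approx n^{*} + 1,
\qquad
\Delta\mu = k_{\mathrm{B}} T \ln S,
```

where ΔG* is the formation free energy of the critical cluster, n* the number of molecules in it, J the nucleation rate and S the supersaturation.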

Relevance: 10.00%

Abstract:

The liquidity crisis that swept through the financial markets in 2007 triggered multi-billion losses and forced buyouts of some large banks. The resulting credit crunch is sometimes compared to the Great Depression of the early twentieth century. But the crisis also serves as a reminder of the significance of the interbank market and of proper central bank policy in this market. This thesis deals with the implementation of monetary policy in the interbank market and examines how central bank tools affect commercial banks' decisions. I answer the following questions:
• What is the relationship between the policy setup and interbank interest rate volatility? (an averaging reserve requirement reduces the volatility)
• What can explain a weak relationship between market liquidity and the interest rate? (a high reserve requirement buffer)
• What determines banks' decisions on when to satisfy the reserve requirement? (market frictions)
• How did the liquidity crisis that began in 2007 affect interbank market behaviour? (it resulted in higher credit risk and trading frictions as well as an expected liquidity shortage)

Relevance: 10.00%

Abstract:

A growing body of empirical research examines the structure and effectiveness of corporate governance systems around the world. An important insight from this literature is that corporate governance mechanisms address the excessive use of managerial discretion to extract private benefits by expropriating shareholder value. One possible means of expropriation is to reduce the quality of disclosed earnings by manipulating the financial statements. According to the value relevance theorem, this lower quality of earnings should then be reflected in the firm's stock price. Hence, instead of testing the direct effect of corporate governance on the firm's market value, it is important to understand the causes of the lower quality of accounting earnings. This thesis contributes to the literature by increasing knowledge about the extent of earnings management (measured as the extent of discretionary accruals in total disclosed earnings) and its determinants across transitional European countries. The thesis comprises three essays of empirical analysis, of which the first two utilize data on Russian listed firms, whereas the third essay uses data from 10 European economies. More specifically, the first essay adds to existing research connecting earnings management to corporate governance. It tests the impact of the Russian corporate governance reforms of 2002 on the quality of disclosed earnings in all publicly listed firms. This essay provides empirical evidence that the desired impact of the reforms is not fully realized in Russia without proper enforcement. Instead, firm-level factors such as long-term capital investments and compliance with International Financial Reporting Standards (IFRS) determine the quality of the earnings. The results presented in the essay support the notion proposed by Leuz et al. (2003) that reforms aimed at bringing transparency do not produce the desired results in economies where investor protection is low and legal enforcement is weak. The second essay focuses on the relationship between internal control mechanisms, such as the types and levels of ownership, and the quality of disclosed earnings in Russia. The empirical analysis shows that controlling shareholders in Russia use their power to manipulate reported performance in order to obtain private benefits of control. Comparatively, firms owned by the State have significantly better quality of disclosed earnings than firms held by other controllers such as oligarchs and foreign corporations. Interestingly, the market performance of firms controlled by either the State or oligarchs is better than that of widely held firms. The third essay provides evidence that both ownership structures and economic characteristics are important factors in determining the quality of disclosed earnings in three groups of European countries. The evidence suggests that ownership structure is a more important determinant in developed and transparent countries, while economic characteristics are more important in developing and transitional countries.
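The abstract measures earnings management as discretionary accruals but does not spell out the estimation model; in this literature a Jones-type regression is the usual workhorse. The sketch below shows the modified Jones model as one plausible choice, with hypothetical column names, not as the thesis' actual specification: discretionary accruals are the residuals of a regression of scaled total accruals on scaled revenue changes (net of receivables changes) and scaled property, plant and equipment.

```python
import numpy as np
import statsmodels.api as sm

def discretionary_accruals(df):
    """Modified Jones model (illustrative): the residual of the accruals
    regression is the discretionary component. In practice the model is
    estimated per industry-year; it is pooled here for brevity.
    Expected columns (hypothetical names): total_accruals, lag_assets,
    d_revenue, d_receivables, ppe."""
    y = (df["total_accruals"] / df["lag_assets"]).to_numpy()
    X = sm.add_constant(np.column_stack([
        1.0 / df["lag_assets"],
        (df["d_revenue"] - df["d_receivables"]) / df["lag_assets"],
        df["ppe"] / df["lag_assets"],
    ]))
    fit = sm.OLS(y, X, missing="drop").fit()
    return fit.resid  # discretionary (abnormal) accruals
```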

Relevance: 10.00%

Abstract:

After Gödel's incompleteness theorems and the collapse of Hilbert's programme, Gerhard Gentzen continued the quest for consistency proofs of Peano arithmetic. He considered a finitistic or constructive proof still possible and necessary for the foundations of mathematics. For a proof to be meaningful, the principles relied on should be considered more reliable than the doubtful elements of the theory concerned. He worked out a total of four proofs between 1934 and 1939. This thesis examines Gentzen's consistency proofs for arithmetic from different angles. The consistency of Heyting arithmetic is shown both in a sequent calculus notation and in natural deduction. The former proof includes a cut elimination theorem for the calculus and a syntactical study of the purely arithmetical part of the system. The latter consistency proof, in standard natural deduction, has been an open problem since the publication of Gentzen's proofs. The solution to this problem for an intuitionistic calculus is based on a normalization proof by Howard. The proof is performed in the manner of Gentzen, by giving a reduction procedure for derivations of falsity. In contrast to Gentzen's proof, the procedure contains a vector assignment. The reduction reduces the first component of the vector, and this component can be interpreted as an ordinal less than epsilon_0, thus ordering the derivations by complexity and proving termination of the process.
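For orientation, the ordinal epsilon_0 that bounds the vector components above is the least fixed point of exponentiation with base omega; a standard way to summarize its role in Gentzen-style proofs (a textbook statement, not a quotation from the thesis) is the following.

```latex
% epsilon_0: the limit of the tower of omegas, the least alpha with omega^alpha = alpha
\varepsilon_{0} \;=\; \sup\{\omega,\ \omega^{\omega},\ \omega^{\omega^{\omega}},\ \dots\},
\qquad
\omega^{\varepsilon_{0}} = \varepsilon_{0}.
% Gentzen-style result: primitive recursive arithmetic plus transfinite
% induction up to \varepsilon_{0} proves the consistency of Peano arithmetic,
% while induction up to any fixed ordinal below \varepsilon_{0} is already
% provable in Peano arithmetic itself.
```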

Relevance: 10.00%

Abstract:

In this thesis I study various quantum coherence phenomena and create some of the foundations for a systematic coherence theory. So far, the approach to quantum coherence in science has been purely phenomenological. In my thesis I try to answer the question of what quantum coherence is and how it should be approached within the framework of physics, the metatheory of physics and the terminology related to them. It is worth noting that quantum coherence is a conserved quantity that can be exactly defined. I propose a way to define quantum coherence mathematically from the density matrix of the system. Degenerate quantum gases, i.e., Bose condensates and ultracold Fermi systems, form a good laboratory for studying coherence, since their entropy is small and coherence is large, and thus they exhibit strong coherence phenomena. Concerning coherence phenomena in degenerate quantum gases, I concentrate in my thesis mainly on the collective association of atoms into molecules, Rabi oscillations and decoherence. It appears that collective association and oscillations do not depend on the spin-statistics of the particles. Moreover, I study the logical features of decoherence in closed systems via a simple spin model. I argue that decoherence is a valid concept also in systems that can experience recoherence, i.e., Poincaré recurrences. Metatheoretically this is a remarkable result, since it justifies quantum cosmology: studying the whole universe (i.e., physical reality) purely quantum physically is meaningful and valid science, in which decoherence explains why the quantum physical universe appears very classical to cosmologists and other scientists. The study of the logical structure of closed systems also reveals that sufficiently complex closed (physical) systems obey a principle similar to Gödel's incompleteness theorem in logic. According to this principle, it is impossible to describe a closed system completely from within the system, and the inside and outside descriptions of the system can differ remarkably. Understanding this feature may make it possible to comprehend coarse-graining better and to define uniquely the mutual entanglement of quantum systems.
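The claim that coherence can be defined exactly from the density matrix can be illustrated with one measure that is common in the later literature (an illustration only, not necessarily the definition proposed in the thesis): relative to a fixed reference basis {|i>}, the l1-norm of coherence sums the magnitudes of the off-diagonal elements of the density matrix.

```latex
% l1-norm coherence with respect to a fixed basis (illustrative choice)
C_{\ell_{1}}(\rho) \;=\; \sum_{i \neq j} \lvert \rho_{ij} \rvert,
\qquad
\rho_{ij} = \langle i \vert \rho \vert j \rangle,
```

which vanishes exactly for states that are diagonal, i.e. incoherent, in the chosen basis.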

Relevance: 10.00%

Abstract:

The relationship between site characteristics and understorey vegetation composition was analysed with quantitative methods, especially from the viewpoint of site quality estimation. Theoretical models were applied to an empirical data set collected from the upland forests of southern Finland, comprising 104 sites dominated by Scots pine (Pinus sylvestris L.) and 165 sites dominated by Norway spruce (Picea abies (L.) Karsten). Site index H100 was used as an independent measure of site quality. A new model for the estimation of site quality at sites with a known understorey vegetation composition was introduced. It is based on the application of Bayes' theorem to the density function of site quality within the study area, combined with species-specific presence-absence response curves. The resulting posterior probability density function may be used to calculate an estimate of the site variable. Using this method, a jackknife estimate of site index H100 was calculated separately for pine- and spruce-dominated sites. The results indicated that the cross-validation root mean squared error (RMSEcv) of the estimates improved from 2.98 m down to 2.34 m relative to the "null" model (the standard deviation of the sample distribution) in pine-dominated forests. In spruce-dominated forests, RMSEcv decreased from 3.94 m down to 3.16 m. In order to assess these results, four other estimation methods based on understorey vegetation composition were applied to the same data set. The results showed that none of the methods was clearly superior to the others. In pine-dominated forests, RMSEcv varied between 2.34 and 2.47 m, and the corresponding range for spruce-dominated forests was from 3.13 to 3.57 m.
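A minimal sketch of the Bayes'-theorem step described above, assuming presence-absence response curves supplied as callables and a prior density for the site index evaluated on a grid (the functional forms, names and the choice of posterior-mean point estimate are assumptions for illustration; the thesis' fitted curves are not reproduced here):

```python
import numpy as np

def posterior_site_index(presence, curves, h_grid, prior):
    """Posterior density of site index H100 on a grid of candidate values.
    presence: dict mapping species name -> observed 0/1 presence
    curves:   dict mapping species name -> callable p(h), the probability
              of presence as a function of site index h
    h_grid:   1-D array of candidate H100 values
    prior:    prior density of H100 evaluated on h_grid"""
    likelihood = np.ones_like(h_grid, dtype=float)
    for species, observed in presence.items():
        p = np.clip(curves[species](h_grid), 1e-9, 1.0 - 1e-9)
        likelihood *= p if observed else (1.0 - p)
    posterior = likelihood * prior
    posterior /= np.trapz(posterior, h_grid)  # Bayes' theorem, normalized
    return posterior

# A point estimate could then be, e.g., the posterior mean:
# h_hat = np.trapz(h_grid * posterior, h_grid)
```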

Relevance: 10.00%

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census had developed a sampling design for the Current Population Survey (CPS) in the 1940s. Another significant factor was that digital computers became available to statisticians. At the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem. They were published in a memoir in 1774, which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which was depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the 1894 meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea, which still prevails, was that the sample should be a miniature of the population. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, at the beginning of the 20th century, carried out several surveys in the UK. He also developed the theory of statistical inference for finite populations. It was based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science. He revolutionized the theory of statistics. In addition, he introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are to draw repeated samples from the same population and to assume that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory of double sampling. This gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design for the CPS. An important criterion was to have a method whose data collection costs were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.

Relevance: 10.00%

Abstract:

The purpose of this thesis was to apply the theoretical and empirical foundations of repeated games to Finnish research data. The operational dynamics of a cartel are modelled as a repeated game, a branch of game theory. In a repeated game, the same one-shot game is played over several rounds. The infinitely repeated game gives rise to the general theory of repeated games (the Folk Theorem), in which each player has an individually rational cycle of behaviour. Cooperation with the other player increases the total payoff a player accumulates over that cycle. Cartel research cannot bypass the legal perspective, so it is also briefly included in the presentation. In a silent or implicit cartel (tacit collusion) there is, unlike in an open cartel, no communication between the parties, but the outcome is the same. For this reason, tacit collusion is prohibited as a concerted practice. Since the identifying marks are also partly the same, cartel research has obtained valuable measurement data from the behaviour of detected cartels. Even research based solely on price data has a solid theoretical and empirical foundation. In the legal literature and in practice, price parallelism, together with other indicators, has been regarded as circumstantial evidence of a cartel. The retail gasoline market is structurally a favourable field for a repeated game. The empirical part of the thesis examined the retail gasoline market of the Helsinki metropolitan area; the data set contained samples of price time series from 1 August 2004 to 30 June 2005 from a total of 116 filling stations in Espoo, Helsinki and Vantaa. The research method was repeated-measures analysis of variance with post hoc comparisons. Statistically significant pricing parallelism among nearby stations was found at 47 stations, which therefore exhibit one of the identifying marks of a cartel. Stations with parallel pricing formed clusters within their competition areas, delineated by traffic connections; in total there were 21 such parallel-pricing clusters. Nine of these clusters were so-called mixed pairs, i.e. an unmanned station paired with a service station. In most cases the unmanned station was the most expensive in its area. The most important sources of the thesis:
Abrantes-Metz, Rosa M. – Froeb, Luke M. – Geweke, John F. – Taylor, Christopher T. (2005): A variance screen for collusion. Working paper no. 275, Bureau of Economics, Federal Trade Commission, Washington DC 20580.
Dutta, Prajit K. (1999): Strategies and Games, Theory and Practice. The MIT Press, Cambridge, Massachusetts, London, England.
Harrington, Joseph E. (2004): Detecting cartels. Working paper. Johns Hopkins University.
Ivaldi, Marc – Jullien, Bruno – Rey, Patrick – Seabright, Paul – Tirole, Jean (2003): The Economics of Tacit Collusion. Publication of the European Commission Directorate-General for Competition.
Phlips, Louis (1996): On the detection of collusion and predation. European Economic Review 40 (1996), 495–510.
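To make the Folk-theorem logic concrete, a standard grim-trigger sustainability condition from textbook repeated-game theory (an illustration, not a formula taken from the thesis): collusion is sustainable in the infinitely repeated game whenever the discount factor delta satisfies

```latex
\frac{\pi_{C}}{1-\delta} \;\ge\; \pi_{D} + \frac{\delta\,\pi_{N}}{1-\delta}
\quad\Longleftrightarrow\quad
\delta \;\ge\; \frac{\pi_{D}-\pi_{C}}{\pi_{D}-\pi_{N}},
```

where pi_C is the per-period collusive profit, pi_D the one-shot deviation profit and pi_N the profit in the punishment (static Nash) phase, with pi_D > pi_C > pi_N.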

Relevance: 10.00%

Abstract:

Various Tb theorems play a key role in modern harmonic analysis. They provide characterizations of the boundedness of Calderón-Zygmund type singular integral operators. The general philosophy is that to conclude the boundedness of an operator T on some function space, one needs only to test it on some suitable function b. The main object of this dissertation is to prove very general Tb theorems. The dissertation consists of four research articles and an introductory part. The framework is general with respect to the domain (a metric space), the measure (an upper doubling measure) and the range (a UMD Banach space). Moreover, the testing conditions used are weak. In the first article a (global) Tb theorem on non-homogeneous metric spaces is proved. One of the main technical components is the construction of a randomization procedure for the metric dyadic cubes. The difficulty lies in the fact that metric spaces do not, in general, have a translation group. Also, the measures considered are more general than in the existing literature. This generality is genuinely important for some applications, including the result of Volberg and Wick concerning the characterization of measures for which the analytic Besov-Sobolev space embeds continuously into the space of square integrable functions. In the second article a vector-valued extension of the main result of the first article is considered. This theorem is a new contribution to the vector-valued literature, since previously such general domains and measures were not allowed. The third article deals with local Tb theorems in both the homogeneous and non-homogeneous situations. A modified version of the general non-homogeneous proof technique of Nazarov, Treil and Volberg is extended to cover the case of upper doubling measures. This technique is also used in the homogeneous setting to prove local Tb theorems with the weak testing conditions introduced by Auscher, Hofmann, Muscalu, Tao and Thiele. This gives a completely new and direct proof of such results, utilizing the full force of non-homogeneous analysis. The final article concerns the sharp weighted theory of maximal truncations of Calderón-Zygmund operators. This includes a reduction to certain Sawyer-type testing conditions, which are in the spirit of Tb theorems and thus of the dissertation. The article extends the sharp bounds previously known only for untruncated operators, and also proves sharp weak type results, which are new even for untruncated operators. New techniques are introduced to overcome the difficulties caused by the non-linearity of maximal truncations.
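For orientation, the non-homogeneity assumption referred to above is usually formalized as follows (a standard definition in this line of work; the notation here is assumed): a Borel measure mu on a metric space is upper doubling if there is a dominating function lambda(x, r), non-decreasing in r, such that

```latex
\mu\bigl(B(x,r)\bigr) \;\le\; \lambda(x,r)
\qquad\text{and}\qquad
\lambda(x,2r) \;\le\; C_{\lambda}\,\lambda(x,r)
\qquad\text{for all } x \text{ and } r > 0.
```

This covers both doubling measures (take lambda(x,r) = mu(B(x,r))) and measures of polynomial growth (take lambda(x,r) = C r^d).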

Relevance: 10.00%

Abstract:

This thesis is concerned with the area of vector-valued harmonic analysis, where the central theme is to determine how results from classical harmonic analysis generalize to functions with values in an infinite dimensional Banach space. The work consists of three articles and an introduction. The first article studies the Rademacher maximal function that was originally defined by T. Hytönen, A. McIntosh and P. Portal in 2008 in order to prove a vector-valued version of Carleson's embedding theorem. The boundedness of the corresponding maximal operator on Lebesgue-Bochner spaces defines the RMF-property of the range space. It is shown that the RMF-property is equivalent to a weak type inequality, which does not depend, for instance, on the integrability exponent, hence providing more flexibility for the RMF-property. The second article, written in collaboration with T. Hytönen, studies a vector-valued Carleson's embedding theorem with respect to filtrations. An earlier proof of the dyadic version assumed that the range space satisfies a certain geometric type condition, which this article shows to be also necessary. The third article deals with vector-valued generalizations of tent spaces, originally defined by R. R. Coifman, Y. Meyer and E. M. Stein in the 1980s, and concerns especially those related to square functions. A natural assumption on the range space is then the UMD-property. The main result is an atomic decomposition for tent spaces with integrability exponent one. In order to suit the stochastic integrals appearing in the vector-valued formulation, the proof is based on a geometric lemma for cones and differs essentially from the classical proof. Vector-valued tent spaces have also found applications in functional calculi for bisectorial operators. In the introduction these three themes come together in the study of paraproduct operators for vector-valued functions. The Rademacher maximal function and Carleson's embedding theorem were already applied by Hytönen, McIntosh and Portal in order to prove boundedness for the dyadic paraproduct operator on Lebesgue-Bochner spaces, assuming that the range space satisfies both the UMD- and RMF-properties. Whether UMD implies RMF is thus an interesting question. Tent spaces, on the other hand, provide a method to study continuous time paraproduct operators, although the RMF-property is not yet understood in the framework of tent spaces.
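Since the UMD-property recurs throughout, its standard definition may help (a textbook formulation, not specific to these articles): a Banach space X is UMD if for some, equivalently every, p in (1, infinity) there is a constant beta such that for all finite martingale difference sequences (d_k) in L^p(Omega; X) and all signs epsilon_k in {-1, +1},

```latex
\Bigl\| \sum_{k=1}^{n} \varepsilon_{k} d_{k} \Bigr\|_{L^{p}(\Omega;X)}
\;\le\; \beta \,
\Bigl\| \sum_{k=1}^{n} d_{k} \Bigr\|_{L^{p}(\Omega;X)}.
```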

Relevance: 10.00%

Abstract:

Fire is an important driver of the boreal forest ecosystem, and a useful tool for the restoration of degraded forests. However, we lack knowledge on the ecological processes initiated by prescribed fires, and on whether they bring about the desired restoration effects. The purpose of this study was to investigate the impacts of low-intensity experimental prescribed fires on four ecological processes in young commercial Scots pine (Pinus sylvestris) stands eight years after the burning. The processes of interest were tree mortality, dead wood creation, regeneration and fire scar formation. These were inventoried in twelve study plots, which were 30 m x 30 m in size. The plots belonged to two stand age classes: 30-35 years or 45 years old at the time of burning. The study was partly a follow-up of study plots researched by Sidoroff et al. (2007) one year after burning in 2003. Tree mortality increased from 183 stems ha-1 in 2003 to 259 stems ha-1 in 2010, corresponding to 15 % and 21 % of stem number, respectively. Most mortality occurred in stands of the younger age class, in smaller diameter classes and among species other than Scots pine. By 2010, the average mortality of Scots pine per plot was 18 %, but it varied greatly, ranging from 0 % to 63 % of stem number. Delayed mortality, i.e. mortality that occurred between 2 and 8 years after the fire, seemed to become more important with increasing diameter. The input of dead wood also varied greatly between plots, from none to 72 m3 ha-1, averaging 12 m3 ha-1. The proportion of fire-scarred trees per plot ranged from none to 20 %. Four out of twelve plots (43 %) did not have any fire scars. Scars were on average small: 95 % of scars were less than 4 cm in width, and 75 % less than 40 cm in length. Owing to the light nature of the fire, the remaining overstorey and the thick organic layer, regeneration was poor overall. The abundance of pine and other seedlings indicated that a viable seed source existed, but the seedlings failed to establish under the dense canopy. The number of saplings ranged from 0 to 12 333 stems ha-1. The results of this study indicate that a low-intensity fire does not necessarily initiate the ecological processes of tree mortality, dead wood creation and regeneration at the desired scale. Fire scars, which form the basis of fire dating in fire history studies, did not form in all cases.