40 results for Non-polarizable Water Models


Relevance:

30.00%

Publisher:

Abstract:

Phytoplankton ecology and productivity form one of the main branches of contemporary oceanographic research, and research groups in this branch have increasingly started to utilise bio-optical applications. My main research objective was to critically investigate the advantages and deficiencies of fast repetition rate (FRR) fluorometry for studies of phytoplankton productivity and of the responses of phytoplankton to varying environmental stress. Second, I aimed to clarify the applicability of the FRR system to the optical environment of the Baltic Sea. The FRR system offers a highly dynamic tool for studies of phytoplankton photophysiology and productivity both in the field and in a controlled environment. It provides high-frequency in situ determinations of the light-acclimative and photosynthetic parameters of intact phytoplankton communities, and the measurement protocol is relatively easy to use, with no phases requiring analytical determinations. The most notable application of the FRR system lies in its potential for primary productivity (PP) estimation. However, realising this scheme is not straightforward. FRR-based PP estimates, derived from the photosynthetic electron flow (PEF) rate, are linearly related to gas-exchange (14C fixation) PP only in environments where photosynthesis is light-limited. If light limitation is absent, as is usually the case in the near-surface layers of the water column, the two PP approaches deviate. The prompt response of the PEF rate to short-term variability in the natural light field makes field comparisons between PEF-PP and 14C-PP difficult to interpret, because this variability is averaged out in the 14C incubations. Furthermore, FRR-based PP models are tuned to follow closely the vertical pattern of underwater irradiance. Because of the photoacclimational plasticity of phytoplankton, this easily leads to overestimates of water-column PP if precautionary measures are not taken. Natural phytoplankton is exposed to broad-waveband light, whereas active non-spectral bio-optical instruments, such as the FRR fluorometer, emit light in a relatively narrow waveband that by its nature does not represent the in situ light field. Thus, the spectrally dependent parameters provided by the FRR system need to be spectrally scaled to the natural light field of the Baltic Sea. In general, the requirement of spectral scaling in water bodies under terrestrial influence concerns all light-adaptive parameters provided by any active non-spectral bio-optical technique. The FRR system can be applied to all phytoplankton that harvest light efficiently in the waveband matching the bluish FRR excitation. Although these taxa cover the bulk of all phytoplankton taxa, one exception with pronounced ecological significance is found in the Baltic Sea: the FRR system cannot be used to monitor the photophysiology of cyanobacterial taxa harvesting light in the yellow-red waveband, which include the ecologically significant bloom-forming cyanobacteria of the Baltic Sea.
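
The spectral scaling mentioned above can be illustrated with a small sketch: a parameter measured under the narrow bluish FRR excitation is re-weighted using the phytoplankton absorption spectrum and the ambient light spectrum. The spectra and numbers below are toy assumptions, not data or code from the thesis.

```python
import numpy as np

# Minimal sketch (not the thesis code): spectral scaling of an FRR-derived
# functional absorption cross-section to the ambient light field.
wavelengths = np.arange(400, 701, 5)                       # nm
a_phyto  = np.exp(-((wavelengths - 440) / 60.0) ** 2)      # toy absorption peak
E_insitu = np.exp(-((wavelengths - 560) / 80.0) ** 2)      # toy green-shifted in situ light
E_frr    = np.exp(-((wavelengths - 470) / 15.0) ** 2)      # narrow bluish FRR excitation

def spectral_factor(a, E):
    """Absorption-weighted mean of a spectrum E, normalised by total E."""
    return np.trapz(a * E, wavelengths) / np.trapz(E, wavelengths)

# Scale a cross-section measured under FRR excitation to the in situ light field.
scaling = spectral_factor(a_phyto, E_insitu) / spectral_factor(a_phyto, E_frr)
sigma_frr = 3.0                                            # hypothetical measured value
print(f"spectral scaling factor: {scaling:.2f}, scaled value: {sigma_frr * scaling:.2f}")
```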

Relevance:

30.00%

Publisher:

Abstract:

Climate change contributes directly or indirectly to changes in species distributions, and there is very high confidence that recent climate warming is already affecting ecosystems. The Arctic has experienced the greatest regional warming in recent decades, and the trend is continuing. However, studies on northern ecosystems are scarce compared with those on more southerly regions, and a better understanding of past and present environmental change is needed to forecast the future. Multivariate methods were used to explore the distributional patterns of chironomids in 50 shallow (≤ 10 m) lakes in relation to 24 environmental variables determined in northern Fennoscandia, in the ecotonal area extending from the boreal forest in the south to the orohemiarctic zone in the north. The highest taxon richness was noted at middle elevations, around 400 m a.s.l.; significantly lower values were observed in cold lakes situated in the tundra zone. Lake-water alkalinity had the strongest positive correlation with taxon richness. Many taxa showed a preference for lakes in either the tundra or the forested area. The variation in the chironomid abundance data was best correlated with sediment organic content (loss on ignition, LOI), lake-water total organic carbon content, pH and air temperature, with LOI being the strongest variable. Three major lake groups were separated on the basis of their chironomid assemblages: (i) small and shallow organic-rich lakes, (ii) large and base-rich lakes, and (iii) cold and clear oligotrophic tundra lakes. The environmental variables best discriminating the lake groups were LOI, taxon richness and Mg. When repeated, this kind of approach could be a useful and efficient way of monitoring the effects of global change on species ranges. Many fast-spreading insects, including chironomids, show a remarkable ability to track environmental changes, and on the basis of this ability past environmental conditions have been reconstructed from their chitinous remains in lake sediment profiles. In order to study the Holocene environmental history of subarctic aquatic systems and to quantitatively reconstruct past temperatures at or near the treeline, long sediment cores covering the last 10,000 years (the Holocene) were collected from three lakes. Lower temperatures than expected from the presence of pine in the catchment during the mid-Holocene were reconstructed from a lake with a large water volume and depth; this lake provided a thermal refuge for profundal, cold-adapted taxa during the warm period. In a shallow lake, the decrease in the reconstructed temperatures during the late Holocene may reflect an indirect response of the midges to climate change through, for example, a change in pH. The results from the three lakes indicated that the response of chironomids to climate has been more or less indirect. However, concurrent shifts in the chironomid assemblages and vegetation of two lakes during the Holocene indicated that the midges and the terrestrial vegetation responded to the same ultimate cause, most likely Holocene climate change. This was also supported by the similarity of the long-term trends in faunal succession of the chironomid assemblages in several lakes in the area. In northern Finnish Lapland the distribution of chironomids was significantly correlated with physical and limnological factors that are most likely to change as a result of future climate change. The indirect and individualistic responses of aquatic systems to past climate change, as reconstructed from the chironomid assemblages, suggest that northern lake ecosystems will not respond to future global climate change in a single predictable way; rather, their responses will depend on the initial characteristics of the catchment area and the lake.
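
As an illustration of the reconstruction approach described above, the sketch below implements a toy weighted-averaging transfer function, the generic idea behind chironomid-based temperature reconstructions. The calibration data are random stand-ins, and real reconstructions add deshrinking and error estimation that are omitted here.

```python
import numpy as np

# Toy weighted-averaging (WA) transfer function; illustrative only.
rng = np.random.default_rng(0)
n_lakes, n_taxa = 50, 20
temps = rng.uniform(8.0, 16.0, n_lakes)                # modern temperatures for calibration lakes
abundances = rng.random((n_lakes, n_taxa))             # relative taxon abundances
abundances /= abundances.sum(axis=1, keepdims=True)

# 1) Taxon optima = abundance-weighted mean temperature across calibration lakes.
optima = (abundances * temps[:, None]).sum(axis=0) / abundances.sum(axis=0)

# 2) Reconstruction for a fossil sample = abundance-weighted mean of taxon optima.
fossil = rng.random(n_taxa)
fossil /= fossil.sum()
reconstructed_temp = (fossil * optima).sum() / fossil.sum()
print(f"reconstructed temperature: {reconstructed_temp:.1f} degrees C")
```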

Relevance:

30.00%

Publisher:

Abstract:

Accurate and stable time series of geodetic parameters can be used to help in understanding the dynamic Earth and its response to global change. The Global Positioning System (GPS) has proven invaluable in modern geodynamic studies. In Fennoscandia the first GPS networks were set up in 1993. These networks form the basis of the national reference frames in the area, but they also provide long and important time series for crustal deformation studies. These time series can be used, for example, to better constrain the ice history of the last ice age and the Earth's structure via existing glacial isostatic adjustment models. To improve the accuracy and stability of the GPS time series, possible nuisance parameters and error sources need to be minimized. We have analysed GPS time series to study two phenomena: first, the refraction of the GPS signal in the neutral atmosphere and, second, the loading of the crust by environmental factors, namely the non-tidal Baltic Sea, the atmospheric load and varying continental water reservoirs. We studied the atmospheric effects on the GPS time series by comparing the standard method with slant delays derived from a regional numerical weather model, and we present a method for correcting the atmospheric delays at the observational level. The results show that both standard atmosphere modelling and atmospheric delays derived from a numerical weather model by ray-tracing provide a stable solution. The advantage of the latter is that the number of unknowns in the computation decreases, so the computation may become faster and more robust; it can also be done with any processing software that allows the atmospheric correction to be turned off. The crustal deformation due to loading was computed by convolving Green's functions with surface load data, that is, global hydrology models, global numerical weather models and a local model for the Baltic Sea. The loading signals can be seen in the GPS coordinate time series. Subtracting the computed deformation from the vertical GPS coordinate time series reduces their scatter; the long-term trends, however, are not influenced. We show that global hydrology models and the local sea surface can explain up to 30% of the variation in the GPS time series. On the other hand, the atmospheric loading admittance in the GPS time series is low, and the different hydrological surface load models could not be validated in the present study. In order to be used for GPS corrections in the future, both the atmospheric loading and the hydrological models need further analysis and improvement.
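
One simple way to quantify how much of the GPS scatter a loading model explains, as in the "up to 30%" figure above, is a variance-reduction calculation. The sketch below uses synthetic series in place of real GPS heights and load model output.

```python
import numpy as np

# Sketch: fraction of vertical GPS scatter explained by a modelled loading series.
rng = np.random.default_rng(1)
t = np.arange(0, 3650)                                   # daily solutions, ~10 years
loading = 3.0 * np.sin(2 * np.pi * t / 365.25)           # mm, modelled crustal loading signal
gps_up  = loading + rng.normal(0.0, 4.0, t.size)         # mm, synthetic GPS height series

corrected = gps_up - loading                             # reduce the modelled deformation
explained = 1.0 - np.var(corrected) / np.var(gps_up)     # fraction of variance removed
print(f"variance of the GPS series explained by the load model: {explained:.0%}")
```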

Relevance:

30.00%

Publisher:

Abstract:

Fusion energy is a clean and safe solution to the intricate question of how to produce non-polluting and sustainable energy for a constantly growing population. The fusion process does not produce any harmful waste or greenhouse gases, since a small amount of helium is the only by-product when the hydrogen isotopes deuterium and tritium are used as fuel. Moreover, deuterium is abundant in seawater and tritium can be bred from lithium, a common metal in the Earth's crust, rendering the fuel reservoirs practically bottomless. Owing to its enormous mass, the Sun has been able to use fusion as its main energy source ever since it was born, but here on Earth we must find other means to achieve the same. Inertial fusion involving powerful lasers and thermonuclear fusion employing extreme temperatures are examples of successful methods; however, these have yet to produce more energy than they consume. In thermonuclear fusion, the fuel is held inside a tokamak, a doughnut-shaped chamber with strong magnets wrapped around it. Once the fuel is heated up, it is controlled with the help of these magnets, since the required temperatures (over 100 million degrees Celsius) separate the electrons from the nuclei, forming a plasma. When fusion reactions occur, excess binding energy is released as energetic neutrons, which are absorbed in water to produce steam that drives turbines. Keeping the power losses from the plasma low, and thus allowing a high number of reactions, is one challenge. Another challenge is related to the reactor materials: since the confinement of the plasma particles is not perfect, the reactor walls and structures are bombarded by particles, and material erosion and activation as well as plasma contamination are expected. In addition, the high-energy neutrons cause radiation damage in the materials, leading, for instance, to swelling and embrittlement. In this thesis, the behaviour of materials situated in a fusion reactor was studied using molecular dynamics simulations. Simulations of processes in the next-generation fusion reactor ITER involve the reactor materials beryllium, carbon and tungsten as well as the plasma hydrogen isotopes. This means that interaction models, i.e. interatomic potentials, are needed for this complicated quaternary system. The task of finding such potentials is nonetheless nearly complete, since models for the beryllium-carbon-hydrogen interactions were constructed in this thesis and, as a continuation of that work, a beryllium-tungsten model is under development. These potentials are combinable with the earlier tungsten-carbon-hydrogen ones. The potentials were used to explain the chemical sputtering of beryllium under deuterium plasma exposure. In experiments, a large fraction of the sputtered beryllium atoms were observed to be released as BeD molecules, and the simulations identified swift chemical sputtering, previously not believed to be important in metals, as the underlying mechanism. Radiation damage in the reactor structural materials vanadium, iron and iron-chromium, as well as in the wall material tungsten and the mixed alloy tungsten carbide, was also studied in this thesis. Interatomic potentials for vanadium, tungsten and iron were modified to be better suited for simulating the collision cascades formed during particle irradiation, and the potential features affecting the resulting primary damage were identified.
Including the often neglected electronic effects in the simulations was also shown to have an impact on the damage: with proper tuning of the electron-phonon interaction strength, experimentally measured quantities related to ion-beam mixing in iron could be reproduced. The damage in tungsten carbide alloys showed elemental asymmetry, as the major part of the damage consisted of carbon defects. On the other hand, modelling the damage in the iron-chromium alloy, essentially representing steel, showed that small additions of chromium do not noticeably affect the primary damage in iron. Since a complete assessment of the response of a material in a future full-scale fusion reactor is not achievable using experimental techniques alone, molecular dynamics simulations are of vital help. This thesis has not only provided insight into complicated reactor processes and improved current methods, but also offered tools for further simulations. It is therefore an important step towards making fusion energy more than a future goal.
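
To illustrate what an interatomic potential supplies to a molecular dynamics code, the sketch below evaluates the energy and force of a simple Lennard-Jones pair potential. The thesis itself relies on analytical bond-order potentials for the Be-C-H(-W) systems, so this is only a generic stand-in with arbitrary parameters.

```python
import numpy as np

# Illustration of an interatomic potential: pair energy and radial force versus distance.
def lj_energy_force(r, epsilon=1.0, sigma=1.0):
    """Return the Lennard-Jones pair energy and the magnitude of the radial force at r."""
    sr6 = (sigma / r) ** 6
    energy = 4.0 * epsilon * (sr6 ** 2 - sr6)
    force = 24.0 * epsilon * (2.0 * sr6 ** 2 - sr6) / r   # force = -dE/dr
    return energy, force

for r in np.linspace(0.95, 2.0, 5):
    e, f = lj_energy_force(r)
    print(f"r = {r:.2f}  E = {e:+.3f}  F = {f:+.3f}")
```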

Relevance:

30.00%

Publisher:

Abstract:

Cosmological inflation is the dominant paradigm in explaining the origin of structure in the universe. According to the inflationary scenario, there was a period of nearly exponential expansion in the very early universe, long before nucleosynthesis. Inflation is commonly considered a consequence of some scalar field or fields whose energy density starts to dominate the universe. The inflationary expansion converts the quantum fluctuations of the fields into classical perturbations on superhorizon scales, and these primordial perturbations are the seeds of the structure in the universe. Moreover, inflation also naturally explains the high degree of homogeneity and spatial flatness of the early universe. The real challenge of inflationary cosmology lies in trying to establish a connection between the fields driving inflation and theories of particle physics. In this thesis we concentrate on inflationary models at scales well below the Planck scale. The low scale allows us to seek candidates for the inflationary matter within extensions of the Standard Model, but it typically also implies fine-tuning problems. We discuss a low-scale model where inflation is driven by a flat direction of the Minimally Supersymmetric Standard Model, and we study the relation between the potential along the flat direction and the underlying supergravity model. The low inflationary scale requires an extremely flat potential, but we find that in this particular model the associated fine-tuning problems can be solved in a rather natural fashion in a class of supergravity models. For this class of models, the flatness is a consequence of the structure of the supergravity model and is insensitive to the vacuum expectation values of the fields that break supersymmetry. Another low-scale model considered in the thesis is the curvaton scenario, where the primordial perturbations originate from quantum fluctuations of a curvaton field, which is different from the fields driving inflation. The curvaton gives a negligible contribution to the total energy density during inflation, but its perturbations become significant in the post-inflationary epoch. The separation between the fields driving inflation and the fields giving rise to primordial perturbations opens up new possibilities to lower the inflationary scale without introducing fine-tuning problems. The curvaton model typically gives rise to a relatively large level of non-Gaussian features in the statistics of the primordial perturbations, and we find that this level depends heavily on the form of the curvaton potential. Future observations that provide more accurate information on the non-Gaussian statistics can therefore place constraining bounds on the curvaton interactions.
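
For orientation, the standard curvaton relations in the simplest (quadratic-potential, sudden-decay) case show why the level of non-Gaussianity is tied to the curvaton sector; these are textbook expressions, not necessarily the exact forms derived in the thesis.

```latex
% r_dec denotes the curvaton's fractional contribution to the energy density at decay.
\begin{align}
  \zeta &\simeq \frac{2 r_{\mathrm{dec}}}{3}\,\frac{\delta\sigma}{\sigma}, \\
  f_{\mathrm{NL}} &\simeq \frac{5}{4 r_{\mathrm{dec}}} - \frac{5}{3} - \frac{5 r_{\mathrm{dec}}}{6}
\end{align}
```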

Relevance:

30.00%

Publisher:

Abstract:

In this thesis we examine multi-field inflationary models of the early universe. Since non-Gaussianities may make it possible to discriminate between models of inflation, we compute deviations from a Gaussian spectrum of primordial perturbations by extending the delta-N formalism. We use N-flation as a concrete model; our findings show that these models are generically indistinguishable as long as the slow-roll approximation is valid. Besides computing non-Gaussianities, we also investigate preheating after multi-field inflation. Within the framework of N-flation, we find that preheating via parametric resonance is suppressed, an indication that it is the old theory of preheating that applies. In addition to studying non-Gaussianities and preheating in multi-field inflationary models, we study magnetogenesis in the early universe. To this end, we propose a mechanism for generating primordial magnetic fields via rotating cosmic string loops. Magnetic fields in the micro-Gauss range have been observed in galaxies and clusters, but their origin has remained elusive. We consider a network of strings and find that rotating cosmic string loops, which are continuously produced in such networks, are viable candidates for magnetogenesis with the relevant strengths and length scales, provided we use a high string tension and an efficient dynamo.
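
For reference, the generic delta-N expansion and the corresponding non-linearity parameter used in multi-field calculations are shown below; the thesis extends this formalism, so the expressions here are only the standard starting point.

```latex
% Generic delta-N expansion of the curvature perturbation and the resulting f_NL.
\begin{align}
  \zeta &= \sum_I N_{,I}\,\delta\phi^I
         + \tfrac{1}{2}\sum_{I,J} N_{,IJ}\,\delta\phi^I \delta\phi^J + \dots, \\
  \frac{6}{5} f_{\mathrm{NL}} &= \frac{\sum_{I,J} N_{,I} N_{,J} N_{,IJ}}
                                      {\bigl(\sum_K N_{,K} N_{,K}\bigr)^{2}}
\end{align}
```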

Relevance:

30.00%

Publisher:

Abstract:

This thesis studies quantile residuals and uses different methodologies to develop test statistics applicable to evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest because for such models the traditional residuals, often referred to as Pearson residuals, turn out not to be appropriate. As such models have become increasingly popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with a standard normal distribution. All the tests derived in the thesis are pure significance tests and are theoretically sound in that they properly take into account the uncertainty caused by parameter estimation. In Chapter 2, a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange multiplier or score tests, so that they are asymptotically optimal against local alternatives. Chapter 3 extends the concept of quantile residuals to multivariate models: the framework of Chapter 2 is generalized, and tests aimed at detecting non-normality, serial correlation and conditional heteroscedasticity in multivariate quantile residuals are derived from it. Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests and, in a rather restricted special case, for the normality test. In Chapter 4 the tests are constructed using the empirical distribution function of the quantile residuals, and the so-called Khmaladze martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered, so that critical bounds are obtained for histogram-type plots as well as for quantile-quantile and probability-probability plots of quantile residuals. Chapters 2, 3 and 4 contain simulations and empirical examples which illustrate the finite-sample size and power properties of the derived tests and show how the tests and related residual-based graphical tools are applied in practice.
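
The generic construction of quantile residuals, a probability integral transform followed by the inverse standard normal distribution function, can be sketched as follows; the toy data, the deliberately misspecified normal fit and the crude normality check are illustrative only and do not reproduce the thesis's tests, which properly account for parameter estimation.

```python
import numpy as np
from scipy import stats

# Quantile residuals: apply the fitted model's conditional CDF to each observation
# and map through the inverse standard-normal CDF. Under a correct model the
# residuals are approximately i.i.d. N(0, 1).
rng = np.random.default_rng(2)
y = rng.standard_t(df=5, size=500)                  # data actually from a t(5) distribution

# Suppose we (mis)fit a plain normal model with estimated mean and variance.
mu_hat, sigma_hat = y.mean(), y.std(ddof=1)
u = stats.norm.cdf(y, loc=mu_hat, scale=sigma_hat)  # probability integral transform
quantile_residuals = stats.norm.ppf(u)

# A crude normality check on the residuals (the thesis derives proper score tests).
print(stats.shapiro(quantile_residuals))
```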

Relevance:

30.00%

Publisher:

Abstract:

Pristine peatlands are carbon (C) accumulating wetland ecosystems sustained by a high water level (WL) and consequent anoxia that slows down decomposition. Persistent WL drawdown in response to climate and/or land-use change directly affects decomposition: increased oxygenation stimulates decomposition of the old C (peat) sequestered under the previously anoxic conditions. The responses of the new C (plant litter) in terms of quality, production and decomposability, and the consequences for the whole C cycle of peatlands, are not fully understood. WL drawdown induces changes in the plant community, resulting in a shift in dominance from Sphagnum and graminoids to shrubs and trees, and there is increasing evidence that these indirect effects of WL drawdown via changes in plant communities will have more impact on ecosystem C cycling than any direct effects. The aim of this study is to disentangle the direct and indirect effects of WL drawdown on the new C by measuring the relative importance of (1) environmental parameters (WL depth, temperature, soil chemistry) and (2) plant community composition for litter production, microbial activity, litter decomposition rates and, consequently, C accumulation. This information is crucial for modelling the C cycle under a changing climate and/or land use. The effects of WL drawdown were tested in a large-scale experiment with manipulated WL at two time scales and three nutrient regimes. Furthermore, the effect of climate on litter decomposability was tested along a north-south gradient, and a novel method for estimating litter chemical quality and decomposability was explored by combining near-infrared spectroscopy with multivariate modelling. WL drawdown had direct effects on litter quality, microbial community composition and activity, and litter decomposition rates. However, these direct effects were overruled by the indirect effects via changes in litter type composition and production. Short-term (years) responses to WL drawdown were small; in the long term (decades), dramatically increased litter inputs resulted in a large accumulation of organic matter in spite of increased decomposition rates, and the quality of the accumulated matter differed greatly from that accumulated under pristine conditions. The response of a peatland ecosystem to persistent WL drawdown was more pronounced at sites with more nutrients. The study demonstrates that the shift in vegetation composition in response to climate and/or land-use change is the main factor affecting the peatland ecosystem C cycle, and thus dynamic vegetation is a necessity in any model applied to estimate the responses of C fluxes to changes in the environment. The time scale for vegetation changes caused by hydrological changes needs to extend to decades. This study provides a grouping of litter types (plant species and parts) into functional types, based on their chemical quality and/or decomposability, that such models could utilize. Further, the results clearly show a drop in soil temperature in response to WL drawdown when an initially open peatland is converted into a forest ecosystem, an effect that has not yet been considered in existing models.
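
The NIRS-plus-multivariate-modelling idea can be sketched with partial least squares regression, a typical choice for calibrating spectra against a quality variable; the spectra, the target variable and the component count below are synthetic assumptions, not the thesis's data or model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Sketch: predict a litter quality variable from near-infrared spectra with PLS regression.
rng = np.random.default_rng(3)
n_samples, n_wavelengths = 60, 200
spectra = rng.random((n_samples, n_wavelengths))                 # synthetic NIR spectra
quality = spectra[:, 50] * 2.0 + spectra[:, 120] + rng.normal(0, 0.05, n_samples)

pls = PLSRegression(n_components=3)
pls.fit(spectra[:40], quality[:40])                              # calibration set
r2 = pls.score(spectra[40:], quality[40:])                       # validation R^2
print(f"validation R^2: {r2:.2f}")
```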

Relevance:

30.00%

Publisher:

Abstract:

The question at issue in this dissertation is the epistemic role played by ecological generalizations and models. I investigate and analyze such properties of generalizations as lawlikeness, invariance and stability, and I ask which of these properties are relevant in the context of scientific explanations. I claim that there are generalizable and reliable causal explanations in ecology based on generalizations that are invariant and stable. An invariant generalization continues to hold or be valid under a special kind of change, called an intervention, that changes the value of its variables; whether a generalization remains invariant under such interventions is the criterion that determines whether it is explanatory. A generalization can be invariant and explanatory regardless of its lawlike status. Stability concerns a different kind of generality, namely whether a generalization continues to hold across possible background conditions: the more stable a generalization, the less dependent it is on background conditions to remain true. Although it is invariance rather than stability that furnishes us with explanatory generalizations, stability has an important function in this context: it furnishes us with the extrapolability and reliability of scientific explanations. I also discuss non-empirical investigations of models that I call robustness and sensitivity analyses. By sensitivity analyses I mean investigations in which one model is studied with regard to its stability conditions by making changes and variations to the values of the model's parameters. As a general definition of robustness analyses I propose investigations of variations in the modelling assumptions of different models of the same phenomenon, in which the focus is on whether they produce similar or convergent results. Robustness and sensitivity analyses are powerful tools for studying the conditions and assumptions under which models break down, and they are especially powerful in pointing out the reasons why they do so: they show which conditions or assumptions the results of the models depend on. Key words: ecology, generalizations, invariance, lawlikeness, philosophy of science, robustness, explanation, models, stability
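
A minimal example of a sensitivity analysis in the sense defined above: vary one parameter of a simple model and check whether the qualitative result survives. The logistic-growth model and the parameter values are illustrative assumptions, not taken from the dissertation.

```python
import numpy as np

# Sensitivity analysis sketch: vary the growth rate of a logistic model and record
# whether the qualitative result (population settling near the carrying capacity) holds.
def logistic_final_size(r, K=100.0, n0=5.0, steps=5000, dt=0.01):
    n = n0
    for _ in range(steps):
        n += dt * r * n * (1.0 - n / K)   # forward-Euler step of dn/dt = r n (1 - n/K)
    return n

for r in np.linspace(0.2, 2.0, 5):
    print(f"growth rate r = {r:.1f} -> final size ~ {logistic_final_size(r):.1f}")
```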

Relevance:

30.00%

Publisher:

Abstract:

Atmospheric particles affect the radiation balance of the Earth and thus the climate. New particle formation by nucleation has been observed in diverse atmospheric conditions, but the actual formation path is still unknown. The prevailing conditions can be exploited to evaluate proposed formation mechanisms. This study aims to improve our understanding of new particle formation from the viewpoint of atmospheric conditions. The role of atmospheric conditions in particle formation was studied by atmospheric measurements, theoretical model simulations and simulations based on observations. Two separate column models were further developed for aerosol and chemical simulations. The model simulations allowed us to expand the study from local conditions to varying conditions in the atmospheric boundary layer, while the long-term measurements described in particular the characteristic mean conditions associated with new particle formation. The observations show a statistically significant difference in meteorological and background aerosol conditions between observed event and non-event days. New particle formation above the boreal forest is associated with strong convective activity, low humidity and a low condensation sink, and the probability of a particle formation event is predicted by an equation formulated for upper boundary layer conditions. The model simulations call into question whether kinetic sulphuric-acid-induced nucleation is the primary particle formation mechanism in the presence of organic vapours. At the same time, the simulations show that ignoring spatial and temporal variation in new particle formation studies may lead to faulty conclusions. On the other hand, the theoretical simulations indicate that short-scale variations in temperature and humidity are unlikely to have a significant effect on the mean binary water-sulphuric acid nucleation rate. The study emphasizes the significance of mixing and fluxes in particle formation studies, especially in the atmospheric boundary layer. The further developed models allow extensive aerosol physical and chemical studies in the future.
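
The "kinetic sulphuric-acid-induced nucleation" referred to above is commonly written as a simple power-law parameterisation of the nucleation rate; the sketch below shows the activation (linear) and kinetic (quadratic) forms with placeholder coefficients, not values fitted in this study.

```python
# Empirical sulphuric-acid nucleation parameterisations (coefficients are placeholders):
#   activation:  J = A * [H2SO4]
#   kinetic:     J = K * [H2SO4]**2
A = 1.0e-6        # s^-1, placeholder activation coefficient
K = 1.0e-12       # cm^3 s^-1, placeholder kinetic coefficient

h2so4 = 1.0e7     # cm^-3, an assumed daytime sulphuric acid concentration
j_activation = A * h2so4
j_kinetic = K * h2so4 ** 2
print(f"J_activation = {j_activation:.1f} cm^-3 s^-1, J_kinetic = {j_kinetic:.1f} cm^-3 s^-1")
```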

Relevance:

30.00%

Publisher:

Abstract:

Summary: Reduction of winter discharges using watershed models

Relevance:

30.00%

Publisher:

Abstract:

This study evaluates three different time units in option pricing: trading time, calendar time and continuous time using discrete approximations (CTDA). The CTDA model partitions the trading day into 30-minute intervals, each of which is given a weight corresponding to the historical volatility in that interval; the non-trading volatility, both overnight and weekend volatility, is included in the first interval of the trading day. The three models are tested on market prices. The results indicate that the trading-time model gives the best fit to market prices, in line with the results of previous studies but contrary to expectations under no-arbitrage option pricing, under which the option premium should reflect the cost of hedging the expected volatility during the option's remaining life. The study concludes that the historical patterns in volatility are not fully accounted for by the market; rather, the market prices options closer to trading time.
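
The CTDA construction described above can be sketched as follows: weight each 30-minute interval by its share of historical intraday variance, fold the non-trading variance into the first interval, and sum the interval weights into an effective time to expiry. The numbers are illustrative assumptions, not estimates from the study.

```python
import numpy as np

# CTDA-style effective time to expiry measured in "variance-days".
n_intervals = 13                                   # 6.5-hour trading day in 30-minute slots
intraday_var = np.full(n_intervals, 1.0)
intraday_var[0] = 2.5                              # assumed U-shaped pattern: busy open...
intraday_var[-1] = 1.8                             # ...and busy close
overnight_var = 3.0
intraday_var[0] += overnight_var                   # non-trading variance folded into first slot

weights = intraday_var / intraday_var.sum()        # each interval's share of one day's variance
intervals_to_expiry = 5 * n_intervals + 4          # e.g. 5 full trading days plus 4 intervals
full_days, extra = divmod(intervals_to_expiry, n_intervals)
effective_days = full_days + weights[:extra].sum()
print(f"effective time to expiry: {effective_days:.2f} variance-days")
```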

Relevance:

30.00%

Publisher:

Abstract:

Summary: Development of mass balance models for lakes.

Relevance:

30.00%

Publisher:

Abstract:

Summary: Snow models in watershed forecasting models

Relevance:

30.00%

Publisher:

Abstract:

Cosmopolitan ideals have been on the philosophical agenda for several millennia, but the end of the Cold War started a new discussion on state sovereignty, global democracy, the role of international law and global institutions. The Westphalian state system, in practice since the 17th century, is transforming, and the democratic deficit needs new solutions. One impetus has been the fact that in the present world no international body representing global citizens exists. In this Master's thesis, the possibility of establishing a world parliament is examined. In a case analysis, 17 models of a world parliament, drawn from two journals, a volume of essays and two other publications, are discussed. Based on general observations, the models are divided into four thematic groups and analyzed with an emphasis on feasible and probable elements. Further, a new scenario with a time frame of thirty years is proposed, based on the methodology of normative futures studies and taking special interest in causal relationships and actions leading to change. The scenario presents three gradual steps, each of which needs to be realized before a sustainable world parliament is established. The theoretical framework is based on social constructivism, and changes in international and multi-level governance are examined with the concepts of globalization, democracy and sovereignty. A feasible, desirable and credible world parliament would be constituted gradually by applying electoral, democratic and legal measures, with members drawn initially from exclusively democratic states, parliamentarians, non-governmental organizations and other groups. The parliament should be located outside the United Nations context, since a new body avoids the problem of inefficiency currently prevailing in the UN. The main objectives of the world parliament are to safeguard peace and international law and to offer legal advice in cases where international law has been violated. A feasible world parliament would be advisory in the beginning but granted legislative powers in the future. The number of members in the world parliament could also be extended, following the example of the EU enlargement process.