997 results for Projected models


Relevance:

20.00%

Publisher:

Abstract:

Interaction between forests and the atmosphere occurs through radiative and turbulent transport. The fluxes of energy and mass between the surface and the atmosphere directly influence the properties of the lower atmosphere and, on longer time scales, the global climate. Boreal forest ecosystems are central to the global climate system and to its response to human activities, because they are significant sources and sinks of greenhouse gases and aerosol particles. The aim of the present work was to improve our understanding of the interplay between the biologically active canopy, the microenvironment and the turbulent flow; in particular, the aim was to quantify the contribution of different canopy layers to whole-forest fluxes. For this purpose, long-term micrometeorological and ecological measurements made in a Scots pine (Pinus sylvestris) forest at the SMEAR II research station in Southern Finland were used. The properties of turbulent flow are strongly modified by interaction with the canopy elements: momentum is efficiently absorbed in the upper layers of the canopy, mean wind speed and turbulence intensities decrease rapidly towards the forest floor, and the power spectra are modulated by the spectral short-cut. In the relatively open forest, diabatic stability above the canopy explained much of the variation in velocity statistics within the canopy, except in strongly stable stratification. Large eddies, ranging from tens to hundreds of meters in size, were responsible for the major fraction of turbulent transport between the forest and the atmosphere. Because of this, the eddy-covariance (EC) method proved successful for measuring energy and mass exchange inside the forest canopy, with the exception of strongly stable conditions. Vertical variations in the within-canopy microclimate, light attenuation in particular, strongly affect assimilation and transpiration rates. According to model simulations, the assimilation rate decreases downward through the canopy more rapidly than stomatal conductance (gs) and transpiration, and, consequently, the vertical source-sink distributions for carbon dioxide (CO2) and water vapor (H2O) diverge. Upscaling from shoot scale to canopy scale was found to be sensitive to the chosen description of stomatal control. The upscaled canopy-level CO2 fluxes can vary by as much as 15% and H2O fluxes by 30%, even if the gs models are calibrated against the same leaf-level dataset. A pine forest has distinct overstory and understory layers, both of which contribute significantly to canopy-scale fluxes. The forest floor vegetation and soil accounted for between 18% and 25% of evapotranspiration and between 10% and 20% of sensible heat exchange. The forest floor was also an important deposition surface for aerosol particles: between 10% and 35% of the dry deposition of particles in the 10–30 nm size range occurred there. At these northern latitudes, the seasonal cycle of climatic factors strongly influences the surface fluxes. Besides the seasonal constraints, the partitioning of available energy into sensible and latent heat depends, through stomatal control, on the physiological state of the vegetation. In spring, available energy is consumed mainly as sensible heat, and the latent heat flux peaks about two months later, in July–August. On the other hand, annual evapotranspiration remains rather stable over a range of environmental conditions, and thus any increase in accumulated radiation primarily affects the sensible heat exchange.
Finally, autumn temperature had a strong effect on ecosystem respiration, but its influence on photosynthetic CO2 uptake was restricted by low radiation levels. Therefore, the projected autumn warming in the coming decades will presumably reduce the positive effect of earlier spring recovery on the carbon uptake potential of boreal forests.
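
As context for the eddy-covariance method mentioned above: EC estimates a turbulent flux as the covariance between fluctuations of vertical wind speed and of the transported scalar over an averaging period. A minimal sketch in Python, with synthetic data and values that are our illustrative assumptions, not SMEAR II measurements:

    import numpy as np

    # Synthetic 30-min record at 10 Hz; values are illustrative, not measured.
    rng = np.random.default_rng(0)
    n = 30 * 60 * 10
    w = rng.normal(0.0, 0.3, n)                      # vertical wind speed (m/s)
    c = 400.0 + 5.0 * w + rng.normal(0.0, 1.0, n)    # CO2 density (umol/m^3)

    # Reynolds decomposition: subtract the period mean to get fluctuations.
    w_fluct = w - w.mean()
    c_fluct = c - c.mean()

    # Eddy-covariance flux estimate F = <w'c'>.
    flux = np.mean(w_fluct * c_fluct)
    print(f"CO2 flux ~ {flux:.2f} umol m-2 s-1")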

Relevance:

20.00%

Publisher:

Abstract:

This paper studies the ultrasonic wave dispersion characteristics of a nanorod. Nonlocal strain gradient models (both second and fourth order) are introduced to analyze the ultrasonic wave behavior in the nanorod. Explicit expressions are derived for the wave numbers and the wave speeds of the nanorod. The analysis shows that the fourth-order strain gradient model gives better approximations than the second-order strain gradient model for dynamic analysis. The second-order strain gradient model gives a critical wave number at a certain wave frequency, at which the wave speed is zero. A relation among the number of waves along the nanorod, the nonlocal scaling parameter (e(0)a), and the length of the nanorod is obtained from the nonlocal second-order strain gradient model. The ultrasonic wave characteristics of the nanorod obtained from the nonlocal strain gradient models are compared with those of the classical continuum model. The dynamic response behavior of nanorods is explained from both strain gradient models. The effect of e(0)a on the ultrasonic wave behavior of the nanorods is also examined. (C) 2010 American Institute of Physics.
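
For intuition on how a second-order model can produce a critical wave number, a minimal sketch follows. The functional form c(k) = c0*sqrt(1 - (e(0)a*k)^2) and all parameter values are our illustrative assumptions; the paper's explicit expressions should be consulted for the actual relations.

    import numpy as np

    c0 = 5000.0    # classical rod speed sqrt(E/rho); hypothetical value (m/s)
    e0a = 0.5e-9   # nonlocal scaling parameter e(0)a; hypothetical value (m)

    # Assumed illustrative dispersion: wave speed falls with wavenumber and
    # vanishes at a critical wavenumber k_c = 1/(e(0)a).
    k = np.linspace(1e6, 2.5e9, 500)                  # wavenumber (1/m)
    c = c0 * np.sqrt(np.clip(1.0 - (e0a * k) ** 2, 0.0, None))
    k_c = 1.0 / e0a
    print(f"critical wavenumber k_c ~ {k_c:.2e} 1/m")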

Relevance:

20.00%

Publisher:

Abstract:

We study effective models of chiral fields and the Polyakov loop, expected to describe the dynamics responsible for the phase structure of two-flavor QCD at finite temperature and density. We consider a chiral sector described using either the linear sigma model or the Nambu-Jona-Lasinio model, study the phase diagram, and determine the location of the critical point as a function of the explicit chiral symmetry breaking (i.e. the bare quark mass $m_q$). We also discuss the possible emergence of the quarkyonic phase in these models.

Relevance:

20.00%

Publisher:

Abstract:

The question at issue in this dissertation is the epistemic role played by ecological generalizations and models. I investigate and analyze such properties of generalizations as lawlikeness, invariance, and stability, and I ask which of these properties are relevant in the context of scientific explanations. I claim that there are generalizable and reliable causal explanations in ecology, given by generalizations that are invariant and stable. An invariant generalization continues to hold or be valid under a special change, called an intervention, that changes the value of its variables. Whether a generalization remains invariant under such interventions is the criterion that determines whether it is explanatory. A generalization can be invariant and explanatory regardless of its lawlike status. Stability concerns a kind of generality that has to do with a generalization's holding across possible background conditions: the more stable a generalization, the less dependent it is on background conditions to remain true. Although it is invariance rather than stability of generalizations that furnishes us with explanatory generalizations, stability has an important function in this context of explanations, namely, it furnishes us with the extrapolability and reliability of scientific explanations. I also discuss non-empirical investigations of models that I call robustness and sensitivity analyses. I call sensitivity analyses those investigations in which one model is studied with regard to its stability conditions by making changes and variations to the values of the model's parameters. As a general definition of robustness analyses, I propose investigations of variations in the modeling assumptions of different models of the same phenomenon, in which the focus is on whether or not they produce similar or convergent results. Robustness and sensitivity analyses are powerful tools for studying the conditions and assumptions under which models break down, and they are especially powerful in pointing out the reasons why they do so. They show which conditions or assumptions the results of models depend on. Key words: ecology, generalizations, invariance, lawlikeness, philosophy of science, robustness, explanation, models, stability
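
To make the notion of a sensitivity analysis concrete, a minimal sketch follows: a single ecological model (here a discrete logistic growth model, our choice for illustration, not the dissertation's) is run while one parameter is varied, to see for which values its prediction remains stable and where the model breaks down.

    def logistic_final(r, K=100.0, x0=10.0, steps=200):
        """Discrete logistic growth x <- x + r*x*(1 - x/K); returns final size."""
        x = x0
        for _ in range(steps):
            x = x + r * x * (1.0 - x / K)
        return x

    # Sensitivity analysis: vary the growth rate r; for small r the model
    # settles at the carrying capacity, for large r it loses stability.
    for r in (0.1, 0.5, 1.0, 1.8, 2.3):
        print(f"r = {r:.1f}: final population ~ {logistic_final(r):.1f}")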

Relevance:

20.00%

Publisher:

Abstract:

Summary: Reduction of winter discharges using watershed models

Relevance:

20.00%

Publisher:

Abstract:

We examine the stability of hadron resonance gas models by extending them to include undiscovered resonances through the Hagedorn formula. We find that the influence of unknown resonances on thermodynamics is large but bounded. We model the decays of resonances and investigate the ratios of particle yields in heavy-ion collisions. We find that observables such as hydrodynamics and hadron yield ratios change little upon extending the model. As a result, heavy-ion collisions at the RHIC and LHC are insensitive to a possible exponential rise in the hadronic density of states, thus increasing the stability of the predictions of hadron resonance gas models in this context. Hadron resonance gases are internally consistent up to a temperature higher than the crossover temperature in QCD, but by examining quark number susceptibilities we find that their region of applicability ends below the QCD crossover.
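
To illustrate how an exponentially rising Hagedorn spectrum feeds into hadron resonance gas thermodynamics, here is a hedged sketch of the continuum contribution to the pressure in the Boltzmann approximation (natural units); the spectrum parameters and cutoffs are our illustrative assumptions, not the paper's fit.

    import numpy as np
    from scipy.integrate import quad
    from scipy.special import kn

    T, T_H = 0.150, 0.180   # temperature and Hagedorn temperature (GeV); assumed
    m_cut = 2.0             # mass where the discrete spectrum ends (GeV); assumed

    def partial_pressure(m, g=1.0):
        """Boltzmann ideal-gas pressure of one species of mass m (GeV^4)."""
        return g * T**2 * m**2 / (2.0 * np.pi**2) * kn(2, m / T)

    # Hagedorn density of states rho(m) = A * m**(-a) * exp(m / T_H).
    A, a = 0.5, 3.0         # illustrative normalization and power
    p_cont, _ = quad(lambda m: A * m**(-a) * np.exp(m / T_H) * partial_pressure(m),
                     m_cut, 6.0)
    print(f"Hagedorn continuum pressure ~ {p_cont:.2e} GeV^4")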

Relevance:

20.00%

Publisher:

Abstract:

In this article, the problem of two Unmanned Aerial Vehicles (UAVs) cooperatively searching an unknown region is addressed. The search region is discretized into hexagonal cells, and each cell is assumed to possess an uncertainty value. The UAVs have to search these cells cooperatively, taking limited endurance, sensor, and communication range constraints into account. Due to limited endurance, the UAVs need to return to the base station for refuelling, and also need to select a base station when multiple base stations are present. This article proposes a route planning algorithm that takes endurance time constraints into account and uses game-theoretical strategies to reduce the uncertainty. The route planning algorithm selects only those cells that ensure the agent can return to one of the available bases. A set of paths is formed using these cells, from which the game-theoretical strategies select a path that yields maximum uncertainty reduction. We explore non-cooperative Nash, cooperative, and security strategies from game theory to enhance the search effectiveness. Monte Carlo simulations show the superiority of the game-theoretical strategies over a greedy strategy for paths with different look-ahead step lengths. Within the game-theoretical strategies, the non-cooperative Nash and cooperative strategies perform similarly in the ideal case, but the Nash strategy performs better than the cooperative strategy when the perceived information differs. We also propose a heuristic based on partitioning the search space into sectors to reduce computational overhead without performance degradation.
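
As a toy illustration of the path selection step described above (not the article's actual algorithm), the sketch below enumerates short look-ahead paths over cells with uncertainty values, discards paths that would strand the UAV away from the base, and greedily picks the path with the largest uncertainty reduction; the cell values and fuel budget are hypothetical.

    import itertools

    # Toy search map: cell -> uncertainty value (hypothetical numbers).
    uncertainty = {(0, 1): 0.4, (1, 0): 0.7, (1, 1): 0.2, (2, 0): 0.8, (2, 1): 0.9}
    base = (0, 0)
    fuel = 6  # moves left before the UAV must be back at the base

    def dist(p, q):
        # Simplified move cost (Manhattan distance stands in for hex distance).
        return abs(p[0] - q[0]) + abs(p[1] - q[1])

    def feasible(path):
        """A path qualifies only if the UAV can still return to the base."""
        used = sum(dist(a, b) for a, b in zip((base,) + path, path))
        return used + dist(path[-1], base) <= fuel

    # Two-step look-ahead: choose the feasible path with the largest total
    # uncertainty reduction (a greedy stand-in for the game strategies).
    paths = [p for p in itertools.permutations(uncertainty, 2) if feasible(p)]
    best = max(paths, key=lambda p: sum(uncertainty[c] for c in p))
    print("chosen path:", best)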

Relevance:

20.00%

Publisher:

Abstract:

In the thesis we consider inference for cointegration in vector autoregressive (VAR) models. The thesis consists of an introduction and four papers. The first paper proposes a new test for cointegration in VAR models that is directly based on the eigenvalues of the least squares (LS) estimate of the autoregressive matrix. In the second paper we compare a small-sample correction for the likelihood ratio (LR) test of cointegrating rank with the bootstrap. The simulation experiments show that the bootstrap works very well in practice and dominates the correction factor. The tests are applied to international stock price data, and the finite-sample performance of the tests is investigated by simulating the data. The third paper studies the demand for money in Sweden 1970–2000 using the I(2) model. In the fourth paper we re-examine the evidence of cointegration between international stock prices. The paper shows that some of the previous empirical results can be explained by the small-sample bias and size distortion of Johansen's LR tests for cointegration. In all papers we work with two data sets. The first is a Swedish money demand data set with observations on the money stock, the consumer price index, gross domestic product (GDP), the short-term interest rate and the long-term interest rate. The data are quarterly and the sample period is 1970(1)–2000(1). The second data set consists of month-end stock market index observations for Finland, France, Germany, Sweden, the United Kingdom and the United States from 1980(1) to 1997(2). Both data sets are typical of the sample sizes encountered in economic data, and the applications illustrate the usefulness of the models and tests discussed in the thesis.
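
The thesis's eigenvalue-based test and bootstrap correction are not available in standard libraries, but Johansen's LR trace test, whose small-sample behaviour the thesis examines, can be run with statsmodels; the data below are a synthetic stand-in, not the thesis's data sets.

    import numpy as np
    from statsmodels.tsa.vector_ar.vecm import coint_johansen

    # Two series sharing one common stochastic trend, so the cointegrating
    # rank should be found to be one (synthetic, illustrative only).
    rng = np.random.default_rng(0)
    trend = np.cumsum(rng.normal(size=500))
    data = np.column_stack([trend + rng.normal(size=500),
                            0.5 * trend + rng.normal(size=500)])

    # Johansen LR (trace) test; det_order=0 includes a constant,
    # k_ar_diff=1 uses one lagged difference.
    res = coint_johansen(data, det_order=0, k_ar_diff=1)
    print("trace statistics:   ", res.lr1)
    print("95% critical values:", res.cvt[:, 1])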

Relevance:

20.00%

Publisher:

Abstract:

Non-Gaussianity of signals/noise often results in significant performance degradation for systems designed under the Gaussian assumption, so non-Gaussian signals/noise require a different modelling and processing approach. In this paper, we discuss a new Bayesian estimation technique for non-Gaussian signals corrupted by colored non-Gaussian noise. The method is based on using zero-mean finite Gaussian mixture models (GMMs) for the signal and the noise. The estimation is done using an adaptive non-causal nonlinear filtering technique. The method involves deriving an estimator in terms of the GMM parameters, which are in turn estimated using the EM algorithm. The proposed filter is of finite length and offers computational feasibility. The simulations show that the proposed method gives a significant improvement over the linear filter for a wide variety of noise conditions, including impulsive noise. We also claim that estimating the signal using its correlation with both past and future samples leads to a reduced mean squared error compared to estimation based on past samples only.
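
A minimal sketch of the modelling step, fitting a finite Gaussian mixture to non-Gaussian observations with EM (here via scikit-learn); the data are synthetic, and the paper's adaptive non-causal nonlinear filter built on top of the fitted parameters is not reproduced.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Synthetic impulsive (two-component) observations: mostly small noise,
    # occasionally large outliers. Values are illustrative.
    rng = np.random.default_rng(0)
    pick = rng.random(2000) < 0.9
    x = np.where(pick, rng.normal(0.0, 0.5, 2000), rng.normal(0.0, 3.0, 2000))

    # EM fit of a two-component GMM; the paper constrains the component
    # means to zero by construction, which this sketch only approximates.
    gmm = GaussianMixture(n_components=2, random_state=0).fit(x.reshape(-1, 1))
    print("weights:  ", np.round(gmm.weights_, 3))
    print("variances:", np.round(gmm.covariances_.ravel(), 3))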

Relevance:

20.00%

Publisher:

Abstract:

This study examined the Greeks of options and the trading results of delta hedging strategies under three different time units, or option-pricing models. These time units were calendar time, trading time, and continuous time using discrete approximation (CTDA) time. The CTDA time model is a pricing model that, among other things, accounts for intraday and weekend patterns in volatility. For the CTDA time model, some additional theta measures, believed to be usable in trading, were developed. The study appears to verify that the Greeks differ across time units. It also revealed that these differences influence the delta hedging of options or portfolios. Although it is difficult to say which of the time models is the most usable, as this depends largely on the trader's view of the passing of time, on market conditions and on the portfolio, the CTDA time model can be viewed as an attractive alternative.
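
To see how the choice of time unit alone moves the Greeks, here is a minimal Black-Scholes sketch comparing a calendar-time and a trading-time measure of time to expiry; all numbers are illustrative assumptions, and the study's CTDA theta measures are not reproduced.

    import numpy as np
    from scipy.stats import norm

    def call_delta(S, K, r, sigma, T):
        """Black-Scholes call delta; T = time to expiry in years."""
        d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
        return norm.cdf(d1)

    S, K, r, sigma = 100.0, 105.0, 0.02, 0.25
    T_calendar = 30 / 365.0   # 30 calendar days on a 365-day clock
    T_trading = 21 / 252.0    # ~21 trading days in the same span, 252-day clock

    print("delta, calendar time:", round(call_delta(S, K, r, sigma, T_calendar), 4))
    print("delta, trading time: ", round(call_delta(S, K, r, sigma, T_trading), 4))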

Relevance:

20.00%

Publisher:

Abstract:

This paper examines how volatility in financial markets can best be modeled. It investigates how well models for volatility, both linear and nonlinear, absorb skewness and kurtosis. The examination is done on the Nordic stock markets, including Finland, Sweden, Norway and Denmark. Different linear and nonlinear models are applied, and the results indicate that a linear model can almost always be used for modeling the series under investigation, even though nonlinear models perform slightly better in some cases. These results indicate that the markets under study exhibit asymmetric patterns only to a certain degree. Negative shocks generally have a more prominent effect on the markets, but these effects are not particularly strong. However, in terms of absorbing skewness and kurtosis, nonlinear models outperform linear ones.
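
A hedged sketch of the kind of comparison described, using the Python arch package: a symmetric (linear) GARCH(1,1) against an asymmetric GJR-GARCH, in which negative shocks are allowed a larger impact on volatility; the return series is synthetic, standing in for the Nordic index returns.

    import numpy as np
    from arch import arch_model

    # Synthetic heavy-tailed returns (stand-in for Nordic index data).
    rng = np.random.default_rng(0)
    returns = 0.8 * rng.standard_t(df=5, size=1500)

    # Symmetric GARCH(1,1) vs. asymmetric GJR-GARCH (o=1 adds the
    # leverage term that reacts to the sign of the shock).
    garch = arch_model(returns, vol="GARCH", p=1, q=1).fit(disp="off")
    gjr = arch_model(returns, vol="GARCH", p=1, o=1, q=1).fit(disp="off")
    print("GARCH AIC:", round(garch.aic, 1), "| GJR AIC:", round(gjr.aic, 1))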

Relevance:

20.00%

Publisher:

Abstract:

This study evaluates three different time units in option pricing: trading time, calendar time and continuous time using discrete approximations (CTDA). The CTDA-time model partitions the trading day into 30-minute intervals, where each interval is given a weight corresponding to the historical volatility in that interval. Furthermore, the non-trading volatility, both overnight and weekend volatility, is included in the first interval of the trading day in the CTDA model. The three models are tested on market prices. The results indicate that the trading-time model gives the best fit to market prices, in line with the results of previous studies but contrary to expectations under no-arbitrage option pricing. Under no-arbitrage pricing, the option premium should reflect the cost of hedging the expected volatility during the option's remaining life. The study concludes that the historical patterns in volatility are not fully accounted for by the market; rather, the market prices options closer to trading time.
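
The CTDA clock described above can be made concrete with a small sketch: each of the thirteen half-hour intervals of a 6.5-hour trading day gets a weight proportional to its historical volatility, with the first interval also carrying the overnight/weekend variance. The weights below are invented for illustration.

    import numpy as np

    # Hypothetical variance weights for 13 half-hour intervals; the first
    # also absorbs overnight/weekend volatility, as in the CTDA model.
    w = np.array([3.0, 1.4, 1.1, 1.0, 0.9, 0.8, 0.8, 0.8, 0.9, 1.0, 1.1, 1.3, 1.6])
    w = w / w.sum()   # a full trading day contributes 1 "day" of variance

    def ctda_days(full_days_left, intervals_done_today):
        """Time to expiry in CTDA days: whole days plus the rest of today."""
        return full_days_left + w[intervals_done_today:].sum()

    print(round(ctda_days(5, 4), 3), "CTDA days to expiry")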

Relevance:

20.00%

Publisher:

Abstract:

We provide a survey of some of our recent results ([9], [13], [4], [6], [7]) on the analytical performance modeling of IEEE 802.11 wireless local area networks (WLANs). We first present extensions of the decoupling approach of Bianchi ([1]) to the saturation analysis of IEEE 802.11e networks with multiple traffic classes. We have found that even when analysing WLANs with unsaturated nodes, the following state-dependent service model works well: when a certain set of nodes is nonempty, their channel attempt behaviour is obtained from the corresponding fixed-point analysis of the saturated system. We present our experiences in using this approximation to model multimedia traffic over an IEEE 802.11e network using the enhanced DCF channel access (EDCA) mechanism. We have found that we can model TCP-controlled file transfers, VoIP packet telephony, and streaming video in the IEEE 802.11e setting with this simple approximation.
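
The decoupling approach couples the per-node attempt probability tau with the conditional collision probability p. A minimal sketch of Bianchi's saturated single-class fixed point, with illustrative parameters (the 802.11e multi-class extension in the surveyed papers is more involved):

    # Bianchi fixed point for saturated 802.11 DCF.
    # n = nodes, W = minimum contention window, m = backoff stages (illustrative).
    n, W, m = 10, 32, 5

    tau = 0.1
    for _ in range(2000):
        p = 1.0 - (1.0 - tau) ** (n - 1)
        tau_new = (2.0 * (1.0 - 2.0 * p)) / (
            (1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m))
        tau = 0.5 * tau + 0.5 * tau_new   # damping helps convergence
    print(f"tau ~ {tau:.4f}, p ~ {p:.4f}")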

Relevance:

20.00%

Publisher:

Abstract:

The increasing use of 3D models of the human face in face recognition systems, user interfaces, graphics, gaming and the like has made it an area of active study. The majority of 3D sensors rely on color-coded light projection for 3D estimation. Such systems fail to generate any response in regions covered by facial hair (such as a beard or mustache), and hence generate holes in the model which have to be filled manually later on. We propose the use of wavelet-transform-based analysis to extract the 3D model of human faces from a sinusoidal white-light fringe-projected image. Our method requires only a single image as input. The method is robust to texture variations on the face due to the space-frequency localization property of the wavelet transform. It can generate models with pixel-level refinement, as the phase is estimated for each pixel by a continuous wavelet transform. In cases of sparse facial hair, the shape distortions due to hair can be filtered out, yielding an estimate of the underlying face. We use a low-pass filtering approach to estimate the face texture from the same image. We demonstrate the method on several human faces, both with and without facial hair. Unseen views of the face are generated by texture mapping onto different rotations of the obtained 3D structure. To the best of our knowledge, this is the first attempt to estimate 3D structure for human faces in the presence of facial hair such as beards and mustaches without generating holes in those areas.
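
A one-dimensional sketch of the phase-extraction idea: the phase of complex-Morlet CWT coefficients at the carrier scale recovers, pixel by pixel, the phase modulation that depth imposes on the fringe. The profile, carrier period and wavelet choice are our illustrative assumptions (PyWavelets is used here, not the paper's implementation).

    import numpy as np
    import pywt

    # One scan line of a fringe image: a cosine carrier whose phase is
    # modulated by a toy depth profile (synthetic, illustrative values).
    x = np.arange(512)
    depth_phase = 1.5 * np.exp(-((x - 256) / 60.0) ** 2)
    fringe = 128 + 100 * np.cos(2 * np.pi * x / 16 + depth_phase)

    # Complex Morlet CWT at the scale matching the 16-pixel carrier; the
    # unwrapped coefficient phase contains carrier ramp + depth phase.
    coeffs, _ = pywt.cwt(fringe, [16.0], "cmor1.5-1.0")
    phase = np.unwrap(np.angle(coeffs[0]))
    depth_est = phase - 2 * np.pi * x / 16.0   # remove the carrier ramp
    depth_est -= depth_est[0]                  # anchor the arbitrary offset
    print("peak of recovered phase bump ~", round(float(np.max(np.abs(depth_est))), 2), "rad")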

Relevance:

20.00%

Publisher:

Abstract:

Processor architects face the challenging task of evaluating a large design space consisting of several interacting parameters and optimizations. In order to assist architects in making crucial design decisions, we build linear regression models that relate processor performance to micro-architectural parameters, using simulation-based experiments. We obtain good approximate models through an iterative process in which Akaike's information criterion is used to extract a good linear model from a small set of simulations, and limited further simulation is guided by the model using D-optimal experimental designs. The iterative process is repeated until the desired error bounds are achieved. We used this procedure to establish the relationship of the CPI performance response to 26 key micro-architectural parameters using a detailed cycle-by-cycle superscalar processor simulator. The resulting models provide a significance ordering of all micro-architectural parameters and their interactions, and explain the performance variations of micro-architectural techniques.
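
A minimal sketch of one iteration of the AIC-guided selection described above, using ordinary least squares from statsmodels; the predictors, data and candidate set are invented stand-ins for the 26 micro-architectural parameters, and the D-optimal design step is not shown.

    import numpy as np
    import statsmodels.api as sm

    # Synthetic stand-in for simulator data: CPI vs. three scaled parameters.
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(40, 3))       # e.g. issue width, ROB size, L2 size
    cpi = 1.0 + 0.8 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(0.0, 0.05, 40)

    # Fit nested candidate models and keep the lowest-AIC one; in the paper
    # this choice then guides which further simulations to run.
    candidates = {"p1": X[:, :1], "p1+p2": X[:, :2], "p1+p2+p3": X}
    fits = {k: sm.OLS(cpi, sm.add_constant(v)).fit() for k, v in candidates.items()}
    best = min(fits, key=lambda k: fits[k].aic)
    print("selected model:", best, "| AIC:", round(fits[best].aic, 1))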