995 results for Stochastic Integral
Abstract:
Maize is one of the most important crops in the world. The products generated from this crop are largely used in the starch industry, the animal and human nutrition sector, and biomass energy production and refineries. For these reasons, there is much interest in estimating the potential grain yield of maize genotypes in relation to the environment in which they will be grown, as productivity directly affects agribusiness or farm profitability. Questions like these can be investigated with ecophysiological crop models, which can be organized according to different philosophies and structures. The main objective of this work is to conceptualize a stochastic model for predicting maize grain yield and productivity under different conditions of water supply while considering the uncertainties of daily climate data. Therefore, one focus is to explain the model construction in detail, and the other is to present some results in light of the philosophy adopted. A deterministic model was built as the basis for the stochastic model. The former performed well in terms of the shape of the above-ground dry matter curve over time as well as the grain yield under full and moderate water deficit conditions. Through the use of a triangular distribution for the harvest index and a bivariate normal distribution of the averaged daily solar radiation and air temperature, the stochastic model satisfactorily simulated grain productivity: the most likely grain productivity was found to be 10,604 kg ha⁻¹, very similar to the productivity simulated by the deterministic model and to that observed under the real conditions of a field experiment. © 2012 American Society of Agricultural and Biological Engineers.
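A minimal Monte Carlo sketch of the stochastic sampling step described above, assuming a triangular harvest-index distribution and a bivariate normal distribution of season-averaged radiation and temperature. All distribution parameters and the dry-matter response function below are illustrative placeholders, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim = 10_000

# Harvest index from a triangular distribution (bounds are illustrative, not from the paper).
hi = rng.triangular(left=0.40, mode=0.50, right=0.56, size=n_sim)

# Season-averaged daily solar radiation (MJ m^-2 d^-1) and air temperature (deg C)
# from a bivariate normal distribution; means and covariance are placeholders.
mean = [20.0, 24.0]
cov = [[4.0, 1.5],
       [1.5, 2.5]]
rad, temp = rng.multivariate_normal(mean, cov, size=n_sim).T

def above_ground_dry_matter(rad, temp):
    """Hypothetical response of total dry matter (kg ha^-1) to the sampled drivers."""
    return 21_000.0 * (rad / 20.0) * np.clip(1.0 - 0.02 * np.abs(temp - 24.0), 0.0, None)

yield_kg_ha = hi * above_ground_dry_matter(rad, temp)
print(f"median simulated grain yield: {np.median(yield_kg_ha):.0f} kg ha^-1")
```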
Abstract:
The third-kind linear integral equation $g(t)\,\varphi(t) = f(t) + \lambda \int_a^b K(t,t')\,\varphi(t')\,dt'$, where g(t) vanishes at a finite number of points in (a, b), is considered. In general, the Fredholm alternative theory [5] does not hold for this type of integral equation. However, by imposing certain conditions on g(t) and K(t, t′), the above integral equation was shown [1, pp. 49–57] to obey a Fredholm-type theory, except for a certain class of kernels for which the question was left open. In this note a theory is presented for the equation under consideration, with some additional assumptions on such kernels.
Abstract:
A transformation is suggested which can transform a non-Gaussian monthly hydrological time series into a Gaussian one. The suggested approach is verified with data from ten Indian rainfall time series. Incidentally, it is observed that once the deterministic trends are removed, the transformation leads to an uncorrelated process for monthly rainfall. The normalization procedure is general enough that it should also be applicable to river discharges. This is verified to a limited extent by considering data from two Indian river discharge series.
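The abstract does not spell out the transformation itself; as a hedged illustration only, the sketch below applies a rank-based normal-score (quantile) transform to a detrended synthetic monthly series, which is one common way of mapping a skewed series to an approximately Gaussian one. The series, the linear trend, and the function name are all assumptions for the example.

```python
import numpy as np
from scipy import stats

def normal_score_transform(x):
    """Rank-based mapping of a series to approximately standard-normal values."""
    x = np.asarray(x, dtype=float)
    ranks = stats.rankdata(x)                 # average ranks for ties
    p = (ranks - 0.5) / len(x)                # plotting positions in (0, 1)
    return stats.norm.ppf(p)                  # standard-normal quantiles

# Synthetic skewed "monthly rainfall" series: remove a linear trend, then normalize.
months = np.arange(240)
rain = np.random.default_rng(0).gamma(shape=2.0, scale=30.0, size=months.size)
trend = np.polyval(np.polyfit(months, rain, deg=1), months)
z = normal_score_transform(rain - trend)
print("skewness:", stats.skew(z), "excess kurtosis:", stats.kurtosis(z))   # both near 0
```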
Abstract:
Using a direct method in which the Erdélyi-Kober operator and the modified Hankel operator are applied, certain systems of two or three pairs of dual integral equations with Bessel-function kernels are solved in closed form. For certain classes of functions and orders of the Bessel functions, this approach is more appropriate and better suited than the existing methods.
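For orientation only, a single pair of dual integral equations with a Bessel-function kernel typically takes a form like the following; the symbols A(ξ), J_ν, α and f are generic, and the specific systems of two or three pairs treated here may differ in weights and intervals.

```latex
\begin{aligned}
\int_0^\infty \xi^{2\alpha} A(\xi)\, J_\nu(\xi r)\, d\xi &= f(r), & 0 < r < 1,\\
\int_0^\infty A(\xi)\, J_\nu(\xi r)\, d\xi &= 0, & r > 1.
\end{aligned}
```

The unknown function A(ξ) must satisfy both conditions simultaneously; in systems of two or three such pairs, several unknown functions are coupled through the kernels.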
Abstract:
Measurement of individual emission sources (e.g., animals or pen manure) within intensive livestock enterprises is necessary to test emission calculation protocols and to identify targets for decreased emissions. In this study, a vented, fabric-covered large chamber (4.5 × 4.5 m, 1.5 m high; encompassing greater spatial variability than a smaller chamber) in combination with on-line analysis (nitrous oxide [N2O] and methane [CH4] via Fourier Transform Infrared Spectroscopy; 1 analysis min⁻¹) was tested as a means to isolate and measure emissions from beef feedlot pen manure sources. An exponential model relating chamber concentrations to ambient gas concentrations, air exchange (e.g., due to imperfect sealing with the surface; the model becomes linear when air exchange ≈ 0 m³ s⁻¹), and chamber dimensions allowed data to be fitted with high confidence. Alternating manure source emission measurements using the large chamber and the backward Lagrangian stochastic (bLS) technique (5-mo period; bLS validated via tracer gas release, recovery 94-104%) produced comparable N2O and CH4 emission values (no significant difference at P < 0.05). Greater precision of individual measurements was achieved with the large chamber than with the bLS technique (mean ± standard error of variance components: bLS half-hour measurements, 99.5 ± 325 mg CH4 s⁻¹ and 9.26 ± 20.6 mg N2O s⁻¹; large-chamber measurements, 99.6 ± 64.2 mg CH4 s⁻¹ and 8.18 ± 0.3 mg N2O s⁻¹). The large-chamber design is suitable for measurement of emissions from manure on pen surfaces, isolating these emissions from surrounding emission sources, including enteric emissions. © American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America.
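A hedged sketch of the kind of chamber mass-balance model described above: chamber concentration rises toward a plateau set by the emission rate and the air-exchange rate, and reduces to a linear rise as the exchange rate approaches zero. The functional form, parameter values, and fitting step are a generic reconstruction with synthetic readings, not the authors' exact formulation.

```python
import numpy as np
from scipy.optimize import curve_fit

V = 4.5 * 4.5 * 1.5      # chamber volume in m^3, from the stated dimensions
C_AMB = 1.9              # ambient CH4 concentration in mg m^-3 (illustrative)

def chamber_conc(t, F, Q, C0=C_AMB):
    """Mass-balance solution of V dC/dt = F + Q (C_amb - C).

    F: source emission rate (mg s^-1); Q: air exchange (m^3 s^-1).
    As Q -> 0 this reduces to the linear model C0 + (F / V) t.
    """
    Q = max(Q, 1e-9)                              # guard against Q = 0
    C_inf = C_AMB + F / Q                         # plateau concentration
    return C_inf + (C0 - C_inf) * np.exp(-Q * t / V)

# Fit emission rate and air exchange to (synthetic) chamber readings,
# one reading per minute over 30 minutes.
t_obs = np.arange(0, 1800, 60, dtype=float)
C_obs = chamber_conc(t_obs, F=0.5, Q=0.02) + np.random.default_rng(1).normal(0, 0.1, t_obs.size)
(F_hat, Q_hat), _ = curve_fit(chamber_conc, t_obs, C_obs, p0=[0.1, 0.01])
print(f"estimated emission rate {F_hat:.3f} mg s^-1, air exchange {Q_hat:.4f} m^3 s^-1")
```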
Abstract:
In irrigated cropping, as with any other industry, profit and risk are interdependent. An increase in profit would normally coincide with an increase in risk, and this means that risk can be traded for profit. It is desirable to manage a farm so that it achieves the maximum possible profit for the desired level of risk. This paper identifies risk-efficient cropping strategies that allocate land and water between crop enterprises for a case study of an irrigated farm in Southern Queensland, Australia. This is achieved by applying stochastic frontier analysis to the output of a simulation experiment. The simulation experiment involved changes to the levels of business risk by systematically varying the crop sowing rules in a bioeconomic model of the case study farm. This model utilises the multi-field capability of the process-based Agricultural Production System Simulator (APSIM) and is parameterised using data collected from interviews with a collaborating farmer. We found that sowing rules that increased the farm area sown to cotton caused the greatest increase in risk-efficiency. Increasing maize area also improved risk-efficiency, but to a lesser extent than cotton. Sowing rules that increased the areas sown to wheat reduced the risk-efficiency of the farm business. Sowing rules were identified that had the potential to improve the expected farm profit by ca. $50,000 annually without significantly increasing risk. The concept of the shadow price of risk is discussed, and an expression is derived from the estimated frontier equation that quantifies the trade-off between profit and risk.
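As a hedged illustration of the shadow-price idea only: if one fits, say, a quadratic curve of expected profit against profit variability over the simulated strategies, the shadow price of risk at a given risk level is the slope of that curve. The data, functional form, and parameter values below are synthetic, and a full stochastic frontier estimation (with a composed error term) is more involved than this sketch.

```python
import numpy as np

# Synthetic (risk, expected-profit) pairs for candidate sowing strategies;
# in the paper these would come from the APSIM-based simulation experiment.
rng = np.random.default_rng(3)
risk = rng.uniform(20_000, 120_000, 50)                     # std. dev. of annual profit, $
profit = 150_000 + 1.2 * risk - 4e-6 * risk**2 + rng.normal(0, 5_000, risk.size)

# Fit an illustrative quadratic relationship E[profit] = c0 + c1*risk + c2*risk^2.
c2, c1, c0 = np.polyfit(risk, profit, deg=2)

def shadow_price_of_risk(sigma):
    """Marginal expected profit per extra dollar of profit risk at risk level sigma."""
    return c1 + 2.0 * c2 * sigma

print(f"shadow price of risk at sigma = $60,000: {shadow_price_of_risk(60_000):.2f}")
```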
Abstract:
The paper describes a method to determine the integral fringe order associated with a fractional fringe value that is measured using the Tardy or other similar compensators. The method makes use of two different wavelengths of light to determine the fractional fringe values. Further, it does not assume that the material fringe constant is independent of the wavelength of light used. From these measured fractional fringe values, the associated integral fringe order is determined. A method to construct a ready-reckoner table is also described, which helps to identify the integral fringe order from any two measured fractional fringe values.
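A hedged sketch of the general two-wavelength idea: for each trial integer order at the first wavelength, predict the fractional order expected at the second wavelength and keep the integers that match the second measurement. The conversion ratio is a user-supplied calibration constant that absorbs the wavelength dependence of the material fringe value; the function, its parameters, and the example numbers are assumptions, not the paper's procedure.

```python
def integral_fringe_order(f1, f2, ratio, n_max=20, tol=0.05):
    """Candidate integer orders n1 at wavelength 1 given two fractional readings.

    f1, f2 : measured fractional fringe orders at wavelengths 1 and 2
    ratio  : calibration factor converting a total order at wavelength 1 into the
             equivalent order at wavelength 2; it absorbs the wavelength dependence
             of the material fringe value and is assumed to be known
    """
    candidates = []
    for n1 in range(n_max + 1):
        total_2 = (n1 + f1) * ratio               # predicted total order at wavelength 2
        frac_2 = total_2 - int(total_2)           # its fractional part
        d = abs(frac_2 - f2)
        if min(d, 1.0 - d) < tol:                 # compare fractional parts cyclically
            candidates.append(n1)
    return candidates

# Illustrative call: lambda_1 = 546.1 nm, lambda_2 = 589.3 nm; using the bare
# wavelength ratio assumes no dispersion of birefringence, which the paper avoids.
print(integral_fringe_order(f1=0.35, f2=0.47, ratio=546.1 / 589.3))
```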
Abstract:
The monograph dissertation deals with kernel integral operators and their mapping properties on Euclidean domains. The associated kernels are weakly singular, and examples of such kernels are given by Green functions of certain elliptic partial differential equations. It is well known that mapping properties of the corresponding Green operators can be used to deduce a priori estimates for the solutions of these equations. In the dissertation, natural size and cancellation conditions are quantified for kernels defined in domains. These kernels induce integral operators which are then composed with any partial differential operator of prescribed order, depending on the size of the kernel. The main object of study in this dissertation is the boundedness properties of such compositions, and the main result is the characterization of their Lp-boundedness on suitably regular domains. In case the aforementioned kernels are defined in the whole Euclidean space, their partial derivatives of prescribed order turn out to be so-called standard kernels, which arise in connection with singular integral operators. The Lp-boundedness of singular integrals is characterized by the T1 theorem, which is originally due to David and Journé and was published in 1984 (Ann. of Math. 120). The main result of the dissertation can be interpreted as a T1 theorem for weakly singular integral operators. The dissertation also deals with special convolution-type weakly singular integral operators that are defined on Euclidean spaces.
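For orientation only (the dissertation's actual size and cancellation conditions are quantified more carefully), a weakly singular kernel of order m on a domain in R^n is often required to satisfy bounds of roughly the following form, so that taking m derivatives produces a standard Calderón-Zygmund kernel of the kind covered by the T1 theorem:

```latex
\bigl|\partial_x^{\beta} K(x,y)\bigr| \;\lesssim\; |x-y|^{\,m-n-|\beta|},
\qquad x \neq y,\quad |\beta| \le m,\quad 0 < m < n .
```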
Abstract:
We study integral representations of Gaussian processes with a pre-specified law in terms of other Gaussian processes. The dissertation consists of an introduction and four research articles. In the introduction, we provide an overview of Volterra Gaussian processes in general, and of fractional Brownian motion in particular. In the first article, we derive a finite-interval integral transformation, which changes fractional Brownian motion with a given Hurst index into fractional Brownian motion with another Hurst index. Based on this transformation, we construct a prelimit which formally converges to an analogous, infinite-interval integral transformation. In the second article, we prove this convergence rigorously and show that the infinite-interval transformation is a direct consequence of the finite-interval transformation. In the third article, we consider general Volterra Gaussian processes. We derive measure-preserving transformations of these processes and their inherently related bridges. Also, as a related result, we obtain a Fourier-Laguerre series expansion for the first Wiener chaos of a Gaussian martingale. In the fourth article, we derive a class of ergodic transformations of self-similar Volterra Gaussian processes.
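For context, a Volterra Gaussian process is one admitting a representation of the following general form with respect to a standard Brownian motion W; the kernels used in the articles are specific instances of this:

```latex
X_t = \int_0^t K(t,s)\, dW_s, \qquad t \ge 0,
\qquad \operatorname{Cov}(X_t, X_s) = \int_0^{t \wedge s} K(t,u)\,K(s,u)\, du .
```

Fractional Brownian motion with Hurst index H is the canonical example, with covariance $\tfrac12\bigl(t^{2H} + s^{2H} - |t-s|^{2H}\bigr)$ produced by a specific kernel $K_H$.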
Abstract:
Stochastic filtering is, in general, the estimation of indirectly observed states given observed data; in the setting of a probability space, the conditional expectation given the observations is one of the most accurate such estimates. In this thesis, I present the theory of filtering for two different kinds of observation processes: the first is a diffusion process, discussed in the first chapter, while the third chapter introduces the second, a counting process. Most of the fundamental results of stochastic filtering are stated in the form of equations, such as the unnormalized Zakai equation, which leads to the Kushner-Stratonovich equation. The latter, also known as the normalized Zakai equation or, equivalently, the Fujisaki-Kallianpur-Kunita (FKK) equation, shows the difference between the estimates obtained with a diffusion observation process and with a counting observation process. I also present an example for the linear Gaussian case, which is the main ingredient in building the so-called Kalman-Bucy filter. As the unnormalized and normalized Zakai equations are stated in terms of the conditional distribution, a density for this distribution is developed through these equations and stated in Kushner's theorem. However, Kushner's theorem takes the form of a stochastic partial differential equation whose solution needs to be verified for existence and uniqueness, which is covered in the second chapter.
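As a concrete illustration of the linear Gaussian case mentioned above, here is a minimal Euler-Maruyama discretization of a scalar Kalman-Bucy filter; the model coefficients and noise intensities are arbitrary example values, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(7)
dt, T = 1e-3, 10.0
n_steps = int(T / dt)

# Linear Gaussian model (illustrative coefficients):
#   signal:      dX_t = a X_t dt + b dW_t
#   observation: dY_t = c X_t dt + sqrt(r) dV_t
a, b, c, r = -0.5, 1.0, 1.0, 0.2

x = 1.0                  # true (hidden) state
m, P = 0.0, 1.0          # filter mean and variance (prior guess)
err = np.empty(n_steps)

for k in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt))
    dV = rng.normal(0.0, np.sqrt(dt))
    dY = c * x * dt + np.sqrt(r) * dV             # observation increment
    x += a * x * dt + b * dW                      # advance the true state

    # Kalman-Bucy equations: gain K = P c / r, Riccati equation for P
    gain = P * c / r
    m += a * m * dt + gain * (dY - c * m * dt)
    P += (2 * a * P + b**2 - (P * c) ** 2 / r) * dt
    err[k] = x - m

print("RMS estimation error (second half):", np.sqrt(np.mean(err[n_steps // 2:] ** 2)))
print("filter variance P at final time:", P)
```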
Abstract:
Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to use the principle in practice. One theoretically valid way is to use the normalized maximum likelihood (NML) criterion. Due to computational difficulties, this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes and Bayesian forest models. None of the presented algorithms rely on asymptotic analysis, and for the first two model classes we also discuss how to compute exact rational number solutions.
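To make the quantity concrete, the following brute-force sketch computes the NML normalizing sum for a single multinomial variable with K categories and sample size n using exact rational arithmetic. It enumerates every count vector, so it is only practical for tiny n and K; it illustrates the definition and is not one of the efficient algorithms developed in the thesis.

```python
from fractions import Fraction
from math import factorial

def compositions(n, k):
    """Yield all k-tuples of non-negative integers summing to n."""
    if k == 1:
        yield (n,)
        return
    for first in range(n + 1):
        for rest in compositions(n - first, k - 1):
            yield (first,) + rest

def multinomial_nml_normalizer(n, K):
    """Exact normalizing sum C(K, n) = sum over count vectors of
    multinomial_coefficient * prod_j (k_j / n)^{k_j}  (with 0^0 = 1)."""
    total = Fraction(0)
    for counts in compositions(n, K):
        coef = factorial(n)
        for k in counts:
            coef //= factorial(k)
        term = Fraction(coef)
        for k in counts:
            if k:
                term *= Fraction(k, n) ** k
        total += term
    return total

C = multinomial_nml_normalizer(n=10, K=3)
print(C)           # exact rational value of the normalizing sum
print(float(C))    # floating-point approximation
```

The NML probability of an observed count vector is its maximized multinomial likelihood divided by this sum, and the stochastic complexity is the corresponding negative logarithm.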
Abstract:
The Minimum Description Length (MDL) principle is a general, well-founded theoretical formalization of statistical modeling. The most important notion of MDL is the stochastic complexity, which can be interpreted as the shortest description length of a given sample of data relative to a model class. The exact definition of the stochastic complexity has gone through several evolutionary steps. The latest instantiation is based on the so-called Normalized Maximum Likelihood (NML) distribution, which has been shown to possess several important theoretical properties. However, the applications of this modern version of MDL have been quite rare because of computational complexity problems; for discrete data, the definition of NML involves an exponential sum, and in the case of continuous data, a multi-dimensional integral that is usually infeasible to evaluate or even to approximate accurately. In this doctoral dissertation, we present mathematical techniques for computing NML efficiently for some model families involving discrete data. We also show how these techniques can be used to apply MDL in two practical applications: histogram density estimation and clustering of multi-dimensional data.