868 results for kernel estimators
Abstract:
The functional and technological properties of wheat depend on its chemical composition, which, together with structural and microscopic characteristics, defines flour quality. The aim of the present study was to characterize four Brazilian wheat cultivars (BRS Louro, BRS Timbauva, BRS Guamirim and BRS Pardela) and their respective flours in order to indicate specific technological applications. Kernels were analyzed for test weight, thousand kernel weight, hardness, moisture, and water activity. Flours were analyzed for water activity, color, proximate composition, total dietary fiber, amylose content and identification of high molecular weight glutenins. The rheological properties of the flours were estimated by farinography, extensography, falling number, rapid visco amylography, and Glutomatic and Glutork equipment. Baking tests and scanning electron microscopy were also performed. The data were subjected to analysis of variance and principal component analysis. BRS Timbauva and BRS Guamirim did not present results that pointed to a specific technological application. BRS Louro, on the other hand, presented characteristics suitable for products requiring low dough strength, such as cakes, pies and biscuits, while BRS Pardela appeared suitable for bread and pasta products.
Abstract:
The hydration kinetics of the transgenic corn types flint DKB 245PRO, semi-flint DKB 390PRO, and dent DKB 240PRO were studied at temperatures of 30, 40, 50, and 67 °C. The concentrated (lumped) parameters model was used and fitted the experimental data well for all three cultivars. The chemical composition of the corn kernels was also evaluated. The corn cultivar influenced the initial absorption rate and the equilibrium water concentration, and the dent corn absorbed more water than the other cultivars at all four temperatures analyzed. The effect of hydration on kernel texture was also studied; no significant difference in the required deformation force was observed among the three corn types at longer hydration periods.
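The concentrated (lumped) parameters model mentioned above reduces hydration to a single first-order balance, dX/dt = k(Xeq − X). A minimal sketch of fitting its rate constant by log-linearization; the data below are synthetic and purely illustrative, not values from the paper:

```python
import numpy as np

def lumped_hydration(t, x0, xeq, k):
    """Lumped-parameter hydration model:
    dX/dt = k*(Xeq - X)  =>  X(t) = Xeq - (Xeq - X0)*exp(-k*t)."""
    return xeq - (xeq - x0) * np.exp(-k * t)

def fit_rate_constant(t, x, xeq):
    """Estimate k from the linearized form ln(Xeq - X) = const - k*t."""
    slope, _ = np.polyfit(t, np.log(xeq - x), 1)
    return -slope

# Synthetic hydration curve (illustrative values only)
t = np.linspace(0, 10, 50)                    # hours
true_k = 0.35                                 # 1/h
x = lumped_hydration(t, 0.12, 0.52, true_k)   # moisture fraction
k_hat = fit_rate_constant(t, x, 0.52)
```

On real data one would fit Xeq jointly with k by nonlinear least squares; the linearized fit above is the simplest version.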
Abstract:
Genotype (G), environment (E) and their interaction (GEI) play an important role in the final expression of grain yield and quality attributes. A multi-environment wheat trial was conducted to evaluate the magnitude of G, E and GEI effects on grain yield and quality of wheat genotypes at three rainfed locations (hereafter environments) of the Central Anatolian Plateau of Turkey during the 2012-2013 cropping season. Grain yield (GY), test weight (TW), protein content (PC), wet gluten content (WGC), grain hardness (GH), thousand kernel weight (TKW) and Zeleny sedimentation volume (ZSV) were determined. Allelic variations of high and low molecular weight glutenin subunits (HMW-GS and LMW-GS) and the 1B/1R translocation were determined in all genotypes evaluated. The HMW-Glu-1 subunits 17+18 and 5+10 and the LMW-Glu-3 b, b, b alleles corresponded to genotypes possessing medium to good quality attributes. Large variability was found among most of the quality attributes evaluated; wider ranges of quality traits were observed across environments than among genotypes. The importance of growing-environment effects on grain quality was confirmed, suggesting that breeders' quality objectives should be adapted to the targeted environments.
Abstract:
The Atlantic Forest contains native fruit species, consumed fresh and processed, that make an important contribution to the food sovereignty of the families that consume them. This study examined the physical and physicochemical characteristics, proximate composition, and concentrations of carotenoids, vitamin C, vitamin E and minerals in the pulp and kernels of fruits of licuri (Syagrus coronata (Mart.) Becc.). Titratable acidity was analyzed by volumetric neutralization, soluble solids by refractometry, proteins by the micro-Kjeldahl method, lipids by gravimetry using Soxhlet extraction, dietary fiber by non-enzymatic gravimetry, carotenoids and vitamin C by HPLC-DAD, vitamin E by HPLC-fluorescence, and minerals by ICP-AES. The pulp was a source of Zn (0.95 mg 100 g−1), a good source of fiber (6.15 g 100 g−1), and an excellent source of provitamin A (758.75 RAE 100 g−1), Cu (0.69 mg 100 g−1), Fe (3.81 mg 100 g−1), Mn (3.40 mg 100 g−1) and Mo (0.06 mg 100 g−1). The kernels were a source of Fe (3.36 mg 100 g−1) and an excellent source of Mn (6.14 mg 100 g−1), Cu (0.97 mg 100 g−1) and Mo (0.07 mg 100 g−1). The nutritional value and wide availability of licuri fruit make it an important resource for reducing food insecurity and improving the nutrition of the rural population and other individuals who have access to it.
Abstract:
There are many opportunities to utilize coconut in Nzema to support farmers. Coconut oil, mainly used for food preparation in Nzema, could also be used as fuel to help overcome the energy crisis in Ghana; at present it is used neither for transportation nor for electricity generation there. A small share of the waste husks and shells is used as fuel in homes for heating, but a greater amount is left to rot or is burned on the coconut plantations. In addition, a portion of the granulated coconut kernel is sometimes used as pig feed, while the rest is left as waste at the oil processing sites. In this thesis, the author identifies alternative uses of coconut, for instance the use of coconut husks and shells for charcoal production and the use of coconut trunks as construction material. It is envisaged that exploring these alternatives will not only reduce carbon emissions in the country but will also contribute significantly to the sustainability of the local agro-industry.
Abstract:
This work investigates theoretical properties of symmetric and anti-symmetric kernels. The first chapters give an overview of the theory of kernels used in supervised machine learning. The central focus is the regularized least squares algorithm, which is motivated as a problem of function reconstruction through an abstract inverse problem. A brief review of reproducing kernel Hilbert spaces shows how kernels define an implicit hypothesis space with multiple equivalent characterizations, and how this space may be modified by incorporating prior knowledge. Mathematical results on the abstract inverse problem, in particular spectral properties, the pseudoinverse and regularization, are recollected and then specialized to kernels. Symmetric and anti-symmetric kernels are applied to relation learning problems that incorporate the prior knowledge that the relation is symmetric or anti-symmetric, respectively. Theoretical properties of these kernels are proved in a draft on which this thesis is based and are comprehensively referenced here. These proofs show that such kernels can be guaranteed to learn only symmetric or anti-symmetric relations, and that they can learn any such relation relative to the original kernel modified to learn only symmetric or anti-symmetric parts. Further results establish spectral properties of these kernels, the central result being a simple inequality for the trace of the estimator, also called the effective dimension; this quantity is used in learning bounds to guarantee smaller variance.
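The (anti-)symmetrization of a pairwise kernel described above is commonly realized by the construction K±((a,b),(c,d)) = ½[k(a,c)k(b,d) ± k(a,d)k(b,c)]; a sketch of that construction, with a Gaussian base kernel chosen purely for illustration:

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    """Gaussian base kernel on vectors."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def k_sym(a, b, c, d, base=rbf):
    """Symmetric pairwise kernel: invariant when either pair is swapped,
    so every function it represents satisfies f(a, b) = f(b, a)."""
    return 0.5 * (base(a, c) * base(b, d) + base(a, d) * base(b, c))

def k_anti(a, b, c, d, base=rbf):
    """Anti-symmetric pairwise kernel: changes sign when a pair is swapped,
    so every represented function satisfies f(a, b) = -f(b, a)."""
    return 0.5 * (base(a, c) * base(b, d) - base(a, d) * base(b, c))

rng = np.random.default_rng(0)
a, b, c, d = rng.standard_normal((4, 3))
```

A regularized least squares learner that uses k_sym (or k_anti) in place of the base product kernel is then restricted to the symmetric (or anti-symmetric) part of the hypothesis space.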
Abstract:
Many, if not all, aspects of our everyday lives involve computers and control: microprocessors and wireless communications are everywhere. Embedded systems are an attractive field because they combine three key factors: small size, low power consumption and high computing capability. The aim of this thesis is to study how Linux communicates with the hardware, to answer the question of whether an operating system like Debian can be used for embedded systems, and finally to build a mechatronic real-time application. The thesis presents Linux and the Xenomai real-time patch, and analyzes the bootloader and communication with the hardware. The BeagleBone evaluation board is presented, along with an application project consisting of a robot cart with a driver circuit, a line sensor reading a black line, and two XBee antennas; it makes use of Xenomai threads and the real-time kernel. According to the results obtained, Linux is able to operate as a real-time operating system. Directions for future research in the area of embedded Linux are also discussed.
Abstract:
The moisture content of peanut kernels (Arachis hypogaea L.) at digging ranges from 30 to 50% on a wet basis (w.b.). The seed moisture content must be reduced to 10.5% or below before seeds can be graded and marketed. After digging, peanuts are cured in windrows for two to five days and then mechanically separated from the vine. Heated air is used to further dry the peanuts from approximately 18 to 10% moisture content w.b. Drying is required to maintain peanut seed and grain quality. Traditional dryers pass a high-temperature, high-humidity air stream through the seed mass; the drying time is long because the system is inefficient, and the high temperature increases the risk of thermal damage to the kernels. A newer technology, heat pipe technology (HPT), has the unique feature of removing moisture from the air stream before it is heated and passed through the seed. A study was conducted to evaluate the performance of the HPT system in drying peanut seed. The seeds inside the shells were dried from 17.4 to 7.3% in 14 hours and 11 minutes, a moisture removal rate of 0.71 percentage points per hour. This drying process caused no reduction in seed quality as measured by standard germination, accelerated ageing and field emergence tests. It was concluded that the HPT system is a promising technology for drying peanut seed when efficiency and maintenance of physiological quality are desired.
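The quoted removal rate follows directly from the figures reported in the abstract; a quick arithmetic check:

```python
# Figures reported in the abstract
start, end = 17.4, 7.3        # % moisture content, wet basis
hours = 14 + 11 / 60          # drying time of 14 h 11 min
rate = (start - end) / hours  # percentage points of moisture per hour
```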
Stochastic particle models: mean reversion and Burgers dynamics. An application to commodity markets
Abstract:
The aim of this study is to propose a stochastic model for commodity markets linked with the Burgers equation from fluid dynamics. We construct a stochastic particle method for commodity markets, in which particles represent market participants. A discontinuity is included in the model through an interaction kernel equal to the Heaviside function, and its link with the Burgers equation is given. The Burgers equation and the connection of this model with stochastic differential equations are also studied. Further, based on the law of large numbers, we prove the convergence, for large N, of a system of stochastic differential equations describing the evolution of the prices of N traders to a deterministic partial differential equation of Burgers type. Numerical experiments highlight the success of the new proposal in modeling some commodity markets, confirmed by the ability of the model to reproduce price spikes when their effects occur over a sufficiently long period of time.
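A sketch of such a particle system (not the authors' exact specification): an Euler-Maruyama scheme in which each particle's drift is the Heaviside kernel averaged over all particles, i.e. the empirical distribution function evaluated at the particle's own position, which is what links the system to a Burgers-type PDE for large N:

```python
import numpy as np

def simulate_particles(n=200, steps=500, dt=0.01, sigma=0.1, seed=1):
    """Euler-Maruyama for dX_i = (1/N) sum_j H(X_i - X_j) dt + sigma dW_i,
    with H the Heaviside interaction kernel. The drift at X_i is simply the
    empirical CDF of the particle cloud evaluated at X_i."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)  # initial particle positions ("prices")
    for _ in range(steps):
        drift = (x[:, None] >= x[None, :]).mean(axis=1)  # empirical CDF
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)
    return x
```

Since the drift lies between 0 and 1 with average about 1/2, the whole cloud migrates at roughly half a unit per unit of time while its shape evolves as in the viscous Burgers equation; all parameter values above are illustrative.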
Abstract:
Four problems of physical interest have been solved in this thesis using the path integral formalism. Using the trigonometric expansion method of Burton and de Borde (1955), we found the kernel for two interacting one-dimensional oscillators. The result is the same as one would obtain using a normal coordinate transformation. We next introduced the method of Papadopoulos (1969), a systematic perturbation-type method specifically geared to finding the partition function Z, or equivalently the Helmholtz free energy F, of a system of interacting oscillators. We applied this method to the next three problems considered. First, by summing the perturbation expansion, we found F for a system of N interacting Einstein oscillators. The result obtained is the same as the usual result obtained by Shukla and Muller (1972). Next, we found F to O(λ²), where λ is the usual Van Hove ordering parameter. The results obtained are the same as those of Shukla and Cowley (1971), who used a diagrammatic procedure and did the necessary sums in Fourier space; we performed the work in temperature space. Finally, slightly modifying the method of Papadopoulos, we found the finite-temperature expressions for the Debye-Waller factor in Bravais lattices, to O(λ²) and O(|K|⁴), where K is the scattering vector. The high-temperature limits of the expressions obtained here are in complete agreement with the classical results of Maradudin and Flinn (1963).
Abstract:
The infinitesimal differential quantum Monte Carlo (QMC) technique is used to estimate electrostatic polarizabilities of the H and He atoms up to sixth order in the electric field perturbation. All 542 different QMC estimators of the nonzero atomic polarizabilities are derived and used in order to decrease the statistical error and to obtain the maximum efficiency of the simulations. We are confident that the estimates are "exact" (free of systematic error): the two atoms are nodeless systems, hence no fixed-node error is introduced. Furthermore, we develop and use techniques which eliminate the systematic error inherent in extrapolating our results to zero time-step and large stack-size. The QMC results are consistent with published accurate values obtained using perturbation methods. The precision is found to be related to the number of perturbations, varying from 2 to 4 significant digits.
Abstract:
Latent variable models in finance originate both from asset pricing theory and time series analysis. These two strands of literature appeal to two different concepts of latent structures, which are both useful to reduce the dimension of a statistical model specified for a multivariate time series of asset prices. In the CAPM or APT beta pricing models, the dimension reduction is cross-sectional in nature, while in time-series state-space models, dimension is reduced longitudinally by assuming conditional independence between consecutive returns, given a small number of state variables. In this paper, we use the concept of Stochastic Discount Factor (SDF) or pricing kernel as a unifying principle to integrate these two concepts of latent variables. Beta pricing relations amount to characterizing the factors as a basis of a vector space for the SDF. The coefficients of the SDF with respect to the factors are specified as deterministic functions of some state variables which summarize their dynamics. In beta pricing models, it is often said that only the factorial risk is compensated, since the remaining idiosyncratic risk is diversifiable. Implicitly, this argument can be interpreted as a conditional cross-sectional factor structure, that is, a conditional independence between contemporaneous returns of a large number of assets, given a small number of factors, as in standard Factor Analysis. We provide this unifying analysis in the context of conditional equilibrium beta pricing as well as asset pricing with stochastic volatility, stochastic interest rates and other state variables. We address the general issue of econometric specifications of dynamic asset pricing models, which cover the modern literature on conditionally heteroskedastic factor models as well as equilibrium-based asset pricing models with an intertemporal specification of preferences and market fundamentals.
We interpret various instantaneous causality relationships between state variables and market fundamentals as leverage effects and discuss their central role relative to the validity of standard CAPM-like stock pricing and preference-free option pricing.
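In standard notation, the SDF spanning and beta-pricing duality sketched above can be written as follows (a schematic rendering, not the authors' exact formulation):

```latex
% SDF spanned by factors F_k, with coefficients driven by state variables s_t
m_{t+1} = \sum_{k=1}^{K} \lambda_k(s_t)\, F_{k,t+1},
\qquad
\mathbb{E}\left[ m_{t+1} R_{i,t+1} \mid s_t \right] = 1,
% which implies the conditional beta-pricing representation
\mathbb{E}\left[ R_{i,t+1} \mid s_t \right] - r_f(s_t)
  = \sum_{k=1}^{K} \beta_{ik}(s_t)\, \pi_k(s_t),
% where the \pi_k(s_t) are factor risk premia and idiosyncratic risk is unpriced.
```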
Abstract:
This paper studies seemingly unrelated linear models with integrated regressors and stationary errors. By adding leads and lags of the first differences of the regressors and estimating this augmented dynamic regression model by feasible generalized least squares using the long-run covariance matrix, we obtain an efficient estimator of the cointegrating vector that has a limiting mixed normal distribution. Simulation results suggest that this new estimator compares favorably with others already proposed in the literature. We apply this new estimator to testing purchasing power parity (PPP) among the G-7 countries. The test based on the efficient estimates rejects the PPP hypothesis for most countries.
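A minimal numerical sketch of the leads-and-lags augmentation described above, using plain OLS rather than the paper's feasible GLS with a long-run covariance matrix, and with all series simulated:

```python
import numpy as np

rng = np.random.default_rng(42)
T, beta, k = 500, 2.0, 2   # sample size, true cointegrating coef, leads/lags

x = np.cumsum(rng.standard_normal(T))   # integrated (random-walk) regressor
u = rng.standard_normal(T)
for t in range(1, T):                   # stationary AR(1) cointegration error
    u[t] += 0.5 * u[t - 1]
y = beta * x + u

dx = np.diff(x, prepend=x[0])
# augmented regression: y_t on x_t plus leads and lags of the differences,
# which soak up any correlation between the error and the regressor's shocks
lead_lags = np.column_stack([dx[k + j:T - k + j] for j in range(-k, k + 1)])
X = np.column_stack([x[k:T - k], np.ones(T - 2 * k), lead_lags])
beta_hat = np.linalg.lstsq(X, y[k:T - k], rcond=None)[0][0]
```

Because x is integrated, the estimate converges at rate T rather than the usual root-T, so even modest samples pin the cointegrating coefficient down tightly.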
Abstract:
In this paper, we study several tests for the equality of two unknown distributions. Two are based on empirical distribution functions, three others on nonparametric probability density estimates, and the last ones on differences between sample moments. We suggest controlling the size of such tests (under nonparametric assumptions) by using permutational versions of the tests jointly with the method of Monte Carlo tests, properly adjusted to deal with discrete distributions. We also propose a combined test procedure whose level is again perfectly controlled through the Monte Carlo test technique and which has better power properties than the individual tests being combined. Finally, in a simulation experiment, we show that the suggested technique provides perfect control of test size and that the new tests proposed can yield sizeable power improvements.
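A sketch of the permutational approach described above, using a two-sample Kolmogorov-Smirnov statistic as an illustrative choice (the paper's specific tests and its Monte Carlo adjustment for discrete distributions are not reproduced here):

```python
import numpy as np

def ks_stat(x, y):
    """Two-sample Kolmogorov-Smirnov statistic from the empirical CDFs."""
    grid = np.sort(np.concatenate([x, y]))
    fx = np.searchsorted(np.sort(x), grid, side="right") / len(x)
    fy = np.searchsorted(np.sort(y), grid, side="right") / len(y)
    return np.max(np.abs(fx - fy))

def permutation_pvalue(x, y, stat=ks_stat, n_perm=999, seed=0):
    """Permutational p-value: re-randomize the group labels and count
    statistics at least as extreme as the observed one. The +1 terms make
    the test exact under exchangeability of the pooled sample."""
    rng = np.random.default_rng(seed)
    observed = stat(x, y)
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        if stat(perm[:len(x)], perm[len(x):]) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)
```

Any of the statistics discussed in the abstract (EDF-based, density-based, or moment-based) can be dropped in for `stat` without changing the size-control argument.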
Abstract:
The focus of the paper is the nonparametric estimation of an instrumental regression function P defined by conditional moment restrictions stemming from a structural econometric model: E[Y − P(Z) | W] = 0, involving endogenous variables Y and Z and instruments W. The function P is the solution of an ill-posed inverse problem, and we propose an estimation procedure based on Tikhonov regularization. The paper analyses identification and overidentification of this model and presents asymptotic properties of the estimated nonparametric instrumental regression function.