32 results for Two-stage stochastic model


Relevance:

100.00%

Publisher:

Abstract:

A method is proposed to determine the extent of degradation in the rumen involving a two-stage mathematical modeling process. In the first stage, a statistical model shifts (or maps) the gas accumulation profile obtained using a fecal inoculum to a ruminal gas profile. Then, a kinetic model determines the extent of degradation in the rumen from the shifted profile. The kinetic model is presented as a generalized mathematical function, allowing any one of a number of alternative equation forms to be selected. This method might allow the gas production technique to become an approach for determining extent of degradation in the rumen, decreasing the need for surgically modified animals while still maintaining the link with the animal. Further research is needed before the proposed methodology can be used as a standard method across a range of feeds.
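
A minimal sketch of the two-stage idea described above, under simplifying assumptions: stage 1 maps the faecal-inoculum gas profile to a "ruminal" profile with a hypothetical linear shift, and stage 2 fits a single-pool exponential gas model whose rate is combined with an assumed passage rate to give an effective extent of degradation. The mapping coefficients, data and passage rate are illustrative, not values from the paper.

# Two-stage sketch: linear shift of the faecal profile, then a kinetic fit.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([2, 4, 8, 12, 24, 48, 72], dtype=float)            # incubation times (h)
gas_faecal = np.array([3, 7, 14, 20, 31, 40, 43], dtype=float)  # ml gas, faecal inoculum

# Stage 1: statistical shift of the faecal profile to a ruminal profile
a, b = 1.15, 2.0                 # hypothetical mapping coefficients
gas_rumen = a * gas_faecal + b

# Stage 2: kinetic model, here a single-pool exponential G(t) = A(1 - exp(-k t))
def gas_model(t, A, k):
    return A * (1.0 - np.exp(-k * t))

(A, k), _ = curve_fit(gas_model, t, gas_rumen, p0=(50.0, 0.05))

# Effective degradability for a first-order outflow (passage) rate kp
kp = 0.05                        # /h, illustrative passage rate
extent = k / (k + kp)            # fraction of the degradable pool degraded in the rumen
print(f"A = {A:.1f} ml, k = {k:.3f} /h, extent of degradation = {extent:.2f}")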

Relevance:

100.00%

Publisher:

Abstract:

Isothermal titration microcalorimetry (ITC) has been applied to investigate protein-tannin interactions. Two hydrolyzable tannins were studied, namely myrabolan and tara tannins, for their interaction with bovine serum albumin (BSA), a model globular protein, and gelatin, a model proline-rich random coil protein. Calorimetry data indicate that protein-tannin interaction mechanisms are dependent upon the nature of the protein involved. Tannins apparently interact nonspecifically with the globular BSA, leading to binding saturation at estimated tannin/BSA molar ratios of 48:1 for tara and 178:1 for myrabolan tannins. Tannins bind to the random coil protein gelatin by a two-stage mechanism. The energetics of the first stage show evidence for cooperative binding of tannins to the protein, while the second stage indicates gradual saturation of binding sites as observed for interaction with BSA. The structure and flexibility of the tannins themselves alter the stoichiometry of the interaction, but do not appear to have any significant effect on the overall binding mechanism observed. This study demonstrates the potential of ITC for providing an insight into the nature of protein-tannin interactions.
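
A crude illustration of how a saturation stoichiometry can be read off cumulative binding heats: a simple hyperbolic saturation curve is fitted to made-up data. Real ITC analysis uses a proper binding isotherm (for example a one-set-of-sites model); this is only a hedged sketch of the idea of binding saturation at a characteristic tannin/protein molar ratio, and the numbers are invented.

import numpy as np
from scipy.optimize import curve_fit

ratio = np.array([5, 10, 20, 40, 80, 120, 180], dtype=float)   # tannin/BSA molar ratio
heat = np.array([8, 15, 26, 38, 47, 50, 52], dtype=float)      # cumulative heat (illustrative)

def saturation(r, q_max, r_half):
    # hyperbolic saturation: half of q_max is reached at molar ratio r_half
    return q_max * r / (r_half + r)

(q_max, r_half), _ = curve_fit(saturation, ratio, heat, p0=(60.0, 30.0))
print(f"q_max = {q_max:.1f}, half-saturation at molar ratio {r_half:.0f}:1")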

Relevance:

100.00%

Publisher:

Abstract:

Analyses of high-density single-nucleotide polymorphism (SNP) data, such as genetic mapping and linkage disequilibrium (LD) studies, require phase-known haplotypes to allow for the correlation between tightly linked loci. However, current SNP genotyping technology cannot determine phase, which must be inferred statistically. In this paper, we present a new Bayesian Markov chain Monte Carlo (MCMC) algorithm for population haplotype frequency estimation, particularly in the context of LD assessment. The novel feature of the method is the incorporation of a log-linear prior model for population haplotype frequencies. We present simulations to suggest that 1) the log-linear prior model is more appropriate than the standard coalescent process in the presence of recombination (>0.02 cM between adjacent loci), and 2) there is substantial inflation in measures of LD obtained by a "two-stage" approach to the analysis by treating the "best" haplotype configuration as correct, without regard to uncertainty in the recombination process. Genet Epidemiol 25:106-114, 2003. (C) 2003 Wiley-Liss, Inc.
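
To make the phase-inference problem concrete, here is the standard EM algorithm for two-locus haplotype frequency estimation from unphased genotypes. This is not the paper's Bayesian MCMC with a log-linear prior; it only illustrates why the double-heterozygote genotype forces a statistical resolution of phase. The counts are made up.

import numpy as np

n_dh = 30                                                # AaBb individuals (phase unknown)
unambiguous = {"AB": 80, "Ab": 40, "aB": 35, "ab": 15}   # haplotypes counted directly

p = np.full(4, 0.25)                                     # freqs of AB, Ab, aB, ab
for _ in range(200):
    # E-step: split double heterozygotes between the two possible phasings
    w = p[0] * p[3] / (p[0] * p[3] + p[1] * p[2])
    counts = np.array([
        unambiguous["AB"] + w * n_dh,
        unambiguous["Ab"] + (1 - w) * n_dh,
        unambiguous["aB"] + (1 - w) * n_dh,
        unambiguous["ab"] + w * n_dh,
    ])
    # M-step: re-estimate haplotype frequencies
    p = counts / counts.sum()

D = p[0] * p[3] - p[1] * p[2]                            # linkage disequilibrium coefficient
print("haplotype frequencies:", np.round(p, 3), " D =", round(D, 4))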

Relevance:

100.00%

Publisher:

Abstract:

Gluco-oligosaccharides produced by Gluconobacter oxydans NCIMB 4943 from maltodextrin were evaluated for their fermentability by the human colonic microflora. The selectivity of growth of desirable bacteria in the human colon was studied in a three-stage continuous model of the human large intestine. Populations of bacteria, and their fluctuations in response to the fermentation, were enumerated using fluorescent in situ hybridization (FISH). The gluco-oligosaccharides resulted in increased numbers of bifidobacteria and the Lactobacillus/Enterococcus group in all three vessels of the system, representing the proximal, transverse and distal colonic areas. The prebiotic indices of the gluco-oligosaccharides were 2.29, 4.23 and 2.74 in V1, V2 and V3, respectively.

Relevance:

100.00%

Publisher:

Abstract:

The development and performance of a three-stage tubular model of the large human intestine is outlined. Each stage comprises a membrane fermenter in which flow of an aqueous polyethylene glycol solution on the outside of the tubular membrane is used to control the removal of water and metabolites (principally short chain fatty acids) from, and thus the pH of, the flowing contents on the fermenter side. The three-stage system gave a fair representation of conditions in the human gut. Numbers of the main bacterial groups were consistently higher than in an existing three-chemostat gut model system, suggesting an advantage of the new design in providing an environment for bacterial growth that better represents the actual colonic microflora. Concentrations of short chain fatty acids and pH levels throughout the system were similar to those associated with corresponding sections of the human colon. The model was able to achieve considerable water transfer across the membrane, although the values were not as high as those in the colon. The model thus goes some way towards a realistic simulation of the colon, although it makes no pretence to simulate the pulsating nature of the real flow. The flow conditions in each section are characterized by low Reynolds numbers: mixing due to Taylor dispersion is significant, and the implications of Taylor mixing and biofilm development for the stability of the system, that is, its ability to operate without washout, are briefly analysed and discussed. It is concluded that both phenomena are important for stabilizing the model and the human colon.
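
A back-of-envelope sketch of the flow-regime argument mentioned above: the Reynolds number for laminar tube flow and the classical Taylor-Aris effective axial dispersion coefficient. The geometry and property values below are illustrative guesses, not the model's actual operating parameters.

rho = 1000.0      # fluid density (kg/m^3)
mu = 1.0e-3       # dynamic viscosity (Pa s)
d = 0.01          # tube diameter (m)
U = 1.0e-4        # mean axial velocity (m/s)
D_m = 1.0e-9      # molecular diffusivity (m^2/s)

Re = rho * U * d / mu                              # Reynolds number (laminar if << 2000)
a = d / 2.0
D_eff = D_m + (a ** 2) * (U ** 2) / (48.0 * D_m)   # Taylor-Aris axial dispersion

print(f"Re = {Re:.2f} (laminar), D_eff = {D_eff:.2e} m^2/s vs D_m = {D_m:.1e} m^2/s")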

Relevance:

100.00%

Publisher:

Abstract:

The problem of state estimation occurs in many applications of fluid flow. For example, to produce a reliable weather forecast it is essential to find the best possible estimate of the true state of the atmosphere. To find this best estimate a nonlinear least squares problem has to be solved subject to dynamical system constraints. Usually this is solved iteratively by an approximate Gauss–Newton method in which the underlying discrete linear system is in general unstable. In this paper we propose a new method for deriving low-order approximations to the problem based on a recently developed model reduction method for unstable systems. To illustrate the theoretical results, numerical experiments are performed using a two-dimensional Eady model, a simple model of baroclinic instability, which is the dominant mechanism for the growth of storms at mid-latitudes. It is a suitable test model to show the benefit that may be obtained by using model reduction techniques to approximate unstable systems within the state estimation problem.
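
A minimal Gauss-Newton sketch for the kind of nonlinear least-squares state estimation problem described above, on a toy two-parameter state with a hypothetical observation operator; there are no dynamical constraints and no model reduction here, so this only shows the iterative linearise-and-solve structure.

import numpy as np

def gauss_newton(residual, x0, n_iter=20, eps=1e-6):
    # minimise ||residual(x)||^2 with a finite-difference Jacobian
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r = residual(x)
        J = np.zeros((r.size, x.size))
        for j in range(x.size):
            dx = np.zeros_like(x)
            dx[j] = eps
            J[:, j] = (residual(x + dx) - r) / eps
        # Gauss-Newton step: solve the linearised normal equations
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + step
        if np.linalg.norm(step) < 1e-10:
            break
    return x

# toy "observations" of a nonlinear function of the state
truth = np.array([1.5, -0.5])
obs_times = np.linspace(0, 1, 10)
def forward(x):                      # hypothetical observation operator
    return x[0] * np.exp(x[1] * obs_times)
y = forward(truth) + 0.01 * np.random.default_rng(0).standard_normal(obs_times.size)

x_est = gauss_newton(lambda x: forward(x) - y, x0=[1.0, 0.0])
print("estimated state:", np.round(x_est, 3))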

Relevance:

100.00%

Publisher:

Abstract:

This study examines differences in net selling price for residential real estate across male and female agents. A sample of 2,020 home sales transactions from Fulton County, Georgia, is analyzed in a two-stage least squares, geospatial autoregressive corrected, semi-log hedonic model to test for gender and gender selection effects. Although agent gender seems to play a role in naïve models, its role becomes inconclusive as variables controlling for possible price and time on market expectations of the buyers and sellers are introduced to the models. Clear differences in real estate sales prices, time on market, and agent incomes across genders are unlikely to be due to differences in negotiation performance between genders or the mix of genders in a two-agent negotiation. The evidence suggests an interesting alternative to agent performance: that buyers and sellers with different reservation price and time on market expectations, such as those selling foreclosure homes, tend to select agents along gender lines.
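
A generic two-stage least squares sketch (not the paper's geospatial autoregressive hedonic specification): an endogenous regressor is first regressed on an instrument plus exogenous controls, and its fitted values then replace it in the second-stage semi-log price equation. All data here are simulated and the variable names are illustrative.

import numpy as np

rng = np.random.default_rng(1)
n = 2000
z = rng.standard_normal(n)               # instrument
x_exog = rng.standard_normal(n)          # exogenous control
u = rng.standard_normal(n)               # unobserved confounder
x_endog = 0.8 * z + 0.5 * u + rng.standard_normal(n) * 0.3
log_price = 1.0 + 0.6 * x_endog + 0.4 * x_exog + u + rng.standard_normal(n) * 0.2

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Stage 1: regress the endogenous variable on the instrument and controls
X1 = np.column_stack([np.ones(n), z, x_exog])
x_hat = X1 @ ols(X1, x_endog)

# Stage 2: use the fitted values in the (semi-log) outcome equation
X2 = np.column_stack([np.ones(n), x_hat, x_exog])
beta_2sls = ols(X2, log_price)
print("2SLS coefficient on endogenous regressor:", round(beta_2sls[1], 3))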

Relevance:

100.00%

Publisher:

Abstract:

We investigate the behavior of single-cell protozoa in a narrow tubular ring. This environment forces them to swim under a one-dimensional periodic boundary condition. Above a critical density, the cells aggregate spontaneously without external stimulation. The high-density zone of swimming cells exhibits characteristic collective dynamics, including translation and boundary fluctuation. We analyzed the velocity distribution and turn rate of the swimming cells and found that regulation of the turn rate leads to stable aggregation, while acceleration of velocity triggers instability of the aggregation. These two opposing effects may help to explain the spontaneous dynamics of the collective behavior. We also propose a stochastic model for the mechanism underlying the collective behavior of the swimming cells.
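
A toy stochastic simulation in the spirit of the model described above: run-and-turn swimmers on a one-dimensional periodic ring whose turn (reversal) rate increases with local density, which tends to trap cells in crowded regions. The parameters and the specific density coupling are illustrative guesses, not the authors' model.

import numpy as np

rng = np.random.default_rng(0)
N, L, dt, steps = 200, 1.0, 0.01, 2000
v0, base_turn, crowd_turn, radius = 0.05, 0.5, 5.0, 0.02

x = rng.uniform(0, L, N)                 # positions on the ring
s = rng.choice([-1.0, 1.0], N)           # swimming directions

for _ in range(steps):
    # local density: neighbours within `radius` on the periodic ring
    d = np.abs(x[:, None] - x[None, :])
    d = np.minimum(d, L - d)
    density = (d < radius).sum(axis=1) - 1
    # density-dependent turn (direction-reversal) probability
    p_turn = 1.0 - np.exp(-(base_turn + crowd_turn * density / N) * dt)
    s = np.where(rng.random(N) < p_turn, -s, s)
    x = (x + v0 * s * dt) % L

print("final spread of positions (std):", round(x.std(), 3))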

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes a method for describing the distribution of observed temperatures on any day of the year such that the distribution, and summary statistics of interest derived from the distribution, vary smoothly through the year. The method removes the noise inherent in calculating summary statistics directly from the data, thus easing comparisons of distributions and summary statistics between different periods. The method is demonstrated using daily effective temperatures (DET) derived from observations of temperature and wind speed at De Bilt, Holland. Distributions and summary statistics are obtained for 1985 to 2009 and compared to the period 1904–1984. A two-stage process first obtains parameters of a theoretical probability distribution, in this case the generalized extreme value (GEV) distribution, which describes the distribution of DET on any day of the year. Second, linear models describe seasonal variation in the parameters. Model predictions provide parameters of the GEV distribution, and therefore summary statistics, that vary smoothly through the year. There is evidence of an increasing mean temperature, a decrease in the variability of temperatures, mainly in the winter, and more positive skew (more warm days) in the summer. In the winter, the 2% point, the value below which 2% of observations are expected to fall, has risen by 1.2 °C; in the summer, the 98% point has risen by 0.8 °C. Medians have risen by 1.1 and 0.9 °C in winter and summer, respectively. The method can be used to describe distributions of future climate projections and other climate variables. Further extensions to the methodology are suggested.
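
A hedged sketch of the two-stage idea: fit a GEV distribution to daily effective temperatures within each calendar month, then smooth the fitted location parameters through the year with a first-harmonic linear model. The data are synthetic and the grouping is crude; the actual method works day by day and models all three GEV parameters.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
day = np.arange(365)
det = 10 + 8 * np.sin(2 * np.pi * (day - 100) / 365) + rng.normal(0, 2, 365)

# Stage 1: GEV fit per month (crude grouping for illustration)
month = day // 31
locs, centres = [], []
for m in range(12):
    sample = det[month == m]
    shape, loc, scale = stats.genextreme.fit(sample)
    locs.append(loc)
    centres.append(day[month == m].mean())

# Stage 2: harmonic regression so the location parameter varies smoothly
t = 2 * np.pi * np.array(centres) / 365
X = np.column_stack([np.ones_like(t), np.sin(t), np.cos(t)])
coef, *_ = np.linalg.lstsq(X, np.array(locs), rcond=None)
t200 = 2 * np.pi * 200 / 365
print("smoothed location on day 200:", round(coef @ np.array([1, np.sin(t200), np.cos(t200)]), 2))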

Relevance:

100.00%

Publisher:

Abstract:

In Sub-Saharan Africa (SSA) the technological advances of the Green Revolution (GR) have not been very successful. However, the efforts being made to re-introduce the revolution call for more socio-economic research into the adoption and the effects of the new technologies. The paper discusses an investigation of the effects of GR technology adoption on poverty among households in Ghana. Maximum likelihood estimation of a poverty model within the framework of Heckman's two-stage method of correcting for sample selection was employed. Technology adoption was found to have positive effects in reducing poverty. Other factors that reduce poverty include education, credit, durable assets, living in the forest belt and living in the south of the country. Technology adoption itself was also facilitated by education, credit, non-farm income and household labour supply, as well as by living in urban centres. Clearly, technology adoption can be encouraged by increasing the levels of complementary inputs such as credit, extension services and infrastructure. Above all, the fundamental problems of illiteracy, inequality and lack of effective markets must be addressed through increasing the levels of formal and non-formal education, equitable distribution of the 'national cake' and a more pragmatic management of the ongoing Structural Adjustment Programme.
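
A hedged sketch of Heckman's sample-selection correction in the spirit of the estimation strategy above. The paper uses maximum likelihood; this shows the simpler two-step variant on simulated data with invented covariates: step 1 is a probit for technology adoption, step 2 an outcome regression for adopters with the inverse Mills ratio added as a regressor.

import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 1000
educ = rng.normal(8, 3, n)               # years of education (illustrative)
credit = rng.binomial(1, 0.4, n)         # access to credit (illustrative)
e_sel, e_out = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], n).T

adopt = (0.1 * educ + 0.8 * credit - 1.0 + e_sel > 0).astype(int)
welfare = 2.0 + 0.15 * educ + 0.5 * credit + e_out   # observed for adopters only

# Step 1: selection (adoption) probit
Z = sm.add_constant(np.column_stack([educ, credit]))
probit = sm.Probit(adopt, Z).fit(disp=0)
xb = Z @ probit.params
imr = norm.pdf(xb) / norm.cdf(xb)        # inverse Mills ratio

# Step 2: outcome equation for adopters, with the Mills ratio as a control
sel = adopt == 1
X = sm.add_constant(np.column_stack([educ[sel], credit[sel], imr[sel]]))
ols = sm.OLS(welfare[sel], X).fit()
print(ols.params.round(3))               # last coefficient: selection correction term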

Relevance:

100.00%

Publisher:

Abstract:

A novel two-stage construction algorithm for a linear-in-the-parameters classifier is proposed, aimed at noisy two-class classification problems. The purpose of the first stage is to produce a prefiltered signal that is used as the desired output for the second stage, which constructs a sparse linear-in-the-parameters classifier. For the first-stage learning, which generates the prefiltered signal, a two-level algorithm is introduced to maximise the model's generalisation capability: an elastic net model identification algorithm using singular value decomposition is employed at the lower level, while the two regularisation parameters are selected by maximising the Bayesian evidence using a particle swarm optimisation algorithm. Analysis is provided to demonstrate how "Occam's razor" is embodied in this approach. The second stage of sparse classifier construction is based on an orthogonal forward regression with the D-optimality algorithm. Extensive experimental results demonstrate that the proposed approach is effective and yields competitive results for noisy data sets.
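
A hedged sketch of the two-stage construction: stage 1 uses an elastic-net fit to produce a denoised ("prefiltered") target from noisy class labels, and stage 2 builds a sparse linear classifier against that target. Orthogonal matching pursuit stands in here for the paper's orthogonal forward regression with D-optimality, and no evidence maximisation by particle swarm optimisation is attempted; everything is simplified for illustration.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import ElasticNet, OrthogonalMatchingPursuit

X, y = make_classification(n_samples=400, n_features=30, n_informative=5,
                           flip_y=0.2, random_state=0)     # noisy two-class data
t = 2.0 * y - 1.0                                          # targets in {-1, +1}

# Stage 1: elastic-net fit gives a smoothed/prefiltered desired output
enet = ElasticNet(alpha=0.05, l1_ratio=0.5).fit(X, t)
t_prefiltered = enet.predict(X)

# Stage 2: sparse classifier constructed against the prefiltered signal
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5).fit(X, t_prefiltered)
pred = np.sign(omp.predict(X))
print("training accuracy of sparse classifier:", round((pred == t).mean(), 3))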

Relevance:

100.00%

Publisher:

Abstract:

A one-dimensional, thermodynamic, and radiative model of a melt pond on sea ice is presented that explicitly treats the melt pond as an extra phase. A two-stream radiation model, which allows albedo to be determined from bulk optical properties, and a parameterization of the summertime evolution of optical properties are used. Heat transport within the sea ice is described using an equation for heat transport in a mushy layer of a binary alloy (salt water). The model is tested by comparison of numerical simulations with SHEBA data and previous modeling. The presence of melt ponds on the sea ice surface is demonstrated to have a significant effect on the heat and mass balance. Sensitivity tests indicate that the maximum melt pond depth is highly sensitive to optical parameters and drainage.
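
An illustrative two-stream estimate of the albedo of an optically thick scattering layer from bulk optical properties (single-scattering albedo w and asymmetry parameter g). This is a textbook semi-infinite two-stream result, included only to show the idea of deriving albedo from bulk properties; it is not the paper's radiation scheme, and the parameter values are guesses.

import numpy as np

def two_stream_albedo(w, g):
    # semi-infinite two-stream albedo for single-scattering albedo w, asymmetry g
    a = np.sqrt(1.0 - w * g)
    b = np.sqrt(1.0 - w)
    return (a - b) / (a + b)

for w in (0.99, 0.9999):          # weakly vs. very weakly absorbing surface layer
    print(f"w = {w}: albedo = {two_stream_albedo(w, 0.9):.2f}")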

Relevance:

100.00%

Publisher:

Abstract:

We present a method for the recognition of complex actions. Our method combines automatic learning of simple actions and manual definition of complex actions in a single grammar. Contrary to the general trend in complex action recognition, which consists of dividing recognition into two stages, our method performs recognition of simple and complex actions in a unified way. This is performed by encoding simple action HMMs within the stochastic grammar that models complex actions. This unified approach enables a more effective influence of the higher activity layers on the recognition of simple actions, which leads to a substantial improvement in the classification of complex actions. We consider the recognition of complex actions based on person transits between areas in the scene. As input, our method receives crossings of tracks along a set of zones which are derived using unsupervised learning of the movement patterns of the objects in the scene. We evaluate our method on a large dataset showing normal, suspicious and threat behaviour in a parking lot. Experiments show an improvement of ~30% in the recognition of both high-level scenarios and their composing simple actions with respect to a two-stage approach. Experiments with synthetic noise simulating the most common tracking failures show that our method only experiences a limited decrease in performance when moderate amounts of noise are added.
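
A tiny Viterbi sketch for decoding a "simple action" from a sequence of zone-crossing symbols, the kind of HMM building block that the unified grammar embeds. The two-state HMM, its probabilities and the observation alphabet are all invented for illustration and are not the paper's model.

import numpy as np

obs_symbols = ["zoneA", "zoneB", "zoneB", "zoneC"]         # track crossings
symbols = {"zoneA": 0, "zoneB": 1, "zoneC": 2}
pi = np.array([0.7, 0.3])                                  # initial state probabilities
A = np.array([[0.8, 0.2],                                  # state transitions
              [0.3, 0.7]])
B = np.array([[0.6, 0.3, 0.1],                             # emission probabilities
              [0.1, 0.4, 0.5]])

o = [symbols[s] for s in obs_symbols]
delta = np.log(pi) + np.log(B[:, o[0]])                    # Viterbi scores
back = []
for t in range(1, len(o)):
    scores = delta[:, None] + np.log(A)                    # best predecessor per state
    back.append(scores.argmax(axis=0))
    delta = scores.max(axis=0) + np.log(B[:, o[t]])

# backtrack the most likely hidden-state path
path = [int(delta.argmax())]
for bp in reversed(back):
    path.append(int(bp[path[-1]]))
path.reverse()
print("most likely state sequence:", path)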

Relevance:

100.00%

Publisher:

Abstract:

The disadvantage of the majority of data assimilation schemes is the assumption that the conditional probability density function of the state of the system given the observations [posterior probability density function (PDF)] is distributed either locally or globally as a Gaussian. The advantage, however, is that through various different mechanisms they ensure initial conditions that are predominantly in linear balance, and therefore spurious gravity wave generation is suppressed. The equivalent-weights particle filter is a data assimilation scheme that allows for a representation of a potentially multimodal posterior PDF. It does this via proposal densities that lead to extra terms being added to the model equations, which means that the advantage of the traditional data assimilation schemes, in generating predominantly balanced initial conditions, is no longer guaranteed. This paper looks in detail at the impact the equivalent-weights particle filter has on dynamical balance and gravity wave generation in a primitive equation model. The primary conclusions are that (i) provided the model error covariance matrix imposes geostrophic balance, then each additional term required by the equivalent-weights particle filter is also geostrophically balanced; (ii) the relaxation term required to ensure the particles are in the locality of the observations has little effect on gravity waves and actually induces a reduction in gravity wave energy if sufficiently large; and (iii) the equivalent-weights term, which leads to the particles having equivalent significance in the posterior PDF, produces a change in gravity wave energy comparable to the stochastic model error. Thus, the scheme does not produce significant spurious gravity wave energy and so has potential for application in real high-dimensional geophysical applications.
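
A minimal bootstrap particle filter sketch for a toy stochastic model, included only to show the basic particle-filter machinery. The equivalent-weights scheme discussed above modifies the proposal with relaxation and equivalent-weights terms, none of which are implemented here; the toy dynamics and noise levels are invented.

import numpy as np

rng = np.random.default_rng(4)
n_steps, n_particles = 50, 500
q, r = 0.1, 0.2                        # model and observation error standard deviations

def model(x):                          # toy nonlinear dynamics
    return 0.9 * x + 0.2 * np.sin(x)

# generate a truth run and noisy observations
x_true = 0.5
obs = []
for _ in range(n_steps):
    x_true = model(x_true) + rng.normal(0, q)
    obs.append(x_true + rng.normal(0, r))

particles = rng.normal(0.5, 1.0, n_particles)
for y in obs:
    particles = model(particles) + rng.normal(0, q, n_particles)  # stochastic forecast
    w = np.exp(-0.5 * ((y - particles) / r) ** 2)                 # likelihood weights
    w /= w.sum()
    idx = rng.choice(n_particles, n_particles, p=w)               # resample
    particles = particles[idx]

print("final analysis mean:", round(particles.mean(), 3), " truth:", round(x_true, 3))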