55 results for Filter cake


Relevance: 20.00%

Abstract:

The effects of several levels of fat replacement (0%, 35%, 50%, 70%, and 100%) with inulin on sponge cake microstructure and physicochemical properties were studied. Replacing oil with inulin significantly decreased (P < 0.05) batter viscosity, giving heterogeneous bubble size distributions, as observed by light microscopy. Confocal laser scanning microscopy showed the fat located at the bubble interfaces, enabling optimum crumb structure development during baking. Cryo-SEM micrographs of cake crumbs showed a continuous matrix with embedded starch granules coated with oil; as fat replacement levels increased, starch granules appeared as detached structures. Cakes with fat replacement of up to 70% had high crumb air-cell values; they were softer and rated as acceptable by an untrained sensory panel (n = 51). Thus, a standard sponge cake recipe was reformulated into a new product with additional health benefits that is accepted by consumers.

Relevance: 20.00%

Abstract:

The roles of some cake ingredients – oil, a leavening agent, and inulin – in the structure and physicochemical properties of batter and cakes were studied in four different formulations. Oil played an important role in batter stability through its contribution to increasing batter viscosity and occluding air during mixing. The addition of the leavening agent was crucial to the final height and sponginess of the cakes. When inulin was used as a fat replacer, the absence of oil decreased the stability of the batter, in which larger air bubbles were occluded. Inulin dispersed uniformly in the batter may have competed with the flour components for water: the gluten was not properly hydrated and some starch granules were not fully incorporated into the matrix. Thus, the development of a continuous network was disrupted and the cake was shorter, softer, and easily crumbled, with interconnected air cells in the crumb. The structural studies were decisive in understanding the physicochemical properties.

Relevance: 20.00%

Abstract:

Sponge cakes have traditionally been manufactured using multistage mixing methods to enhance potential foam formation by the eggs. Today, use of all-in (single-stage) mixing methods is superseding multistage methods for large-scale batter preparation to reduce costs and production time. In this study, multistage and all-in mixing procedures and three final high-speed mixing times (3, 5, and 15 min) for sponge cake production were tested to optimize a mixing method for pilot-scale research. Mixing for 3 min produced batters with higher relative density values than did longer mixing times. These batters generated well-aerated cakes with high volume and low hardness. In contrast, after 5 and 15 min of high-speed mixing, batters with lower relative density and higher viscosity values were produced. Although higher bubble incorporation and retention were observed, the longer mixing times produced better-developed gluten networks, which stiffened the batters and inhibited bubble expansion during mixing. As a result, these batters did not expand properly and produced cakes with low volume, dense crumb, and high hardness values. Results for all-in mixing were similar to those for the multistage mixing procedure in terms of the physical properties of batters and cakes (i.e., relative density, elastic moduli, volume, total cell area, hardness, etc.). These results suggest that the all-in mixing procedure with a final high-speed mixing time of 3 min is an appropriate mixing method for pilot-scale sponge cake production. The advantages of this method are reduced energy costs and production time.

Relevance: 20.00%

Abstract:

This paper investigates the use of a particle filter for data assimilation with a full-scale coupled ocean–atmosphere general circulation model. Synthetic twin experiments are performed to assess the performance of the equivalent-weights particle filter in such a high-dimensional system. Artificial two-dimensional sea surface temperature fields are used as observational data every day. Results are presented for different values of the free parameters in the method. The performance of the filter is measured by root-mean-square errors, trajectories of individual variables in the model, and rank histograms. Filter degeneracy is not observed, and the performance of the filter is shown to depend on the ability to keep maximum spread in the ensemble.
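The rank histogram used above as a verification measure counts, at each observation point, how many ensemble members fall below the verifying value; a flat histogram indicates a statistically consistent, non-degenerate ensemble. A minimal sketch (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def rank_histogram(ensemble, truth):
    """Rank of the truth within each ensemble column.

    ensemble: array of shape (n_members, n_obs)
    truth:    array of shape (n_obs,)
    Returns counts over the n_members + 1 possible ranks.
    """
    n_members, _ = ensemble.shape
    # How many members lie below the verifying value at each point
    ranks = np.sum(ensemble < truth[None, :], axis=0)
    return np.bincount(ranks, minlength=n_members + 1)

# Toy check: an ensemble drawn from the same distribution as the truth
# should give a roughly uniform (flat) histogram.
rng = np.random.default_rng(0)
ens = rng.normal(size=(10, 5000))
obs = rng.normal(size=5000)
hist = rank_histogram(ens, obs)
```

A U-shaped histogram would instead signal an under-dispersive ensemble, i.e. loss of spread of the kind the paper monitors.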

Relevance: 20.00%

Abstract:

A truly variance-minimizing filter is introduced and its performance is demonstrated with the Korteweg–de Vries (KdV) equation and with a multilayer quasigeostrophic model of the ocean area around South Africa. It is recalled that Kalman-like filters are not variance minimizing for nonlinear model dynamics and that four-dimensional variational data assimilation (4DVAR)-like methods relying on perfect model dynamics have difficulty providing error estimates. The new method does not have these drawbacks. In fact, it combines advantages from both methods in that it does provide error estimates while automatically having balanced states after analysis, without extra computations. It is based on ensemble or Monte Carlo integrations to simulate the probability density of the model evolution. When observations are available, the so-called importance resampling algorithm is applied. From Bayes's theorem it follows that each ensemble member receives a new weight dependent on its "distance" to the observations. Because the weights vary strongly, a resampling of the ensemble is necessary. This resampling is done such that members with high weights are duplicated according to their weights, while low-weight members are largely ignored. In passing, it is noted that data assimilation is not an inverse problem by nature, although it can be formulated that way. Also, it is shown that the posterior variance can be larger than the prior if the usual Gaussian framework is set aside. However, in the examples presented here, the entropy of the probability densities is decreasing. The application to the ocean area around South Africa, governed by strongly nonlinear dynamics, shows that the method works satisfactorily. The strong and weak points of the method are discussed and possible improvements are proposed.
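The weight-and-resample step described above can be sketched as follows, assuming Gaussian observation errors for the weight computation; the function name, the scalar observation of the first state component, and all parameter values are illustrative, not taken from the paper:

```python
import numpy as np

def importance_resample(ensemble, obs, obs_std, rng):
    """Importance resampling: weight each member by its Gaussian
    likelihood given the observation (Bayes' theorem), then draw a new
    ensemble in proportion to the weights, so high-weight members are
    duplicated and low-weight members are largely dropped.

    ensemble: (n_members, n_state); obs observes ensemble[:, 0].
    """
    dist = ensemble[:, 0] - obs
    logw = -0.5 * (dist / obs_std) ** 2
    w = np.exp(logw - logw.max())   # subtract max for numerical stability
    w /= w.sum()
    n = len(w)
    idx = rng.choice(n, size=n, p=w)  # multinomial resampling
    return ensemble[idx], w

rng = np.random.default_rng(1)
ens = rng.normal(size=(500, 3))                 # prior ensemble ~ N(0, 1)
new_ens, w = importance_resample(ens, obs=0.5, obs_std=0.2, rng=rng)
# The resampled ensemble mean of the observed variable moves toward the
# observation, as Bayes' theorem dictates.
```

With strongly varying weights, multinomial resampling like this keeps the ensemble size fixed while concentrating members near the observations, which is exactly why the resampling step is necessary.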

Relevance: 20.00%

Abstract:

This paper discusses an important issue related to the implementation and interpretation of the analysis scheme in the ensemble Kalman filter. It is shown that the observations must be treated as random variables at the analysis steps. That is, one should add random perturbations with the correct statistics to the observations and generate an ensemble of observations that is then used in updating the ensemble of model states. Traditionally, this has not been done in previous applications of the ensemble Kalman filter and, as will be shown, this has resulted in an updated ensemble with a variance that is too low. This simple modification of the analysis scheme results in a completely consistent approach if the covariance of the ensemble of model states is interpreted as the prediction error covariance, and there are no further requirements on the ensemble Kalman filter method, except for the use of an ensemble of sufficient size. Thus, there is a unique correspondence between the error statistics from the ensemble Kalman filter and the standard Kalman filter approach.
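The modification described, perturbing the observations before each update, can be sketched for a scalar state where the Kalman algebra is transparent; the function name and the numbers in the toy check are illustrative:

```python
import numpy as np

def enkf_analysis_perturbed_obs(ensemble, obs, obs_var, rng):
    """EnKF analysis step with perturbed observations: each member is
    updated against the observation plus a random draw with the correct
    statistics, which keeps the analysis-ensemble variance consistent
    with standard Kalman filter theory.

    ensemble: (n_members,) scalar states; obs: scalar observation.
    """
    prior_var = ensemble.var(ddof=1)
    gain = prior_var / (prior_var + obs_var)   # scalar Kalman gain
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=ensemble.shape)
    return ensemble + gain * (perturbed - ensemble)

rng = np.random.default_rng(2)
ens = rng.normal(1.0, 1.0, size=100_000)       # prior ~ N(1, 1)
post = enkf_analysis_perturbed_obs(ens, obs=0.0, obs_var=1.0, rng=rng)
# Kalman theory gives posterior variance 1*1/(1+1) = 0.5. Updating all
# members against the *unperturbed* observation would instead give
# (1 - gain)^2 * prior_var = 0.25: the too-low variance the paper warns about.
```

The perturbation term contributes `gain**2 * obs_var` to the analysis variance, which is precisely the piece missing when every member is updated against the same unperturbed observation.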

Relevance: 20.00%

Abstract:

The ring-shedding process in the Agulhas Current is studied using the ensemble Kalman filter to assimilate Geosat altimeter data into a two-layer quasigeostrophic ocean model. The properties of the ensemble Kalman filter are further explored with focus on the analysis scheme and the use of gridded data. The Geosat data consist of 10 fields of gridded sea-surface height anomalies separated 10 days apart that are added to a climatic mean field. This corresponds to a huge number of data values, and a data reduction scheme must be applied to increase the efficiency of the analysis procedure. Further, it is illustrated how one can resolve the rank problem that occurs when too large a dataset or too small an ensemble is used.

Relevance: 20.00%

Abstract:

Filter degeneracy is the main obstacle for the implementation of the particle filter in non-linear high-dimensional models. A new scheme, the implicit equal-weights particle filter (IEWPF), is introduced. In this scheme samples are drawn implicitly from proposal densities with a different covariance for each particle, such that all particle weights are equal by construction. We test and explore the properties of the new scheme using a 1,000-dimensional simple linear model and the 1,000-dimensional non-linear Lorenz96 model, and compare its performance to that of a local ensemble Kalman filter. The experiments show that the new scheme can easily be implemented in high-dimensional systems and is never degenerate, with good convergence properties in both systems.
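The Lorenz96 model used here as the non-linear test bed is a standard cyclic system, dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F; a minimal integration sketch follows (the forcing F = 8, the RK4 time step, and the initial condition are conventional choices, not parameters reported in this abstract):

```python
import numpy as np

def lorenz96_tendency(x, forcing=8.0):
    """dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, cyclic indices."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt, forcing=8.0):
    """One fourth-order Runge-Kutta step of the Lorenz96 system."""
    k1 = lorenz96_tendency(x, forcing)
    k2 = lorenz96_tendency(x + 0.5 * dt * k1, forcing)
    k3 = lorenz96_tendency(x + 0.5 * dt * k2, forcing)
    k4 = lorenz96_tendency(x + dt * k3, forcing)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

n = 1000                        # state dimension, as in the experiments
x = 8.0 * np.ones(n)            # rest state equal to the forcing
x[0] += 0.01                    # small perturbation to trigger chaos
for _ in range(500):
    x = rk4_step(x, dt=0.05)
# The trajectory stays bounded on the chaotic attractor.
```

The quadratic advection-like term makes the dynamics strongly non-linear, which is what makes this model a demanding benchmark for particle filters at dimension 1,000.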