20 results for Interdisciplinary methodology


Relevance: 20.00%

Publisher:

Abstract:

The objective of this study was to apply response surface methodology to estimate the emulsifying capacity and stability of mixtures containing isolated and textured soybean proteins combined with pectin, and to evaluate whether the extrusion process affects these interfacial properties. A simplex-centroid design was applied to model the emulsifying activity index (EAI), average droplet size (D[4,3]) and creaming inhibition (CI%) of the mixtures. All models were significant and able to explain more than 86% of the variation, and their high predictive capacity was also confirmed. The mean values for EAI, D[4,3] and CI% observed in all assays were 0.173 ± 0.015 mn, 19.2 ± 1.0 µm and 53.3 ± 2.6%, respectively. No synergism was observed among the three compounds, a result that can be attributed to the low soybean protein solubility at pH 6.2 (<35%). Pectin was the most important variable for improving all responses. The emulsifying capacity of the mixture increased 41% after extrusion. Our results showed that pectin could substitute for or improve the emulsifying properties of the soybean proteins, and that extrusion brings an additional advantage to the interfacial properties of this combination. (C) 2008 Elsevier Ltd. All rights reserved.
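As a rough illustration of the modelling step described above, the sketch below fits a Scheffé-type quadratic mixture model to responses measured at the seven blends of a three-component simplex-centroid design. All proportions, response values and names are hypothetical placeholders, not data from the study.

```python
# Minimal sketch: least-squares fit of a Scheffe quadratic mixture model to a
# simplex-centroid design in three components (e.g., isolated protein, textured
# protein, pectin).  All numbers are hypothetical placeholders.
import numpy as np

# Simplex-centroid points: 3 pure components, 3 binary blends, 1 ternary centroid.
X = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [0.5, 0.5, 0.0],
    [0.5, 0.0, 0.5],
    [0.0, 0.5, 0.5],
    [1/3, 1/3, 1/3],
])
y = np.array([0.15, 0.16, 0.20, 0.16, 0.18, 0.19, 0.17])  # e.g. an EAI-like response

def scheffe_quadratic(x):
    """Design row for the Scheffe quadratic mixture model (no intercept)."""
    x1, x2, x3 = x
    return [x1, x2, x3, x1 * x2, x1 * x3, x2 * x3]

D = np.array([scheffe_quadratic(row) for row in X])
coef, *_ = np.linalg.lstsq(D, y, rcond=None)
print(dict(zip(["b1", "b2", "b3", "b12", "b13", "b23"], coef.round(3))))
```

With only the seven design points the fit is purely illustrative; replicated runs would be needed before testing model significance and predictive capacity, as done in the paper.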

Relevance: 20.00%

Publisher:

Abstract:

Vegetable oils can be extracted using ethanol as the solvent. The main goal of this work was to evaluate the performance of ethanol in the extraction of rice bran oil. The influence of the process variables solvent hydration and temperature was evaluated using response surface methodology, aiming to maximise the transfer of soluble substances and gamma-oryzanol and to minimise the extraction of free fatty acids and the liquid content in the underflow solid. Oil solubility in ethanol was strongly affected by the water content, and free fatty acid extraction increased with the moisture content of the solvent. Gamma-oryzanol extraction was affected by temperature when little water was added to the ethanol, whereas the influence of temperature was minimised at high water levels in the ethanol.

Relevance: 20.00%

Publisher:

Abstract:

The aim of this work was to study the effect of the hydrolysis degree (HD) and the concentration (C(PVA)) of two types of poly(vinyl alcohol) (PVA), and of the type (glycerol and sorbitol) and the concentration (C(P)) of plasticizers, on some physical properties of biodegradable films based on blends of gelatin and PVA, using a response-surface methodology. The films were prepared from film-forming solutions (FFS) with 2 g of macromolecules (gelatin + PVA)/100 g of FFS. The responses analyzed were the mechanical properties, the solubility, the moisture content, the color difference and the opacity. The linear model was statistically significant and predictive for puncture force and deformation, elongation at break, solubility in water, moisture content and opacity. The C(PVA) strongly affected the elongation at break of the films, and the interaction of the HD and the C(P) also affected this property. Moreover, the puncture force was slightly affected by the C(PVA). Concerning the solubility in water, reducing the HD increased it, and this effect was greater for high C(PVA) values. In general, the most important effects on the physical properties of the films were those of plasticizer type and concentration. The PVA hydrolysis degree and concentration had an important effect only on the elongation at break, puncture deformation and solubility in water. (C) 2008 Elsevier B.V. All rights reserved.

Relevance: 20.00%

Publisher:

Abstract:

In this paper we propose a new lifetime distribution which can handle bathtub-shaped, unimodal, increasing and decreasing hazard rate functions. The model has three parameters and generalizes the exponential power distribution proposed by Smith and Bain (1975) through the inclusion of an additional shape parameter. The maximum likelihood estimation procedure is discussed. A small-scale simulation study examines the performance of the likelihood ratio statistics under small and moderate sample sizes. Three real datasets illustrate the methodology. (C) 2010 Elsevier B.V. All rights reserved.
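The abstract does not give the parameterization of the new three-parameter model, so the sketch below only illustrates maximum-likelihood fitting of the two-parameter Smith and Bain (1975) exponential power baseline, taken here with survival function S(t) = exp(1 - exp((t/alpha)^beta)); both this form and the simulated sample are assumptions for illustration, not one of the paper's three real datasets.

```python
# Hedged sketch: maximum-likelihood fit of a two-parameter exponential power
# baseline with S(t) = exp(1 - exp((t/alpha)^beta)).  The paper's model adds a
# third shape parameter whose form is not given in the abstract, so it is not
# reproduced here.  The lifetimes below are simulated stand-in data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
t = rng.weibull(1.5, size=200) * 2.0   # stand-in lifetime sample

def neg_loglik(params, t):
    alpha, beta = np.exp(params)       # optimize on the log scale to keep both positive
    z = (t / alpha) ** beta
    # log f(t) = log(beta/alpha) + (beta-1)*log(t/alpha) + z + 1 - exp(z)
    return -np.sum(np.log(beta / alpha) + (beta - 1) * np.log(t / alpha) + z + 1 - np.exp(z))

res = minimize(neg_loglik, x0=np.log([1.0, 1.0]), args=(t,), method="Nelder-Mead")
alpha_hat, beta_hat = np.exp(res.x)
print(f"alpha = {alpha_hat:.3f}, beta = {beta_hat:.3f}")
```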

Relevance: 20.00%

Publisher:

Abstract:

In this article we address decomposition strategies especially tailored to perform strong coupling of dimensionally heterogeneous models, under the hypothesis that one wants to solve each submodel separately and implement the interaction between subdomains by boundary conditions alone. The novel methodology takes full advantage of the small number of interface unknowns in this kind of problem. Existing algorithms can be viewed as variants of the "natural" staggered algorithm in which each domain transfers function values to the other and receives fluxes (or forces), and vice versa. This natural algorithm is known as Dirichlet-to-Neumann in the domain decomposition literature. Essentially, we propose a framework in which this algorithm is equivalent to applying Gauss-Seidel iterations to a suitably defined (linear or nonlinear) system of equations. It is then immediate to switch to other iterative solvers such as GMRES or other Krylov-based methods, which we assess through numerical experiments showing the significant gain that can be achieved. Indeed, the benefit is that an extremely flexible, automatic coupling strategy can be developed, which in addition leads to iterative procedures that are parameter-free and rapidly converging. Further, in linear problems they have the finite termination property. Copyright (C) 2009 John Wiley & Sons, Ltd.
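A schematic of the reformulation described above, using a two-subdomain 1D diffusion toy rather than dimensionally heterogeneous models: the natural Dirichlet-to-Neumann sweep is written as a fixed-point map G, and the same coupling is then posed as the root-finding problem g - G(g) = 0 and handed to a matrix-free Krylov solver. The conductivities, geometry and analytic subdomain solves are all assumptions made only to keep the sketch short.

```python
# Toy interface coupling: two steady diffusion subdomains on [0, 0.5] and [0.5, 1]
# with conductivities k1, k2, coupled only through the interface value g = u(0.5).
# Subdomain solves are analytic here purely for brevity.
import numpy as np
from scipy.optimize import newton_krylov

k1, k2 = 1.0, 3.0   # hypothetical conductivities; exact interface value is k1/(k1+k2)

def dirichlet_solve_flux(g):
    """Domain 1: u(0)=1, Dirichlet value g at the interface; return the flux it sends."""
    return -k1 * (g - 1.0) / 0.5

def neumann_solve_trace(q):
    """Domain 2: u(1)=0, prescribed interface flux q (Neumann); return its interface trace."""
    return 0.5 * q / k2

def G(g):
    """One 'natural' staggered Dirichlet-to-Neumann sweep: the Gauss-Seidel map."""
    return neumann_solve_trace(dirichlet_solve_flux(g))

# Fixed-point (Gauss-Seidel) iteration; note it diverges whenever k1 > k2.
g = 0.0
for _ in range(20):
    g = G(g)
print("fixed-point iterate:", g)

# Same coupling recast as F(g) = g - G(g) = 0 and solved with a matrix-free
# Newton-Krylov (GMRES) method, which needs no relaxation parameter.
g_krylov = newton_krylov(lambda g: g - G(g), np.array([0.0]), method="gmres")
print("Krylov solution:", g_krylov, " exact:", k1 / (k1 + k2))
```

The point of the recast is the one the abstract makes: once the staggered sweep is seen as a map G, any Krylov solver can replace the plain Gauss-Seidel iteration without changing the subdomain solvers.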

Relevance: 20.00%

Publisher:

Abstract:

In this paper, we propose a new two-parameter lifetime distribution with increasing failure rate, the complementary exponential geometric distribution, which is complementary to the exponential geometric model proposed by Adamidis and Loukas (1998). The new distribution arises in a latent complementary risks scenario, in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its reliability and failure rate functions, moments (including the mean and variance), coefficient of variation, and modal value. The parameter estimation is based on the usual maximum likelihood approach. We report the results of a misspecification simulation study performed to assess the extent of misspecification errors when testing the exponential geometric distribution against our complementary one under different sample sizes and censoring percentages. The methodology is illustrated on four real datasets; we also make a comparison between the two modeling approaches. (C) 2011 Elsevier B.V. All rights reserved.
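A sketch of the complementary-risks construction, assuming the number of latent risks is geometric on {1, 2, ...} with success probability theta and each risk lifetime is exponential with rate lambda; under these assumptions the maximum has cdf F(y) = theta(1 - e^(-lambda*y)) / (1 - (1 - theta)(1 - e^(-lambda*y))), and the log-density used below follows by differentiation. The paper's exact parameterization may differ, and the data are simulated rather than one of the four real datasets.

```python
# Simulate the latent complementary-risks mechanism (maximum of a geometric number
# of exponential lifetimes) and recover the parameters by maximum likelihood.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
lam_true, theta_true = 2.0, 0.3
Z = rng.geometric(theta_true, size=500)                           # number of latent risks
y = np.array([rng.exponential(1 / lam_true, z).max() for z in Z])  # observed maxima

def neg_loglik(params, y):
    lam = np.exp(params[0])                    # enforce lam > 0
    theta = 1.0 / (1.0 + np.exp(-params[1]))   # enforce 0 < theta < 1
    p = 1.0 - np.exp(-lam * y)
    # log f(y) = log(theta) + log(lam) - lam*y - 2*log(1 - (1-theta)*p)
    logf = np.log(theta) + np.log(lam) - lam * y - 2.0 * np.log(1.0 - (1.0 - theta) * p)
    return -logf.sum()

res = minimize(neg_loglik, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
lam_hat = np.exp(res.x[0])
theta_hat = 1.0 / (1.0 + np.exp(-res.x[1]))
print(f"lambda = {lam_hat:.2f} (true {lam_true}), theta = {theta_hat:.2f} (true {theta_true})")
```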

Relevance: 20.00%

Publisher:

Abstract:

Purpose - The purpose of this paper is to develop a novel unstructured simulation approach for injection molding processes described by the Hele-Shaw model. Design/methodology/approach - The scheme involves dual dynamic meshes with active and inactive cells determined from an initial background pointset. The quasi-static pressure solution in each timestep for this evolving unstructured mesh system is approximated using a control volume finite element method formulation coupled to a corresponding modified volume of fluid method. The flow is considered to be isothermal and non-Newtonian. Findings - Supporting numerical tests and performance studies for polystyrene described by Carreau, Cross, Ellis and Power-law fluid models are conducted. Results for the present method are shown to be comparable to those from other methods for both Newtonian fluid and polystyrene fluid injected in different mold geometries. Research limitations/implications - With respect to the methodology, the background pointset infers a mesh that is dynamically reconstructed here, and there are a number of efficiency issues and improvements that would be relevant to industrial applications. For instance, one can use the pointset to construct special bases and invoke a so-called "meshless" scheme using the basis. This would require some interesting strategies to deal with the dynamic point enrichment of the moving front that could benefit from the present front treatment strategy. There are also issues related to mass conservation and fill-time errors that might be addressed by introducing suitable projections. The general question of "rate of convergence" of these schemes requires analysis. Numerical results here suggest first-order accuracy and are consistent with the approximations made, but theoretical results are not available yet for these methods. Originality/value - This novel unstructured simulation approach involves dual meshes with active and inactive cells determined from an initial background pointset: local active dual patches are constructed "on-the-fly" for each "active point" to form a dynamic virtual mesh of active elements that evolves with the moving interface.

Relevance: 20.00%

Publisher:

Abstract:

A new method for the characterization and analysis of aggregate particles in asphaltic mixtures is reported. By relying on multiscale representation of the particles, curvature estimation, and discriminant analysis for optimal separation of the categories of mixtures, a particularly effective and comprehensive methodology is obtained. The potential of the methodology is illustrated with respect to three important types of particles used in asphaltic mixtures, namely basalt, gabbro, and gravel. The obtained results show that gravel particles are markedly distinct from the other two types, with the gabbro category exhibiting intermediate geometrical properties. The importance of each considered measurement in the discrimination between the three categories of particles was also quantified in terms of the adopted discriminant analysis.
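A minimal sketch of the final discrimination step only: linear discriminant analysis applied to per-particle shape descriptors such as multiscale curvature statistics. The descriptor names, class means and sample values are synthetic placeholders, not measurements of real basalt, gabbro or gravel particles, and the multiscale curvature estimation itself is not reproduced here.

```python
# Linear discriminant analysis on hypothetical per-particle shape descriptors
# for three aggregate categories.  All data are synthetic placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
classes = ["basalt", "gabbro", "gravel"]
# Three hypothetical descriptors per particle (e.g., curvature energy at two scales, elongation).
means = {"basalt": [0.8, 0.6, 1.2], "gabbro": [0.6, 0.5, 1.1], "gravel": [0.2, 0.2, 1.5]}
X = np.vstack([rng.normal(means[c], 0.1, size=(50, 3)) for c in classes])
y = np.repeat(classes, 50)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
print("training accuracy:", lda.score(X, y))
# Coefficient magnitudes give a rough view of how much each descriptor
# contributes to separating the categories, mirroring the paper's quantification.
print("coefficients per class:\n", lda.coef_.round(2))
```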

Relevance: 20.00%

Publisher:

Abstract:

Alzheimer's disease is an ultimately fatal neurodegenerative disease, and BACE-1 has become an attractive validated target for its therapy, with more than a hundred crystal structures deposited in the PDB. In the present study, we present a new methodology that integrates ligand-based methods with structural information derived from the receptor. 128 BACE-1 inhibitors recently disclosed by GlaxoSmithKline R&D were selected specifically because the crystal structures of 9 of these compounds complexed with BACE-1, as well as of five closely related analogs, have been made available. A new fragment-guided approach was designed to incorporate this wealth of structural information into a CoMFA study, and the methodology was systematically compared to other popular approaches, such as docking, for generating a molecular alignment. The influence of the partial charge calculation method was also analyzed. Several consistent and predictive models are reported, including one with r² = 0.88, q² = 0.69 and predictive r² = 0.72. The models obtained with the new methodology performed consistently better than those obtained by other methodologies, particularly in terms of external predictive power. Visual analyses of the contour maps in the context of the enzyme drew attention to a number of possible opportunities for the development of analogs with improved potency. These results suggest that 3D-QSAR studies may benefit from the additional structural information added by the presented methodology.
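The regression engine behind a CoMFA-type 3D-QSAR model is partial least squares; the sketch below shows PLS with a leave-one-out q² on a random stand-in descriptor matrix instead of actual probe-interaction fields or the GlaxoSmithKline compound set, and it does not reproduce the fragment-guided alignment that is the paper's contribution. All names and numbers are illustrative assumptions.

```python
# PLS regression with leave-one-out q^2, as used to validate CoMFA-type models.
# Descriptors and activities below are random placeholders, not real field data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(3)
n_compounds, n_grid = 40, 500
X = rng.normal(size=(n_compounds, n_grid))                   # stand-in steric/electrostatic fields
true_w = rng.normal(size=n_grid) * (rng.random(n_grid) < 0.02)
y = X @ true_w + rng.normal(scale=0.3, size=n_compounds)     # stand-in activities (e.g. pIC50)

pls = PLSRegression(n_components=3)
pls.fit(X, y)
r2 = pls.score(X, y)                                         # fitted r^2
y_loo = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()
q2 = 1.0 - np.sum((y - y_loo) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"r2 = {r2:.2f}, q2 = {q2:.2f}")
```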

Relevance: 20.00%

Publisher:

Abstract:

In this paper, we present a Bayesian approach for estimation in the skew-normal calibration model, as well as the conditional posterior distributions which are useful for implementing the Gibbs sampler. Data transformation is thus avoided by using the methodology proposed. Model fitting is implemented by proposing the asymmetric deviance information criterion, ADIC, a modification of the ordinary DIC. We also report an application of the model studied by using a real data set, related to the relationship between the resistance and the elasticity of a sample of concrete beams. Copyright (C) 2008 John Wiley & Sons, Ltd.

Relevance: 20.00%

Publisher:

Abstract:

The generalized Birnbaum-Saunders (GBS) distribution is a new class of positively skewed models with lighter and heavier tails than the traditional Birnbaum-Saunders (BS) distribution, which is largely applied to study lifetimes. However, the theoretical argument and the interesting properties of the GBS model have made its application possible beyond the lifetime analysis. The aim of this paper is to present the GBS distribution as a useful model for describing pollution data and deriving its positive and negative moments. Based on these moments, we develop estimation and goodness-of-fit methods. Also, some properties of the proposed estimators useful for developing asymptotic inference are presented. Finally, an application with real data from Environmental Sciences is given to illustrate the methodology developed. This example shows that the empirical fit of the GBS distribution to the data is very good. Thus, the GBS model is appropriate for describing air pollutant concentration data, which produces better results than the lognormal model when the administrative target is determined for abating air pollution. Copyright (c) 2007 John Wiley & Sons, Ltd.
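Since the abstract does not reproduce the GBS density, the sketch below uses the classical Birnbaum-Saunders distribution (available in SciPy as `fatiguelife`) as a stand-in, fitted to simulated concentration data and checked with a simple Kolmogorov-Smirnov test in place of the moment-based methods developed in the paper; the comparison with a lognormal fit mirrors the comparison mentioned above. All data and parameter values are hypothetical.

```python
# Fit a classical Birnbaum-Saunders (fatiguelife) model to simulated concentration
# data and compare its goodness of fit with a lognormal model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
conc = stats.fatiguelife.rvs(c=0.8, scale=30.0, size=365, random_state=rng)  # stand-in daily concentrations

c_hat, loc_hat, scale_hat = stats.fatiguelife.fit(conc, floc=0)   # location fixed at zero
ks_bs = stats.kstest(conc, "fatiguelife", args=(c_hat, loc_hat, scale_hat))
print(f"BS-type fit: shape = {c_hat:.2f}, scale = {scale_hat:.1f}, KS p-value = {ks_bs.pvalue:.2f}")

s_hat, loc_ln, scale_ln = stats.lognorm.fit(conc, floc=0)
ks_ln = stats.kstest(conc, "lognorm", args=(s_hat, loc_ln, scale_ln))
print(f"lognormal fit: KS p-value = {ks_ln.pvalue:.2f}")
```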

Relevance: 20.00%

Publisher:

Abstract:

The immersed boundary method is a versatile tool for the investigation of flow-structure interaction. In a large number of applications, the immersed boundaries or structures are very stiff, and strong tangential forces on these interfaces induce a well-known, severe time-step restriction for explicit discretizations. This excessive stability constraint can be removed with fully implicit or suitable semi-implicit schemes, but at a seemingly prohibitive computational cost. While economical alternatives have been proposed recently for some special cases, there is a practical need for a computationally efficient approach that can be applied more broadly. In this context, we revisit a robust semi-implicit discretization introduced by Peskin in the late 1970s which has received renewed attention recently. This discretization, in which the spreading and interpolation operators are lagged, leads to a linear system of equations for the interface configuration at the future time, when the interfacial force is linear. However, this linear system is large and dense and thus it is challenging to streamline its solution. Moreover, while the same linear system or one of similar structure could potentially be used in Newton-type iterations, nonlinear and highly stiff immersed structures pose additional challenges to iterative methods. In this work, we address these problems and propose cost-effective computational strategies for solving Peskin's lagged-operators type of discretization. We do this by first constructing a sufficiently accurate approximation to the system's matrix, and we obtain a rigorous estimate for this approximation. This matrix is expeditiously computed by using a combination of pre-calculated values and interpolation. The availability of a matrix allows for more efficient matrix-vector products and facilitates the design of effective iterative schemes. We propose efficient iterative approaches to deal with both linear and nonlinear interfacial forces and simple or complex immersed structures with tethered or untethered points. One of these iterative approaches employs a splitting in which we first solve a linear problem for the interfacial force and then use a nonlinear iteration to find the interface configuration corresponding to this force. We demonstrate that the proposed approach is several orders of magnitude more efficient than the standard explicit method. In addition to considering the standard elliptical drop test case, we show both the robustness and efficacy of the proposed methodology with a 2D model of a heart valve. (C) 2009 Elsevier Inc. All rights reserved.

Relevance: 20.00%

Publisher:

Abstract:

We present an efficient numerical methodology for the 3D computation of incompressible multi-phase flows described by conservative phase-field models. We focus here on the case of density-matched fluids with different viscosity (Model H). The numerical method employs adaptive mesh refinement (AMR) in concert with an efficient semi-implicit time discretization strategy and a linear, multi-level multigrid to relax high-order stability constraints and to capture the flow's disparate scales at optimal cost. Only five linear solvers are needed per time-step. Moreover, all the adaptive methodology is constructed from scratch to allow a systematic investigation of the key aspects of AMR in a conservative, phase-field setting. We validate the method and demonstrate its capabilities and efficacy with important examples of drop deformation, Kelvin-Helmholtz instability, and flow-induced drop coalescence. (C) 2010 Elsevier Inc. All rights reserved.

Relevance: 20.00%

Publisher:

Abstract:

Allyl 1-naphthyl ethers are useful compounds for different purposes, but reported methods to synthesize them require long reaction times. In this work, we obtained allyl 1-naphthyl ether in good yield using an ultrasonic-assisted methodology in a 1-h reaction. A central composite design was used to obtain a statistical model and a response surface (p < 0.05; R² = 0.970; R²adj = 0.949; R²pred = 0.818) that can predict the optimal conditions to maximize the yield, which were validated experimentally. (C) 2010 Elsevier B.V. All rights reserved.
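A generic sketch of the response-surface step: fitting a full quadratic model to yields from a two-factor central composite design and locating the stationary point of the fitted surface. The coded factors and yield values are hypothetical placeholders, not the experimental conditions or results of the study.

```python
# Quadratic response-surface fit on a two-factor central composite design (coded units)
# and location of the stationary point.  All numbers are hypothetical placeholders.
import numpy as np

a = np.sqrt(2)
# 2^2 factorial points, axial points at +/- sqrt(2), and three center replicates.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-a, 0], [a, 0], [0, -a], [0, a],
              [0, 0], [0, 0], [0, 0]])
y = np.array([62, 70, 66, 78, 60, 75, 64, 72, 80, 81, 79], dtype=float)  # hypothetical yields (%)

def quad_terms(x):
    x1, x2 = x
    return [1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2]

D = np.array([quad_terms(row) for row in X])
b, *_ = np.linalg.lstsq(D, y, rcond=None)
b0, b1, b2, b11, b22, b12 = b

# Stationary point: set the gradient of the fitted quadratic to zero and solve the 2x2 system.
H = np.array([[2 * b11, b12], [b12, 2 * b22]])
x_opt = np.linalg.solve(H, -np.array([b1, b2]))
print("coefficients:", b.round(2))
print("stationary point (coded units):", x_opt.round(2),
      "predicted yield:", round(float(np.dot(quad_terms(x_opt), b)), 1))
```

In practice the stationary point would be checked for being a maximum (eigenvalues of the quadratic part) and then decoded back to real factor levels before experimental validation.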

Relevance: 20.00%

Publisher:

Abstract:

Optimization of the photo-Fenton degradation of copper phthalocyanine blue was achieved by response surface methodology (RSM), constructed with the aid of a sequential injection analysis (SIA) system coupled to a homemade photo-reactor. The highest degradation percentage was obtained under the following conditions: [H2O2]/[phthalocyanine] = 7, [H2O2]/[FeSO4] = 10, pH = 2.5, and a stopped-flow time in the photo-reactor of 30 s. The SIA system was designed to prepare a monosegment containing the reagents and sample, to pump it toward the photo-reactor for the specified time, and to send the products to a flow-through spectrophotometer for monitoring the color reduction of the dye. Changes in parameters such as reagent molar ratios, residence time and pH were made through modifications in the software commanding the SIA system, without the need for physical reconfiguration of reagents around the selection valve. The proposed procedure and system fed the statistical program with degradation data for fast construction of response surface plots. After optimization, 97% of the dye was degraded. (C) 2009 Elsevier B.V. All rights reserved.