946 results for "Numerical error"

Relevance: 20.00%

Publisher:

Abstract:

This paper presents a method for estimating the posterior probability density of the cointegrating rank of a multivariate error correction model. A second contribution is the careful elicitation of the prior for the cointegrating vectors derived from a prior on the cointegrating space. This prior obtains naturally from treating the cointegrating space as the parameter of interest in inference and overcomes problems previously encountered in Bayesian cointegration analysis. Using this new prior and Laplace approximation, an estimator for the posterior probability of the rank is given. The approach performs well compared with information criteria in Monte Carlo experiments. (C) 2003 Elsevier B.V. All rights reserved.

Image reconstruction using the EIT (Electrical Impedance Tomography) technique is a nonlinear and ill-posed inverse problem that demands a powerful direct or iterative method. A typical approach is to minimize an error functional using an iterative method; in this case, an initial solution close enough to the global minimum is mandatory to ensure convergence to the correct minimum in an appropriate time interval. The aim of this paper is to present a new, simple, and low-cost technique (quadrant-searching) to reduce the search space and consequently obtain an initial solution of the inverse problem of EIT. This technique calculates the error functional for four different contrast distributions, placing a large prospective inclusion in each of the four quadrants of the domain. Comparing the four values of the error functional, it is possible to draw conclusions about the internal electric contrast. For this purpose, we first performed tests to assess the accuracy of the BEM (Boundary Element Method) when applied to the direct problem of EIT and to examine the behavior of the error functional surface in the search space. Finally, numerical tests were performed to verify the new technique.
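The quadrant-searching step can be sketched in a few lines. This is a minimal illustration, not the paper's BEM implementation: the forward model below is a stand-in (row/column means of a conductivity grid) for the real EIT direct problem, and the grid size and contrast values are assumptions.

```python
import numpy as np

def forward(sigma):
    # Stand-in forward model: a real EIT solver would solve the direct
    # problem (e.g. by BEM) and return boundary voltages.  Here we use
    # row/column means of the conductivity grid as mock "measurements"
    # so the example is self-contained.
    return np.concatenate([sigma.mean(axis=0), sigma.mean(axis=1)])

def quadrant_distribution(quadrant, n=16, background=1.0, contrast=2.0):
    # Candidate distribution: one large prospective inclusion filling
    # the given quadrant (0..3) of an n-by-n domain.
    sigma = np.full((n, n), background)
    half = n // 2
    r0 = 0 if quadrant in (0, 1) else half
    c0 = 0 if quadrant in (0, 2) else half
    sigma[r0:r0 + half, c0:c0 + half] = contrast
    return sigma

def quadrant_search(measured):
    # Evaluate the error functional for the four candidate distributions
    # and return the quadrant with the smallest misfit.
    errors = []
    for q in range(4):
        predicted = forward(quadrant_distribution(q))
        errors.append(np.sum((predicted - measured) ** 2))
    return int(np.argmin(errors)), errors

# Synthetic "measurement": the true inclusion sits in quadrant 3.
measured = forward(quadrant_distribution(3))
best, errors = quadrant_search(measured)
print(best)  # → 3
```

The winning quadrant (or a combination of the four misfit values) then seeds the iterative minimization with an initial guess already on the right side of the domain.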

The solidification of intruded magma in porous rocks can result in the following two consequences: (1) the heat release due to the solidification of the interface between the rock and intruded magma and (2) the mass release of the volatile fluids in the region where the intruded magma is solidified into the rock. Traditionally, the intruded magma solidification problem is treated as a moving interface (i.e. the solidification interface between the rock and intruded magma) problem to consider these consequences in conventional numerical methods. This paper presents an alternative new approach to simulate thermal and chemical consequences/effects of magma intrusion in geological systems, which are composed of porous rocks. In the proposed new approach and algorithm, the original magma solidification problem with a moving boundary between the rock and intruded magma is transformed into a new problem without the moving boundary but with the proposed mass source and physically equivalent heat source. The major advantage in using the proposed equivalent algorithm is that a fixed mesh of finite elements with a variable integration time-step can be employed to simulate the consequences and effects of the intruded magma solidification using the conventional finite element method. The correctness and usefulness of the proposed equivalent algorithm have been demonstrated by a benchmark magma solidification problem. Copyright (c) 2005 John Wiley & Sons, Ltd.
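The fixed-mesh idea can be illustrated with the standard apparent-heat-capacity treatment of latent heat, which likewise removes the moving solidification boundary. This is a related textbook technique, not the paper's equivalent source algorithm, and all material values below are illustrative assumptions.

```python
import numpy as np

# 1-D sketch: instead of tracking the rock/magma solidification front,
# the latent heat is folded into an effective heat capacity over a small
# freezing interval, so an ordinary explicit finite-difference scheme on
# a fixed grid can be used.  All property values are illustrative.

n, L = 101, 1.0                      # grid points, domain length [m]
dx = L / (n - 1)
k, rho, c = 2.0, 2700.0, 1000.0      # conductivity, density, heat capacity
Lh = 4.0e5                           # latent heat [J/kg]
T_sol, T_liq = 950.0, 1050.0         # freezing interval [deg C]

T = np.full(n, 20.0)                 # cool host rock
T[:n // 4] = 1200.0                  # intruded magma, above the liquidus

def c_eff(T):
    # Effective heat capacity: latent heat spread over the mushy interval.
    ce = np.full_like(T, c)
    mushy = (T > T_sol) & (T < T_liq)
    ce[mushy] += Lh / (T_liq - T_sol)
    return ce

alpha_max = k / (rho * c)
dt = 0.4 * dx * dx / alpha_max       # stable explicit time step
for _ in range(2000):
    lap = np.zeros_like(T)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / (dx * dx)
    T += dt * k * lap / (rho * c_eff(T))

print(T.max(), T.min())
```

The heat released on solidification enters through `c_eff` exactly where the mush forms, with no interface tracking; a mass source for the released volatiles could be added to the same fixed mesh in the same spirit.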

Numerical methods are used to simulate the double-diffusion driven convective pore-fluid flow and rock alteration in three-dimensional fluid-saturated geological fault zones. The double diffusion is caused by a combination of both the positive upward temperature gradient and the positive downward salinity concentration gradient within a three-dimensional fluid-saturated geological fault zone, which is assumed to be more permeable than its surrounding rocks. In order to ensure the physical meaningfulness of the obtained numerical solutions, the numerical method used in this study is validated by a benchmark problem, for which the analytical solution to the critical Rayleigh number of the system is available. The theoretical value of the critical Rayleigh number of a three-dimensional fluid-saturated geological fault zone system can be used to judge whether or not the double-diffusion driven convective pore-fluid flow can take place within the system. After the possibility of triggering the double-diffusion driven convective pore-fluid flow is theoretically validated for the numerical model of a three-dimensional fluid-saturated geological fault zone system, the corresponding numerical solutions for the convective flow and temperature are directly coupled with a geochemical system. Through the numerical simulation of the coupled system between the convective fluid flow, heat transfer, mass transport and chemical reactions, we have investigated the effect of the double-diffusion driven convective pore-fluid flow on the rock alteration, which is the direct consequence of mineral redistribution due to its dissolution, transportation and precipitation, within the three-dimensional fluid-saturated geological fault zone system. (c) 2005 Elsevier B.V. All rights reserved.
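As a rough illustration of the Rayleigh-number criterion, the sketch below evaluates thermal and solutal Rayleigh numbers for a homogeneous porous layer and compares their sum with the classical Horton-Rogers-Lapwood critical value 4π²; the paper's fault-zone geometry has its own critical value, and every property value here is an assumption.

```python
import math

# Back-of-envelope convection check for a horizontal porous layer of
# thickness H heated from below: convection sets in when the Rayleigh
# number exceeds the classical critical value 4*pi^2.  With double
# diffusion, a simple linear criterion sums the thermal and solutal
# contributions.  All values below are illustrative assumptions.

g = 9.81          # gravity [m/s^2]
H = 1000.0        # fault zone thickness [m]
kperm = 1e-12     # permeability [m^2] (fault zone, more permeable than host)
mu = 1e-3         # fluid viscosity [Pa s]
rho = 1000.0      # fluid density [kg/m^3]
cp = 4185.0       # fluid heat capacity [J/(kg K)]
lam = 3.0         # effective thermal conductivity [W/(m K)]
beta_T = 2e-4     # thermal expansion coefficient [1/K]
beta_C = 7e-4     # solutal expansion coefficient [1/(kg/kg)]
dT = 30.0         # temperature difference across the layer [K]
dC = 0.01         # salinity difference across the layer [kg/kg]

kappa = lam / (rho * cp)                            # thermal diffusivity
Ra_T = rho * g * beta_T * dT * kperm * H / (mu * kappa)
Ra_C = rho * g * beta_C * dC * kperm * H / (mu * kappa)

Ra_crit = 4.0 * math.pi ** 2
convects = (Ra_T + Ra_C) > Ra_crit
print(Ra_T, Ra_C, convects)
```

A check of this kind is what makes the numerical solutions physically meaningful: if the combined Rayleigh number is subcritical, any "convection" in the computed flow field is a numerical artifact.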

A comprehensive probabilistic model for simulating dendrite morphology and investigating dendritic growth kinetics during solidification has been developed, based on a modified Cellular Automaton (mCA) for microscopic modeling of nucleation, growth of crystals, and solute diffusion. The mCA model numerically calculates solute redistribution in both the solid and liquid phases, the curvature of dendrite tips, and the growth anisotropy. The modeling takes account of thermal, curvature, and solute diffusion effects, so it can simulate microstructure formation on the scale of the dendrite tip length. The model was then applied to simulate dendritic solidification of an Al-7%Si alloy. Both directional and equiaxed dendritic growth were simulated to investigate the effects of growth anisotropy and cooling rate on dendrite morphology. Furthermore, the competitive growth and selection of dendritic crystals have also been investigated.

Nanocomposite materials have received considerable attention in recent years due to their novel properties. Grain boundaries are considered to play an important role in nanostructured materials. This work focuses on the finite element analysis of the effect of grain boundaries on the overall mechanical properties of aluminium/alumina composites. A grain boundary is incorporated into the commonly used unit cell model to investigate its effect on material properties. By combining the unit cell model with an indentation model, coupled with experimental indentation measurements, the "effective" plastic property of the grain boundary is estimated. In addition, the strengthening mechanism is also discussed based on the Estrin-Mecking model.

Field studies have shown that the elevation of the beach groundwater table varies with the tide, and such variations significantly affect beach erosion and accretion. In this paper, we present a BEM (Boundary Element Method) model for simulating the tidal fluctuation of the beach groundwater table. The model solves the two-dimensional flow equation subject to free and moving boundary conditions, including the seepage dynamics at the beach face. The simulated seepage faces were found to agree with the predictions of a simple model (Turner, 1993). The advantage of the present model is, however, that it can be used with little modification to simulate more complicated cases; for example, surface recharge from rainfall and drainage in the aquifer may be included (the latter is related to the beach dewatering technique). The model also reproduced well the field data of Nielsen (1990). In particular, it replicated three distinct features of local water table fluctuations: a steep rising phase versus a flat falling phase, amplitude attenuation, and phase lag.
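Two of those features, amplitude attenuation and phase lag, already appear in the classical small-amplitude solution of the linearized Boussinesq equation, which the sketch below evaluates; the rising/falling asymmetry needs a moving-boundary model like the BEM one described above. Parameter values are illustrative assumptions.

```python
import math

# Classical linearized solution for a tidally forced water table:
#   h(x, t) = D + A * exp(-k x) * cos(omega t - k x),
#   k = sqrt(n * omega / (2 K D)),
# where x is distance landward of the beach face.  The exp(-k x) factor
# gives amplitude attenuation and the k x term gives the phase lag.

n = 0.3                      # drainable porosity [-]
K = 1e-3                     # hydraulic conductivity [m/s]
D = 5.0                      # mean saturated thickness [m]
A = 0.5                      # tidal amplitude at the beach face [m]
T = 12.42 * 3600.0           # semi-diurnal tidal period [s]
omega = 2.0 * math.pi / T
k = math.sqrt(n * omega / (2.0 * K * D))

def head(x, t):
    # Water-table elevation at distance x landward of the beach face.
    return D + A * math.exp(-k * x) * math.cos(omega * t - k * x)

amplitude_20m = A * math.exp(-k * 20.0)       # attenuated amplitude 20 m inland
lag_20m_hours = (k * 20.0 / omega) / 3600.0   # phase lag in hours
print(amplitude_20m, lag_20m_hours)
```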

High-pressure homogenization is a key unit operation used to disrupt cells containing intracellular bioproducts. Modeling and optimization of this unit are restrained by a lack of information on the flow conditions within a homogenizer valve. A numerical investigation of the impinging radial jet within a homogenizer valve is presented. Results for a laminar and a turbulent (k-epsilon turbulence model) jet are obtained using the PHOENICS finite-volume code. Experimental measurement of the stagnation-region width and correlation of the cell disruption efficiency with jet stagnation pressure both indicate that the impinging jet in the homogenizer system examined is likely to be laminar under normal operating conditions. Correlation of disruption data with laminar stagnation pressure provides a better description of experimental variability than existing correlations using total pressure drop or the grouping 1/(Y^2 h^2).

Analysis of a major multi-site epidemiologic study of heart disease has required estimation of the pairwise correlation of several measurements across sub-populations. Because the measurements from each sub-population were subject to sampling variability, the Pearson product moment estimator of these correlations produces biased estimates. This paper proposes a model that takes into account within and between sub-population variation, provides algorithms for obtaining maximum likelihood estimates of these correlations and discusses several approaches for obtaining interval estimates. (C) 1997 by John Wiley & Sons, Ltd.
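The attenuation bias can be demonstrated numerically. The sketch below uses a simple method-of-moments reliability correction rather than the paper's maximum likelihood estimator, and the true correlation, within-sub-population variance, and sample size are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# True sub-population means (mu_x, mu_y) are correlated, but each is
# observed only through a finite-sample average, which adds within-
# sub-population noise and attenuates the Pearson estimate.  A simple
# method-of-moments correction divides by the square root of the
# reliability ratios (the paper instead fits a between/within
# variance-components model by maximum likelihood).

m = 2000                  # number of sub-populations
rho = 0.8                 # true correlation of the sub-population means
cov = [[1.0, rho], [rho, 1.0]]
mu = rng.multivariate_normal([0.0, 0.0], cov, size=m)   # true means

tau2 = 0.5                # within-sub-population sampling variance
x = mu[:, 0] + rng.normal(0.0, np.sqrt(tau2), m)        # observed averages
y = mu[:, 1] + rng.normal(0.0, np.sqrt(tau2), m)

r_naive = np.corrcoef(x, y)[0, 1]

# Reliability = between-variance / total-variance for each coordinate.
rel_x = 1.0 / (1.0 + tau2)
rel_y = 1.0 / (1.0 + tau2)
r_corrected = r_naive / np.sqrt(rel_x * rel_y)

print(r_naive, r_corrected)
```

The naive estimate lands well below the true 0.8 while the reliability-corrected one recovers it, which is the bias the proposed model addresses.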

Background: Biochemical analysis of fluid is the primary laboratory approach in pleural effusion diagnosis. Standardization of the steps between collection and laboratory analysis is fundamental to maintaining the quality of the results. We evaluated the influence of temperature and storage time on sample stability. Methods: Pleural fluid from 30 patients was submitted to analyses of proteins, albumin, lactic dehydrogenase (LDH), cholesterol, triglycerides, and glucose. Aliquots were stored at 21°C, 4°C, and -20°C, and concentrations were determined after 1, 2, 3, 4, 7, and 14 days. LDH isoenzymes were quantified in 7 random samples. Results: Due to the instability of isoenzymes 4 and 5, a decrease in LDH was observed within the first 24 h in samples maintained at -20°C and after 2 days in samples maintained at 4°C. Aside from glucose, all parameters were stable up to at least day 4 when stored at room temperature or at 4°C. Conclusions: Temperature and storage time are potential sources of preanalytical error in pleural fluid analyses, mainly considering the instability of glucose and LDH. The ideal procedure is to perform all tests immediately after collection. However, most of the tests can be done on refrigerated samples, except for the LDH analysis. (C) 2010 Elsevier B.V. All rights reserved.

Parenteral anticoagulation is a cornerstone in the management of venous and arterial thrombosis. Unfractionated heparin has a wide dose/response variability, requiring frequent and troublesome laboratory follow-up. Because of these factors, low-molecular-weight heparin use has been increasing. Inadequate dosage has been pointed out as a potential problem because the use of subjectively estimated weight instead of actual measured weight is common practice in the emergency department (ED). To evaluate the impact of inadequate weight estimation on enoxaparin dosage, we investigated the adequacy of anticoagulation of patients in a tertiary ED where subjective weight estimation is common practice. We obtained the estimated, informed, and measured weights of 28 patients in need of parenteral anticoagulation. Basal and steady-state (after the second subcutaneous shot of enoxaparin) anti-Xa activity was obtained as a measure of adequate anticoagulation. The patients were divided into 2 groups according to anticoagulation adequacy. Of the 28 patients enrolled, 75% (group 1, n = 21) received at least 0.9 mg/kg per dose BID and 25% (group 2, n = 7) received less than 0.9 mg/kg per dose BID of enoxaparin. Only 4 (14.3%) of all patients had anti-Xa activity below the inferior limit of the therapeutic range (<0.5 UI/mL), all of them from group 2. In conclusion, when weight estimation was used to determine the enoxaparin dosage, 25% of the patients were inadequately anticoagulated (anti-Xa activity <0.5 UI/mL) during the initial, crucial phase of treatment. (C) 2011 Elsevier Inc. All rights reserved.
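The grouping rule used in the study (at least 0.9 mg/kg per dose versus less) is easy to express in code; the helper below and its example weights are hypothetical, for illustration only.

```python
def dose_group(dose_mg, measured_weight_kg, threshold=0.9):
    # Classify an enoxaparin prescription the way the study groups
    # patients: group 1 if the dose is at least 0.9 mg/kg of the
    # *measured* weight per shot, group 2 otherwise.
    mg_per_kg = dose_mg / measured_weight_kg
    return 1 if mg_per_kg >= threshold else 2

# A dose prescribed from an underestimated weight (hypothetical numbers):
estimated_kg, measured_kg = 70.0, 85.0
dose = 1.0 * estimated_kg              # 1 mg/kg of the *estimated* weight
print(dose_group(dose, measured_kg))   # 70/85 ≈ 0.82 mg/kg → group 2
```

The example shows the mechanism behind the study's finding: a 1 mg/kg prescription computed from an underestimated weight silently falls below the 0.9 mg/kg adequacy threshold.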