960 results for Improved sequential algebraic algorithm
Abstract:
An improved flow-based procedure is proposed for turbidimetric sulphate determination in waters. The flow system was designed with solenoid micro-pumps in order to improve mixing conditions and minimize reagent consumption as well as waste generation. Stable baselines were observed in view of the pulsed flow characteristic of systems designed with solenoid micro-pumps, thus making the use of washing solutions unnecessary. The nucleation process was improved by stopping the flow prior to the measurement, thus avoiding the need for sulphate addition. When a 1-cm optical path flow cell was employed, linear response was achieved within 20-200 mg L⁻¹, described by the equation S = -0.0767 + 0.00438C (mg L⁻¹), r = 0.999. The detection limit was estimated as 3 mg L⁻¹ at the 99.7% confidence level and the coefficient of variation was 2.4% (n = 20). The sampling rate was estimated as 33 determinations per hour. A long-pathlength (100-cm) flow cell based on a liquid core waveguide was exploited to increase sensitivity in turbidimetry. Baseline drifts were avoided by a periodical washing step with EDTA in alkaline medium. Linear response was observed within 7-16 mg L⁻¹, described by the equation S = -0.865 + 0.132C (mg L⁻¹), r = 0.999. The detection limit was estimated as 150 µg L⁻¹ at the 99.7% confidence level and the coefficient of variation was 3.0% (n = 20). The sampling rate was estimated as 25 determinations per hour. The results obtained for freshwater and rain water samples were in agreement with those achieved by batch turbidimetry at the 95% confidence level. © 2008 Elsevier B.V. All rights reserved.
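As a minimal illustration of how the reported calibration curves would be used in practice, the sketch below inverts the two equations of the form S = a + bC to convert a turbidimetric signal into a sulphate concentration, flagging values outside the stated linear ranges. The coefficients come from the abstract; the function and dictionary names are hypothetical.

```python
# Minimal sketch (not from the paper): converting a turbidimetric signal S
# into sulphate concentration C by inverting the reported calibration
# equations S = a + b*C. Coefficients are those quoted in the abstract;
# the helper names are hypothetical.

CALIBRATIONS = {
    # cell: (intercept a, slope b, linear range in mg L^-1)
    "1-cm": (-0.0767, 0.00438, (20.0, 200.0)),
    "100-cm": (-0.865, 0.132, (7.0, 16.0)),
}

def sulphate_concentration(signal: float, cell: str = "1-cm") -> float:
    """Invert S = a + b*C to get C (mg L^-1); reject values outside the linear range."""
    a, b, (lo, hi) = CALIBRATIONS[cell]
    c = (signal - a) / b
    if not lo <= c <= hi:
        raise ValueError(f"C = {c:.1f} mg L^-1 lies outside the {lo}-{hi} linear range")
    return c

if __name__ == "__main__":
    # A mid-range signal for the 1-cm cell: S = -0.0767 + 0.00438*100 ≈ 0.361
    print(f"{sulphate_concentration(0.3613, '1-cm'):.1f} mg L^-1")
```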
Abstract:
A novel flow-based strategy for implementing simultaneous determinations of different chemical species reacting with the same reagent(s) at different rates is proposed and applied to the spectrophotometric catalytic determination of iron and vanadium in Fe-V alloys. The method relies on the influence of Fe(II) and V(IV) on the rate of iodide oxidation by Cr(VI) under acidic conditions; the Jones reducing agent is therefore needed. Three different plugs of the sample are sequentially inserted into an acidic KI reagent carrier stream, and a confluent Cr(VI) solution is added downstream. Overlap between the inserted plugs leads to a complex sample zone with several regions of maximal and minimal absorbance values. Measurements performed on these regions reveal the different degrees of reaction development and tend to be more precise. Data are treated by multivariate calibration involving the PLS algorithm. The proposed system is very simple and rugged. Two latent variables accounted for ca. 95% of the analytical information, and the results are in agreement with ICP-OES. © 2010 Elsevier B.V. All rights reserved.
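The multivariate calibration step could look like the following hedged sketch: a two-latent-variable PLS model relating absorbance values read at several regions of the sample zone to Fe and V concentrations. The data are synthetic, and scikit-learn's PLSRegression merely stands in for whatever implementation the authors used.

```python
# Illustrative sketch only: relating absorbances read at several regions of
# the sample zone to two analyte concentrations with a two-latent-variable
# PLS model, as in the abstract. All data below are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_regions = 30, 6                   # absorbance read at 6 zone regions
conc = rng.uniform(0.1, 1.0, (n_samples, 2))   # columns: [Fe, V], arbitrary units
# Hypothetical sensitivities of each region to each analyte, plus noise
K = rng.uniform(0.2, 1.5, (2, n_regions))
X = conc @ K + rng.normal(scale=0.01, size=(n_samples, n_regions))

pls = PLSRegression(n_components=2)            # two latent variables, as reported
pls.fit(X, conc)
print("R^2 on training data:", pls.score(X, conc))
```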
Abstract:
An improvement to a quality two-dimensional Delaunay mesh generation algorithm, combining the mesh refinement strategies of Ruppert and Shewchuk, is proposed in this research. The developed technique uses the diametral lens criterion, introduced by L. P. Chew, to eliminate extremely obtuse triangles along the mesh boundary. This method splits the boundary segments and obtains an initial pre-refinement, thus reducing the number of iterations necessary to generate a high-quality sequential triangulation. Moreover, it decreases the intensity of communication and synchronization between subdomains in parallel mesh refinement.
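For readers unfamiliar with Chew's criterion, a generic sketch of the encroachment test follows. In its angle form, a point lies inside the diametral lens of a segment when it subtends an angle of at least 120 degrees over the segment (versus 90 degrees for the classical diametral circle). This is an illustration under that assumption, not the authors' implementation.

```python
# Hedged sketch of the diametral lens encroachment test: point p encroaches
# upon boundary segment (a, b) when the angle a-p-b is at least 120 degrees.
# Generic illustration only.
import math

def encroaches_diametral_lens(a, b, p, angle_deg: float = 120.0) -> bool:
    """True if p subtends an angle >= angle_deg over segment ab."""
    ax, ay = a[0] - p[0], a[1] - p[1]
    bx, by = b[0] - p[0], b[1] - p[1]
    dot = ax * bx + ay * by
    norm = math.hypot(ax, ay) * math.hypot(bx, by)
    if norm == 0.0:                 # p coincides with an endpoint
        return True
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle >= angle_deg

def split_segment(a, b):
    """Midpoint split of an encroached boundary segment (the pre-refinement step)."""
    return ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)

print(encroaches_diametral_lens((0, 0), (2, 0), (1.0, 0.4)))  # True: wide angle
```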
Abstract:
A fully conserving algorithm is developed in this paper for the integration of the equations of motion in nonlinear rod dynamics. The starting point is a re-parameterization of the rotation field in terms of the so-called Rodrigues rotation vector, which results in an extremely simple update of the rotational variables. The weak form is constructed with a non-orthogonal projection corresponding to the application of the virtual power theorem. Together with an appropriate time-collocation, it ensures exact conservation of momentum and total energy in the absence of external forces. Appealing is the fact that nonlinear hyperelastic materials (and not only materials with quadratic potentials) are permitted without any prejudice on the conservation properties. Spatial discretization is performed via the finite element method and the performance of the scheme is assessed by means of several numerical simulations.
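A minimal sketch of the kind of rotation update involved is given below: the classical Rodrigues formula mapping a rotation vector to a rotation matrix. The paper's Rodrigues rotation vector is a specific re-parameterization; the code shows only the standard exponential map as a generic illustration.

```python
# Minimal sketch (not the paper's code): rotation vector -> rotation matrix
# via the classical Rodrigues formula, the map underlying updates of
# rotational variables in rod dynamics.
import numpy as np

def skew(v):
    """Skew-symmetric matrix such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def rotation_matrix(alpha):
    """Rodrigues formula: R = I + (sin t / t) A + ((1 - cos t) / t^2) A^2."""
    t = np.linalg.norm(alpha)
    A = skew(np.asarray(alpha, dtype=float))
    if t < 1e-12:                       # small-angle series to avoid 0/0
        return np.eye(3) + A + 0.5 * (A @ A)
    return np.eye(3) + (np.sin(t) / t) * A + ((1 - np.cos(t)) / t**2) * (A @ A)

R = rotation_matrix([0.0, 0.0, np.pi / 2])          # 90-degree turn about z
print(np.round(R @ np.array([1.0, 0.0, 0.0]), 6))   # -> [0, 1, 0]
```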
Abstract:
Electrical impedance tomography (EIT) captures images of internal features of a body. Electrodes are attached to the boundary of the body, low intensity alternating currents are applied, and the resulting electric potentials are measured. Then, based on the measurements, an estimation algorithm obtains the three-dimensional internal admittivity distribution that corresponds to the image. One of the main goals of medical EIT is to achieve high resolution and an accurate result at low computational cost. However, when the finite element method (FEM) is employed and the corresponding mesh is refined to increase resolution and accuracy, the computational cost increases substantially, especially in the estimation of absolute admittivity distributions. Therefore, we consider in this work a fast iterative solver for the forward problem, which was previously reported in the context of structural optimization. We propose several improvements to this solver to increase its performance in the EIT context. The solver is based on the recycling of approximate invariant subspaces, and it is applied to reduce the EIT computation time for a constant and high resolution finite element mesh. In addition, we consider a powerful preconditioner and provide a detailed pseudocode for the improved iterative solver. The numerical results show the effectiveness of our approach: the proposed algorithm is faster than the preconditioned conjugate gradient (CG) algorithm. The results also show that even on a standard PC without parallelization, a high mesh resolution (more than 150,000 degrees of freedom) can be used for image estimation at a relatively low computational cost. (C) 2010 Elsevier B.V. All rights reserved.
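For context, the baseline the authors compare against can be sketched as a textbook Jacobi-preconditioned conjugate gradient solver for a symmetric positive definite system, as below. The recycling of approximate invariant subspaces that constitutes the paper's contribution is not reproduced here.

```python
# Compact Jacobi-preconditioned conjugate gradient (PCG) solver: the textbook
# baseline, not the authors' improved recycling algorithm.
import numpy as np

def pcg(A, b, tol=1e-8, max_iter=1000):
    """Solve A x = b for SPD A with diagonal (Jacobi) preconditioning."""
    x = np.zeros_like(b)
    r = b - A @ x
    M_inv = 1.0 / np.diag(A)            # Jacobi preconditioner
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for k in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            return x, k + 1
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

A = np.array([[4.0, 1.0], [1.0, 3.0]])
x, iters = pcg(A, np.array([1.0, 2.0]))
print(x, iters)
```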
Distributed Estimation Over an Adaptive Incremental Network Based on the Affine Projection Algorithm
Abstract:
We study the problem of distributed estimation based on the affine projection algorithm (APA), which is developed from Newton's method for minimizing a cost function. The proposed solution is formulated to ameliorate the limited convergence properties of least-mean-square (LMS) type distributed adaptive filters with colored inputs. The analysis of transient and steady-state performances at each individual node within the network is developed by using a weighted spatial-temporal energy conservation relation and confirmed by computer simulations. The simulation results also verify that the proposed algorithm provides not only a faster convergence rate but also an improved steady-state performance as compared to an LMS-based scheme. In addition, the new approach attains an acceptable misadjustment performance with lower computational and memory cost, provided the number of regressor vectors and filter length parameters are appropriately chosen, as compared to a distributed recursive-least-squares (RLS) based method.
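The core APA recursion the paper builds on can be sketched as below: each update projects the weight estimate using the K most recent regressors. This is the standard single-node form; the incremental cooperation across the network is not reproduced.

```python
# Hedged sketch of a single affine projection algorithm (APA) update.
# Standard APA recursion only; the paper's incremental network is omitted.
import numpy as np

def apa_update(w, U, d, mu=0.5, delta=1e-4):
    """One APA step: U is (K, M) with the K most recent regressors,
    d is (K,) desired responses, w is the (M,) weight estimate."""
    e = d - U @ w                                 # a priori errors
    G = U @ U.T + delta * np.eye(U.shape[0])      # regularized Gram matrix
    return w + mu * U.T @ np.linalg.solve(G, e)

# Toy identification of w_true with colored inputs:
rng = np.random.default_rng(1)
M, K = 4, 3
w_true = np.array([1.0, -0.5, 0.25, 0.1])
w = np.zeros(M)
x = rng.normal(size=1000)
x = np.convolve(x, [1.0, 0.8, 0.4], mode="same")  # coloring filter
for n in range(M + K, len(x)):
    U = np.array([x[n - k - np.arange(M)] for k in range(K)])
    w = apa_update(w, U, U @ w_true, mu=0.5)
print(np.round(w, 3))   # should approach w_true
```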
Abstract:
This work discusses a 4D lung reconstruction method from unsynchronized MR sequential images. The lung, unlike the heart, has no muscles of its own, making it impossible to observe its real movements directly. Visualization of the lung in motion is a current topic of research in medicine. CT (Computerized Tomography) can obtain spatio-temporal images of the heart by synchronizing with electrocardiographic waves. The FOV of the heart is small compared to that of the lung. The lung's movement is not periodic and is susceptible to variations in the degree of respiration. Compared to CT, MR (Magnetic Resonance) imaging involves longer acquisition times, and it is not possible to obtain instantaneous 3D images of the lung. For each slice, only one temporal sequence of 2D images can be obtained. However, methods using MR are preferable because they do not involve radiation. In this paper, an animated B-Rep solid model of the lung is created from unsynchronized MR images. The 3D animation represents the lung's motion associated with one selected sequence of MR images. The proposed method can be divided into two parts. First, the lung's silhouettes moving in time are extracted by detecting the presence of a respiratory pattern in 2D spatio-temporal MR images. This approach enables us to determine the lung's silhouette for every frame, even on frames with obscure edges. The extracted sequences are unsynchronized sagittal and coronal silhouettes. Using our algorithm, it is possible to reconstruct a 3D lung starting from a silhouette of either type (coronal or sagittal) selected from any instant in time. A wire-frame model of the lung is created by composing coronal and sagittal planar silhouettes representing cross-sections. The silhouette composition is severely underconstrained: many wire-frame models can be created from the observed sequences of silhouettes in time. Finally, a B-Rep solid model is created using a meshing algorithm. Using the B-Rep solid model, the volumes in time for the right and left lungs were calculated. It was possible to recognize several characteristics of the real 3D right and left lungs in the shaded model. © 2007 Elsevier Ltd. All rights reserved.
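One concrete step mentioned above, computing lung volume from the B-Rep solid, can be illustrated generically: the volume enclosed by a closed, consistently oriented triangle mesh equals the sum of signed tetrahedron volumes formed with the origin. The sketch below is not the authors' code.

```python
# Generic sketch: volume of a closed triangle mesh via signed tetrahedra
# against the origin (divergence theorem). Not the authors' implementation.
import numpy as np

def mesh_volume(vertices, faces):
    """vertices: (N, 3) array; faces: (M, 3) index triples, outward-oriented."""
    v = np.asarray(vertices, dtype=float)
    total = 0.0
    for i, j, k in faces:
        # Signed volume of tetrahedron (origin, v_i, v_j, v_k)
        total += np.dot(v[i], np.cross(v[j], v[k])) / 6.0
    return abs(total)

# Unit cube as a sanity check (12 triangles, outward normals):
verts = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
faces = [(0, 1, 3), (0, 3, 2), (4, 6, 7), (4, 7, 5),   # x = 0 and x = 1
         (0, 4, 5), (0, 5, 1), (2, 3, 7), (2, 7, 6),   # y = 0 and y = 1
         (0, 2, 6), (0, 6, 4), (1, 5, 7), (1, 7, 3)]   # z = 0 and z = 1
print(mesh_volume(verts, faces))   # -> 1.0
```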
Abstract:
OBJECTIVE: To evaluate a diagnostic algorithm for pulmonary tuberculosis based on smear microscopy and objective response to a trial of antibiotics. SETTING: Adult medical wards, Hlabisa Hospital, South Africa, 1996-1997. METHODS: Adults with chronic chest symptoms and an abnormal chest X-ray had sputum examined for Ziehl-Neelsen stained acid-fast bacilli by light microscopy. Those with negative smears were treated with amoxycillin for 5 days and assessed. Those who had not improved were treated with erythromycin for 5 days and reassessed. Response was compared with mycobacterial culture. RESULTS: Of 280 suspects who completed the diagnostic pathway, 160 (57%) had a positive smear, 46 (17%) responded to amoxycillin, 34 (12%) responded to erythromycin and 40 (14%) were treated as smear-negative tuberculosis. The sensitivity (89%) and specificity (84%) of the full algorithm for culture-positive tuberculosis were high. However, 11 patients (positive predictive value [PPV] 95%) were incorrectly diagnosed with tuberculosis, and 24 cases of tuberculosis (negative predictive value [NPV] 70%) were not identified. NPV improved to 75% when anaemia was included as a predictor. Algorithm performance was independent of human immunodeficiency virus status. CONCLUSION: A sputum smear microscopy plus trial-of-antibiotics algorithm among a selected group of tuberculosis suspects may increase diagnostic accuracy in district hospitals in developing countries.
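The accuracy measures quoted (sensitivity, specificity, PPV, NPV) follow directly from a 2x2 confusion matrix, as the short sketch below shows. The counts used are illustrative placeholders, not the study's data.

```python
# Sketch of the accuracy measures quoted in the abstract, computed from a
# 2x2 confusion matrix. The counts are illustrative, not the study's data.

def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),   # diseased correctly identified
        "specificity": tn / (tn + fp),   # non-diseased correctly identified
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts for illustration only:
for name, value in diagnostic_metrics(tp=180, fp=11, fn=24, tn=65).items():
    print(f"{name}: {value:.0%}")
```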
Abstract:
Aims Trials of disease management programmes (DMP) in heart failure (HF) have shown controversial results regarding quality of life. We hypothesized that a DMP applied over the long term could produce different effects on each of the quality-of-life components. Methods and results We extended the prospective, randomized REMADHE Trial, which studied a DMP in HF patients. We analysed changes in Minnesota Living with Heart Failure Questionnaire components in 412 patients, 60.5% male, age 50.2 ± 11.4 years, left ventricular ejection fraction 34.7 ± 10.5%. During a mean follow-up of 3.6 ± 2.2 years, 6.3% of patients underwent heart transplantation and 31.8% died. Global quality-of-life scores improved in the DMP intervention group compared with controls, respectively: 57.5 ± 3.1 vs. 52.6 ± 4.3 at baseline, 32.7 ± 3.9 vs. 40.2 ± 6.3 at 6 months, 31.9 ± 4.3 vs. 41.5 ± 7.4 at 12 months, 26.8 ± 3.1 vs. 47.0 ± 5.3 at the final assessment; P<0.01. Similarly, the physical component (23.7 ± 1.4 vs. 21.1 ± 2.2 at baseline, 16.2 ± 2.9 vs. 18.0 ± 3.3 at 6 months, 17.3 ± 2.9 vs. 23.1 ± 5.7 at 12 months, 11.4 ± 1.6 vs. 19.9 ± 2.4 final; P<0.01), the emotional component (13.2 ± 1.0 vs. 12.1 ± 1.4 at baseline, 11.7 ± 2.7 vs. 12.3 ± 3.1 at 6 months, 12.4 ± 2.9 vs. 16.8 ± 5.9 at 12 months, 6.7 ± 1.0 vs. 10.6 ± 1.4 final; P<0.01) and the additional questions (20.8 ± 1.2 vs. 19.3 ± 1.8 at baseline, 14.3 ± 2.7 vs. 17.3 ± 3.1 at 6 months, 12.4 ± 2.9 vs. 21.0 ± 5.5 at 12 months, 6.7 ± 1.4 vs. 17.3 ± 2.2 final; P<0.01) were better (lower) in the intervention group. The emotional component improved earlier than the others. Post-randomization quality of life was not associated with events. Conclusion Components of the quality-of-life assessment responded differently to the DMP. These results indicate the need for individualized DMP strategies in patients with HF. Trial registration information www.clinicaltrials.gov NCT00505050-REMADHE.
Abstract:
The popular Newmark algorithm, used for implicit direct integration of structural dynamics, is extended by means of a nodal partition to permit the use of different timesteps in different regions of a structural model. The algorithm developed has as a special case an explicit-explicit subcycling algorithm previously reported by Belytschko, Yen and Mullen. That algorithm has been shown, in the absence of damping or other energy dissipation, to exhibit instability over narrow timestep ranges that become narrower as the number of degrees of freedom increases, making them unlikely to be encountered in practice. The present algorithm avoids such instabilities in the case of a one-to-two timestep ratio (two subcycles), achieving unconditional stability in an exponential sense for a linear problem. However, with three or more subcycles, the trapezoidal rule exhibits stability that becomes conditional, falling towards that of the central difference method as the number of subcycles increases. Instabilities over narrow timestep ranges, which become narrower as the model size increases, also appear with three or more subcycles. However, by moving the partition between timesteps one row of elements into the region suitable for integration with the larger timestep, these unstable timestep ranges become extremely narrow, even in simple systems with a few degrees of freedom, and accuracy is improved. Use of a version of the Newmark algorithm that dissipates high frequencies minimises or eliminates these narrow bands of instability. Viscous damping is also shown to remove these instabilities, at the expense of having more effect on the low-frequency response.
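As background, a plain single-timestep Newmark integrator for a linear system is sketched below with gamma = 1/2, beta = 1/4 (the trapezoidal rule discussed above); the nodal partitioning and subcycling of the paper are not reproduced.

```python
# Plain single-timestep Newmark integrator for M a + C v + K u = f(t).
# gamma = 1/2, beta = 1/4 gives the trapezoidal rule. Background sketch only.
import numpy as np

def newmark(M, C, K, f, u0, v0, dt, n_steps, beta=0.25, gamma=0.5):
    u, v = u0.copy(), v0.copy()
    a = np.linalg.solve(M, f(0.0) - C @ v - K @ u)
    K_eff = M / (beta * dt**2) + gamma / (beta * dt) * C + K
    history = [u.copy()]
    for n in range(1, n_steps + 1):
        rhs = (f(n * dt)
               + M @ (u / (beta * dt**2) + v / (beta * dt) + (0.5 / beta - 1) * a)
               + C @ (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                      + dt * (gamma / (2 * beta) - 1) * a))
        u_new = np.linalg.solve(K_eff, rhs)
        a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (0.5 / beta - 1) * a
        v = v + dt * ((1 - gamma) * a + gamma * a_new)
        u, a = u_new, a_new
        history.append(u.copy())
    return np.array(history)

# Undamped unit oscillator: displacement should stay bounded (energy conserved).
M = np.eye(1); C = np.zeros((1, 1)); K = np.eye(1)
traj = newmark(M, C, K, lambda t: np.zeros(1), np.array([1.0]), np.zeros(1), 0.1, 100)
print(traj[:3].ravel())
```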
Abstract:
Recent changes concerning consumers' active participation in the efficient management of load devices, both in their own interest and in the interest of the network operator, namely in the context of demand response, lead to the need for improved algorithms and tools. A continuous consumption optimization algorithm has been improved in order to better manage shifted demand. It has been implemented in a simulation and user-interaction tool capable of being integrated into a multi-agent smart grid simulator already developed, and also capable of integrating several optimization algorithms to manage real and simulated loads. The case study of this paper highlights the advantages of the proposed algorithm and the benefits of using the developed simulation and user-interaction tool.
Abstract:
Dissertation submitted to obtain the Master's degree in Informatics Engineering
Abstract:
In this paper we present the operational matrices of the left Caputo fractional derivative, right Caputo fractional derivative and Riemann–Liouville fractional integral for shifted Legendre polynomials. We develop an accurate numerical algorithm to solve the two-sided space–time fractional advection–dispersion equation (FADE) based on a spectral shifted Legendre tau (SLT) method in combination with the derived shifted Legendre operational matrices. The fractional derivatives are described in the Caputo sense. We propose a spectral SLT method, in both temporal and spatial discretizations, for the two-sided space–time FADE. This technique reduces the two-sided space–time FADE to a system of algebraic equations that simplifies the problem. Numerical experiments were carried out to confirm the spectral accuracy and efficiency of the proposed algorithm. By selecting relatively few Legendre polynomial degrees, we are able to obtain very accurate approximations, demonstrating the utility of the new approach over other numerical methods.
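One ingredient named above, the shifted Legendre basis, can be evaluated with a short recurrence, as the sketch below shows; assembling the operational matrices and the tau system is beyond this illustration.

```python
# Sketch: evaluating shifted Legendre polynomials P*_n(x) on [0, 1] via the
# Bonnet recurrence applied to P_n(2x - 1). Illustration of the basis only.
import numpy as np

def shifted_legendre(n: int, x):
    """Evaluate P*_n(x) = P_n(2x - 1) for x in [0, 1]."""
    t = 2.0 * np.asarray(x, dtype=float) - 1.0
    p_prev, p = np.ones_like(t), t
    if n == 0:
        return p_prev
    for k in range(1, n):
        # Bonnet recurrence: (k+1) P_{k+1} = (2k+1) t P_k - k P_{k-1}
        p_prev, p = p, ((2 * k + 1) * t * p - k * p_prev) / (k + 1)
    return p

x = np.linspace(0.0, 1.0, 5)
print(shifted_legendre(2, x))   # P*_2(x) = 6x^2 - 6x + 1
```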
Abstract:
The goal of this thesis is the study of a tool that can help analysts find sequential patterns, with a focus on financial markets. A study will be made of how new and relevant knowledge can be mined from real-life information, potentially giving investors, market analysts, and economists a new basis for making informed decisions. The Ramex Forum algorithm will be used as the basis for the tool, due to its ability to find sequential patterns in financial data. To further adapt it to the needs of the thesis, a study of relevant improvements to the algorithm will be made. Another important aspect of this algorithm is the way it displays the patterns found: even with good results, it is difficult to find relevant patterns among all the studied samples without a proper result-visualization component. As such, different combinations of parameterizations and ways to visualize data will be evaluated, and their influence on the analysis of those patterns will be discussed. In order to properly evaluate the utility of this tool, case studies will be performed as a final test. Real information will be used to produce results, and those will be evaluated with regard to their accuracy, interest, and relevance.
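To make the notion of a sequential pattern concrete, the deliberately simple sketch below counts how often one market event follows another and lists the strongest transitions; it is a generic illustration with hypothetical event names, not the Ramex Forum algorithm itself.

```python
# Generic illustration of sequential structure in event streams: counting
# pairwise transitions and ranking the strongest ones. Not Ramex Forum.
from collections import Counter
from itertools import pairwise   # Python 3.10+

sequences = [  # hypothetical daily event streams
    ["oil_up", "airline_down", "airline_down", "oil_down"],
    ["oil_up", "airline_down", "oil_down", "airline_up"],
    ["rates_up", "banks_up", "tech_down"],
]

transitions = Counter()
for seq in sequences:
    transitions.update(pairwise(seq))

for (src, dst), count in transitions.most_common(3):
    print(f"{src} -> {dst}: {count}")
```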
Abstract:
The present paper reports the precipitation process of Al3Sc structures in an aluminum-scandium alloy, which has been simulated with a synchronous parallel kinetic Monte Carlo (spkMC) algorithm. The spkMC implementation is based on the vacancy diffusion mechanism. To filter the raw data generated by the spkMC simulations, the density-based spatial clustering of applications with noise (DBSCAN) method has been employed. The spkMC and DBSCAN algorithms were implemented in the C language using the MPI library. The simulations were conducted on the SeARCH cluster located at the University of Minho. The Al3Sc precipitation was successfully simulated at the atomistic scale with the spkMC. DBSCAN proved to be a valuable aid in identifying the precipitates by performing a cluster analysis of the simulation results. The simulation results achieved are in good agreement with those reported in the literature for sequential kinetic Monte Carlo (kMC) simulations. The parallel implementation of kMC provided a 4x speedup over the sequential version.
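The cluster-analysis step can be illustrated with a hedged sketch: DBSCAN applied to synthetic 3D solute positions, with scikit-learn's implementation standing in for the authors' C/MPI code.

```python
# Hedged sketch of the post-processing step: identifying precipitates by
# clustering solute positions with DBSCAN. Synthetic coordinates; sklearn
# stands in for the authors' C/MPI implementation.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(42)
# Two dense "precipitates" plus scattered solute atoms (noise):
cluster_a = rng.normal(loc=(5.0, 5.0, 5.0), scale=0.3, size=(40, 3))
cluster_b = rng.normal(loc=(15.0, 10.0, 8.0), scale=0.3, size=(25, 3))
noise = rng.uniform(0.0, 20.0, size=(30, 3))
positions = np.vstack([cluster_a, cluster_b, noise])

labels = DBSCAN(eps=1.0, min_samples=5).fit_predict(positions)
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print(f"precipitates found: {n_clusters}, noise atoms: {(labels == -1).sum()}")
```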