1000 results for recursive problems
Abstract:
Foot problems complicating diabetes are a source of major patient suffering and societal costs. Investing in evidence-based, internationally appropriate diabetic foot care guidance is likely among the most cost-effective forms of healthcare expenditure, provided it is goal-focused and properly implemented. The International Working Group on the Diabetic Foot (IWGDF) has been publishing and updating international Practical Guidelines since 1999. The 2015 updates are based on systematic reviews of the literature, and recommendations are formulated using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) system. As such, we changed the name from 'Practical Guidelines' to 'Guidance'. In this article we describe the development of the 2015 IWGDF Guidance documents on prevention and management of foot problems in diabetes. This Guidance consists of five documents, prepared by five working groups of international experts. These documents provide guidance related to foot complications in persons with diabetes on: prevention; footwear and offloading; peripheral artery disease; infections; and wound healing interventions. Based on these five documents, the IWGDF Editorial Board produced a summary guidance for daily practice. The result of this process, after review of all documents by the Editorial Board and by international IWGDF members, is an evidence-based global consensus on prevention and management of foot problems in diabetes. Plans are already under way to implement this Guidance. We believe that following the recommendations of the 2015 IWGDF Guidance will almost certainly result in improved management of foot problems in persons with diabetes and a subsequent worldwide reduction in the tragedies caused by these foot problems.
Abstract:
An error-free computational approach is employed for finding the integer solution to a system of linear equations, using finite-field arithmetic. This approach is also extended to find the optimum solution for linear inequalities such as those arising in interval linear programming problems.
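As a hedged illustration of the finite-field idea (not the paper's specific method), the sketch below solves A x = b by Gaussian elimination over GF(p) for a prime p, so that every arithmetic step is exact; in practice the integer solution would then be recovered from the results for one or more primes. The example system and the prime 101 are assumptions chosen for illustration.

```python
# Illustrative sketch (not the paper's exact method): Gaussian elimination
# over GF(p) gives an error-free solution of A x = b modulo a prime p.
# Assumes A is square and nonsingular modulo p.
def solve_mod_p(A, b, p):
    n = len(A)
    # augmented matrix, reduced modulo p
    M = [[A[i][j] % p for j in range(n)] + [b[i] % p] for i in range(n)]
    for col in range(n):
        # find a pivot row with a nonzero entry in this column
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[pivot] = M[pivot], M[col]
        inv = pow(M[col][col], p - 2, p)          # modular inverse (Fermat)
        M[col] = [(v * inv) % p for v in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(M[r][j] - f * M[col][j]) % p for j in range(n + 1)]
    return [row[n] for row in M]

# Example: the solution (3, 1) of x + y = 4, x - y = 2 is recovered mod 101.
print(solve_mod_p([[1, 1], [1, -1]], [4, 2], 101))   # -> [3, 1]
```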
Abstract:
The remarkable advances made in recombinant DNA technology over the last two decades have paved the way for the use of gene transfer to treat human diseases. Several protocols have been developed for the introduction and expression of genes in humans, but clinical efficacy has not been conclusively demonstrated in any of them. The eventual success of gene therapy for genetic and acquired disorders depends on the development of better gene transfer vectors for sustained, long-term expression of foreign genes, as well as on a better understanding of the pathophysiology of human diseases. It is heartening to note that some of the gene therapy protocols have found other applications, such as genetic immunization or DNA vaccines, which is being heralded as the third vaccine revolution. Gene therapy is yet to become a dream come true, but light can be seen at the end of the tunnel.
Abstract:
Site-specific geotechnical data are always random and variable in space. In the present study, a procedure for quantifying the variability in geotechnical characterization and design parameters is discussed using site-specific cone tip resistance data (qc) obtained from the static cone penetration test (SCPT). The parameters for spatial variability modeling, i.e. (i) the existing trend function in the in situ qc data; (ii) second-moment statistics, i.e. the mean, variance, and auto-correlation structure of the soil strength and stiffness parameters; and (iii) inputs from the spatial correlation analysis, are utilized in numerical modeling procedures using the finite difference code FLAC 5.0. The influence of considering spatially variable soil parameters on reliability-based geotechnical design is studied for two cases: (a) bearing capacity analysis of a shallow foundation resting on a clayey soil, and (b) analysis of the stability and deformation pattern of a cohesive-frictional soil slope. The study highlights the procedure for conducting a site-specific study using field test data such as SCPT in geotechnical analysis and demonstrates that a few additional computations involving soil variability provide better insight into the role of variability in design.
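The sketch below is a minimal, hedged illustration of the second-moment characterization listed in items (i)-(iii): it detrends a synthetic qc profile, computes the mean and variance of the residuals, and estimates the sample autocorrelation at a few lags. The synthetic data, linear trend model, and lag choices are assumptions, not the study's actual SCPT inputs to FLAC 5.0.

```python
# Second-moment characterization of a hypothetical cone tip resistance
# profile qc(z): trend removal, mean/variance, and sample autocorrelation.
import numpy as np

z = np.linspace(0.0, 20.0, 201)                            # depth (m), synthetic
qc = 2.0 + 0.4 * z + np.random.normal(0.0, 0.5, z.size)    # synthetic qc (MPa)

# (i) remove a linear trend with depth
a, b = np.polyfit(z, qc, 1)
residual = qc - (a * z + b)

# (ii) second-moment statistics of the detrended record
mean, var = residual.mean(), residual.var()

# (iii) sample autocorrelation at a few lags, for fitting a correlation model
def autocorr(x, lag):
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rho = [autocorr(residual, k) for k in (1, 2, 5, 10)]
print(a, b, mean, var, rho)
```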
Abstract:
In this note, the fallacy in the method given by Sharma and Swarup, in their paper on the time-minimising transportation problem, for determining the set S_hk of all nonbasic cells which, when introduced into the basis, would either eliminate a given basic cell (h, k) from the basis or reduce the amount x_hk, is pointed out.
An FETI-preconditioned conjugate gradient method for large-scale stochastic finite element problems
Abstract:
In the spectral stochastic finite element method for analyzing an uncertain system, the uncertainty is represented by a set of random variables, and a quantity of interest such as the system response is considered as a function of these random variables. Consequently, the underlying Galerkin projection yields a block system of deterministic equations where the blocks are sparse but coupled. The solution of this algebraic system of equations rapidly becomes challenging when the size of the physical system and/or the level of uncertainty is increased. This paper addresses this challenge by presenting a preconditioned conjugate gradient method for such block systems, where the preconditioning step is based on the dual-primal finite element tearing and interconnecting (FETI-DP) method equipped with a Krylov subspace reuse technique for accelerating the iterative solution of systems with multiple and repeated right-hand sides. Preliminary performance results on a Linux cluster suggest that the proposed solution method is numerically scalable and demonstrate its potential for making the uncertainty quantification of realistic systems tractable.
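A minimal sketch of the preconditioned conjugate gradient iteration underlying such a solver is given below; the FETI-DP preconditioner and the Krylov-subspace reuse described in the abstract are abstracted into a generic apply_M routine (a simple Jacobi preconditioner stands in for it in the example), so this illustrates the iteration rather than the paper's preconditioner.

```python
# Generic preconditioned conjugate gradient loop; apply_M is assumed to
# return an approximation of A^{-1} r (here: Jacobi, as a stand-in).
import numpy as np

def pcg(apply_A, apply_M, b, tol=1e-8, max_iter=500):
    x = np.zeros_like(b)
    r = b - apply_A(x)
    z = apply_M(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = apply_A(p)
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = apply_M(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Example with a small SPD system and a diagonal (Jacobi) preconditioner.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = pcg(lambda v: A @ v, lambda r: r / np.diag(A), b)
print(x)   # close to the exact solution of A x = b
```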
Abstract:
We propose a self-regularized pseudo-time marching strategy for ill-posed, nonlinear inverse problems involving the recovery of system parameters given partial and noisy measurements of the system response. While various regularized Newton methods are popularly employed to solve these problems, the resulting solutions are known to depend sensitively on the noise intensity in the data and on the regularization parameters, an optimal choice for which remains a tricky issue. Through limited numerical experiments on a couple of parameter reconstruction problems, one involving the identification of a truss bridge and the other related to imaging soft-tissue organs for early detection of cancer, we demonstrate the superior features of the pseudo-time marching schemes.
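The following is only a caricature of the pseudo-time marching idea, assuming a toy scalar forward model: the parameter is evolved by explicit pseudo-time steps of d(theta)/dt = -J^T (f(theta) - d) rather than by a regularized Newton solve. The forward model, data, step size, and iteration count are all illustrative assumptions, not the schemes of the paper.

```python
# Toy pseudo-time marching for parameter recovery from noisy response data.
import numpy as np

def forward(theta, loads):
    # hypothetical forward model: displacements of a spring of stiffness theta
    return loads / theta

loads = np.array([1.0, 2.0, 3.0])
theta_true = 4.0
data = forward(theta_true, loads) + np.random.normal(0.0, 0.01, loads.size)

theta, dt = 1.0, 0.5                      # initial guess and pseudo-time step
for _ in range(2000):
    residual = forward(theta, loads) - data
    jacobian = -loads / theta**2          # d(forward)/d(theta)
    theta -= dt * jacobian @ residual     # explicit pseudo-time step
print(theta)                              # drifts toward the true stiffness ~4
```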
Abstract:
The importance of neurochemistry in understanding the functional basis of the nervous system was emphasized. Attention was drawn to the role of lipids, particularly the sphingolipids, whose metabolic abnormalities lead to 'sphingolipidosis' in the brain, and to gangliosides, which show growth-promoting and neuritogenic properties. Several questions that remain to be answered in this area were enumerated. It was pointed out that neurons make a large number of proteins, an order of magnitude higher than other cells, and several of these are yet to be characterized and their functional significance established. Myelination and synaptogenesis are two fundamental processes in brain development. Although much is known about myelin lipids and proteins, it is not known what signals the glial cell receives to initiate myelin synthesis around the axon. In fact, the process of myelination provides an excellent system for studying membrane biogenesis and cell-cell interaction. Great strides were made in the understanding of neurotransmitter receptors and their function in synaptic transmission, but how neurons make synapses with other specific neurons in a preprogrammed manner is not known and requires immediate study. In this context, it was stressed that developmental neurobiology of the human brain could be most profitably done in India. The importance and complexity of signal transduction mechanisms in the brain were explained and many fundamental questions that remain to be answered were discussed. In conclusion, several other areas of contemporary research interest in the nervous system were mentioned and it was suggested that a 'National Committee for Brain Research' be constituted to identify and intensify research programmes in this vital field.
Abstract:
We have proposed a general method for finding the exact analytical solution for the multi-channel curve crossing problem in the presence of delta function couplings. We have analysed the case where a potential energy curve couples to a continuum (in energy) of potential energy curves.
Abstract:
Past studies that have compared LBB-stable discontinuous- and continuous-pressure finite element formulations on a variety of problems have concluded that both methods yield solutions of comparable accuracy, and that the choice of interpolation is dictated by which of the two is more efficient. In this work, we show that using discontinuous-pressure interpolations can yield inaccurate solutions at large times on a class of transient problems, while the continuous-pressure formulation yields solutions that are in good agreement with the analytical solution.
Abstract:
The removal of noise and outliers from health signals is an important problem in jet engine health monitoring. Typically, health signals are time series of damage indicators, which can be sensor measurements or features derived from such measurements. Sharp or sudden changes in health signals can represent abrupt faults, while long-term deterioration in the system is typical of gradual faults. Simple linear filters tend to smooth out the sharp trend shifts in jet engine signals and are also not good for outlier removal. We propose new, optimally designed nonlinear weighted recursive median filters for noise removal from typical health signals of jet engines. Signals for abrupt and gradual faults and with transient data are considered. Numerical results are obtained for a jet engine and show that preprocessing of health signals using the proposed filter significantly removes Gaussian noise and outliers and could therefore greatly improve the accuracy of diagnostic systems. [DOI: 10.1115/1.3200907]
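A hedged sketch of a weighted recursive median filter is shown below: past samples in the window are taken from the already-filtered output (the recursive part), and integer weights are applied by repeating samples before the median is taken. The window length and weights are illustrative, not the optimally designed values of the paper.

```python
# Weighted recursive median filter sketch: preserves sharp trend shifts
# (abrupt faults) while removing isolated outliers.
from statistics import median

def weighted_recursive_median(signal, weights):
    half = len(weights) // 2
    out = list(signal)
    for i in range(len(signal)):
        window = []
        for k, w in enumerate(weights):
            j = min(max(i + k - half, 0), len(signal) - 1)
            # past samples come from the filtered output, future from the input
            sample = out[j] if j < i else signal[j]
            window.extend([sample] * w)
        out[i] = median(window)
    return out

# Example: the step change (abrupt fault) is preserved while the spike
# (outlier) at index 4 is removed.
sig = [0, 0, 0, 0, 9, 0, 5, 5, 5, 5]
print(weighted_recursive_median(sig, [1, 2, 3, 2, 1]))
```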
Abstract:
A considerable amount of work has been dedicated to the development of analytical solutions for the flow of chemical contaminants through soils. Most of the analytical solutions for complex transport problems are closed-form series solutions. The convergence of these solutions depends on the eigenvalues obtained from a corresponding transcendental equation. The difficulty in obtaining exact solutions from analytical models thus encourages the use of numerical solutions for parameter estimation, even though the latter models are computationally expensive. In this paper, a combination of two swarm intelligence based algorithms is used for accurate estimation of design transport parameters from the closed-form analytical solutions. Estimation of eigenvalues from a transcendental equation is treated as a multimodal discontinuous function optimization problem. The eigenvalues are estimated using an algorithm based on the glowworm swarm strategy. Parameter estimation for the inverse problem is handled using a standard PSO algorithm. Integration of these two algorithms enables an accurate estimation of design parameters using closed-form analytical solutions. The present solver is applied to a real-world inverse problem in environmental engineering. The inverse model based on swarm intelligence techniques is validated and its accuracy in parameter estimation is shown. The proposed solver quickly estimates the design parameters with great precision.
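The sketch below shows a generic PSO loop of the kind used for the inverse-parameter step; the glowworm swarm stage for locating the multiple roots of the transcendental equation is not reproduced. The objective function, bounds, and PSO coefficients are illustrative assumptions.

```python
# Standard particle swarm optimization (PSO) for minimizing a misfit function.
import random

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    gbest = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < objective(gbest):
                    gbest = pos[i][:]
    return gbest

# Example: recover two transport-like parameters by matching synthetic data.
target = [0.8, 2.5]
misfit = lambda p: sum((a - b) ** 2 for a, b in zip(p, target))
print(pso(misfit, [(0.0, 5.0), (0.0, 5.0)]))   # approaches [0.8, 2.5]
```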
Abstract:
We consider the problem of determining if two finite groups are isomorphic. The groups are assumed to be represented by their multiplication tables. We present an O(n) algorithm that determines if two Abelian groups with n elements each are isomorphic. This improves upon the previous upper bound of O(n log n) [Narayan Vikas, An O(n) algorithm for Abelian p-group isomorphism and an O(n log n) algorithm for Abelian group isomorphism, J. Comput. System Sci. 53 (1996) 1-9] known for this problem. We solve a more general problem of computing the orders of all the elements of any group (not necessarily Abelian) of size n in O(n) time. Our algorithm for isomorphism testing of Abelian groups follows from this result. We use the property that our order-finding algorithm works for any group to design a simple O(n) algorithm for testing whether a group of size n, described by its multiplication table, is nilpotent. We also give an O(n) algorithm for determining if a group of size n, described by its multiplication table, is Abelian. (C) 2007 Elsevier Inc. All rights reserved.
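As a naive illustration (not the paper's O(n) algorithm), the sketch below reads off the order of every element from the multiplication table by repeated multiplication and then tests isomorphism of two Abelian groups by comparing the sorted multisets of element orders, which determine a finite Abelian group up to isomorphism. The identity is assumed to be element 0.

```python
# Naive baseline: element orders from the multiplication table, then an
# Abelian isomorphism test by comparing order multisets.
def element_orders(table):
    n = len(table)
    orders = []
    for g in range(n):
        x, k = g, 1
        while x != 0:                 # multiply by g until the identity returns
            x = table[x][g]
            k += 1
        orders.append(k)
    return orders

def abelian_isomorphic(table_a, table_b):
    return sorted(element_orders(table_a)) == sorted(element_orders(table_b))

# Example: Z4 versus Z2 x Z2 -- same size, different order multisets.
z4 = [[(i + j) % 4 for j in range(4)] for i in range(4)]
z2xz2 = [[i ^ j for j in range(4)] for i in range(4)]
print(abelian_isomorphic(z4, z4), abelian_isomorphic(z4, z2xz2))   # True False
```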
Abstract:
Solving large-scale all-to-all comparison problems using distributed computing is increasingly significant for various applications. Previous efforts to implement distributed all-to-all comparison frameworks have treated the two phases of data distribution and comparison task scheduling separately. This leads to high storage demands as well as poor data locality for the comparison tasks, thus creating a need to redistribute the data at runtime. Furthermore, most previous methods have been developed for homogeneous computing environments, so their overall performance is degraded even further when they are used in heterogeneous distributed systems. To tackle these challenges, this paper presents a data-aware task scheduling approach for solving all-to-all comparison problems in heterogeneous distributed systems. The approach formulates the requirements for data distribution and comparison task scheduling simultaneously as a constrained optimization problem. Then, metaheuristic data pre-scheduling and dynamic task scheduling strategies are developed, along with an algorithmic implementation, to solve the problem. The approach provides perfect data locality for all comparison tasks, avoiding rearrangement of data at runtime. It achieves load balancing among heterogeneous computing nodes, thus reducing the overall computation time. It also reduces data storage requirements across the network. The effectiveness of the approach is demonstrated through experimental studies.
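A hedged sketch of the data-aware idea follows: data items are placed so that every pair eventually co-resides on some node, and each comparison task is assigned to a node that already holds both of its inputs, with load normalized by node speed. The greedy placement rule, node speeds, and data set are illustrative assumptions, not the paper's constrained-optimization formulation or metaheuristic strategies.

```python
# Greedy illustration of data-aware scheduling for all-to-all comparisons
# on heterogeneous nodes.
from itertools import combinations

def schedule_all_to_all(items, node_speeds):
    storage = {n: set() for n in node_speeds}          # items held by each node
    load = {n: 0.0 for n in node_speeds}               # assigned task count
    assignment = {}
    for i, j in combinations(items, 2):
        # prefer nodes that already hold both items (data locality)
        holders = [n for n in node_speeds if {i, j} <= storage[n]]
        candidates = holders or list(node_speeds)
        node = min(candidates, key=lambda n: load[n] / node_speeds[n])
        storage[node].update((i, j))                   # ship any missing item once
        load[node] += 1.0
        assignment[(i, j)] = node
    return assignment, storage

tasks, placement = schedule_all_to_all(list(range(6)), {"fast": 2.0, "slow": 1.0})
print(tasks)
print({n: sorted(s) for n, s in placement.items()})
```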