149 results for algebraic problems
Abstract:
A numerical procedure, based on parametric differentiation and an implicit finite difference scheme, has been developed for a class of problems in boundary-layer theory for saddle-point regions. Here, the results are presented for the case of a three-dimensional stagnation-point flow with massive blowing. The method compares very well with other methods for particular cases (zero or small mass blowing). The results emphasize that the present numerical procedure is well suited for the solution of saddle-point flows with massive blowing, which could not be solved by other methods.
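The abstract does not reproduce the governing equations, so the sketch below only illustrates the generic idea of parametric differentiation combined with an implicit finite-difference discretization, on a hypothetical toy boundary-value problem u'' = lam*sinh(u), u(0) = 0, u(1) = 1 (not the saddle-point boundary-layer equations): the discretized residual F(u; lam) = 0 is differentiated with respect to the parameter, and du/dlam is marched from the known solution at lam = 0, with a Newton correction at each step.

# Sketch of parametric differentiation with an implicit finite-difference
# discretization, on a toy BVP u'' = lam*sinh(u), u(0) = 0, u(1) = 1.
# The saddle-point boundary-layer equations of the paper are not reproduced.
import numpy as np

N = 101
x = np.linspace(0.0, 1.0, N)
h = x[1] - x[0]

def residual(u, lam):
    """Finite-difference residual F(u; lam), boundary rows included."""
    F = np.zeros(N)
    F[0] = u[0]                          # u(0) = 0
    F[-1] = u[-1] - 1.0                  # u(1) = 1
    F[1:-1] = (u[2:] - 2*u[1:-1] + u[:-2]) / h**2 - lam * np.sinh(u[1:-1])
    return F

def jacobian(u, lam):
    """dF/du, tridiagonal, assembled densely for brevity."""
    J = np.zeros((N, N))
    J[0, 0] = J[-1, -1] = 1.0
    for i in range(1, N - 1):
        J[i, i - 1] = J[i, i + 1] = 1.0 / h**2
        J[i, i] = -2.0 / h**2 - lam * np.cosh(u[i])
    return J

u = x.copy()            # exact solution at lam = 0 (u'' = 0, u(0)=0, u(1)=1)
dlam = 0.05
for lam in np.arange(dlam, 1.0 + 1e-12, dlam):
    # parametric differentiation: du/dlam = -J^{-1} dF/dlam
    dF_dlam = np.zeros(N)
    dF_dlam[1:-1] = -np.sinh(u[1:-1])
    u = u + dlam * np.linalg.solve(jacobian(u, lam - dlam), -dF_dlam)
    # one Newton correction keeps the marched solution on F(u; lam) = 0
    u = u - np.linalg.solve(jacobian(u, lam), residual(u, lam))

print("u(0.5) at lam = 1:", u[N // 2])

The same continuation idea is what lets such marching schemes reach parameter values (here, strong blowing) where a direct iteration from a cold start can fail to converge.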
Abstract:
Two different matrix algorithms are described for the restoration of blurred pictures. These are illustrated by numerical examples.
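The two algorithms themselves are not named in the abstract; as one generic illustration of matrix-based restoration, the sketch below (with an assumed 1-D moving-average blur matrix H and a Tikhonov penalty alpha) recovers a signal from its blurred, noisy observation by a regularized least-squares solve.

# Generic matrix-based restoration sketch (not the specific algorithms of the
# paper): blur a 1-D signal with a known matrix H, then invert with a
# Tikhonov-regularized least-squares solve.
import numpy as np

n = 64
rng = np.random.default_rng(0)

# Blur matrix: simple moving average of width 5 (rows sum to 1).
H = np.zeros((n, n))
for i in range(n):
    lo, hi = max(0, i - 2), min(n, i + 3)
    H[i, lo:hi] = 1.0 / (hi - lo)

x_true = np.zeros(n)
x_true[20:40] = 1.0                                   # a box-shaped "picture" row
y = H @ x_true + 0.01 * rng.standard_normal(n)        # blurred + noisy observation

# Tikhonov regularization: minimize ||H x - y||^2 + alpha ||x||^2
alpha = 1e-2
x_rest = np.linalg.solve(H.T @ H + alpha * np.eye(n), H.T @ y)

print("relative restoration error:",
      np.linalg.norm(x_rest - x_true) / np.linalg.norm(x_true))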
Abstract:
An error-free computational approach is employed for finding the integer solution to a system of linear equations, using finite-field arithmetic. This approach is also extended to find the optimum solution for linear inequalities such as those arising in interval linear programming problems.
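As a hedged single-prime illustration of finite-field (error-free) linear algebra, the sketch below solves an integer system A x = b by Gaussian elimination modulo a large prime P, so no floating-point round-off enters; recovering general integer or rational solutions would additionally require several primes with Chinese-remainder or rational reconstruction, which is omitted here.

# Error-free solution of A x = b over the prime field GF(P) by Gauss-Jordan
# elimination; a single-prime sketch of finite-field arithmetic (the full
# integer solution would combine several primes, not shown).
P = 10**9 + 7   # a large prime (assumed choice)

def solve_mod_p(A, b, p=P):
    n = len(A)
    M = [[a % p for a in row] + [bi % p] for row, bi in zip(A, b)]
    for col in range(n):
        # find a pivot row with a nonzero entry in this column
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        inv = pow(M[col][col], p - 2, p)          # modular inverse (Fermat)
        M[col] = [v * inv % p for v in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(vr - f * vc) % p for vr, vc in zip(M[r], M[col])]
    return [M[i][n] for i in range(n)]

# Example: an integer system whose solution is x = (1, 2, 3)
A = [[2, 1, 1], [1, 3, 2], [1, 0, 0]]
b = [7, 13, 1]
print(solve_mod_p(A, b))   # exact residues of the solution mod P: [1, 2, 3]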
Abstract:
The remarkable advances made in recombinant DNA technology over the last two decades have paved the way for the use of gene transfer to treat human diseases. Several protocols have been developed for the introduction and expression of genes in humans, but clinical efficacy has not been conclusively demonstrated in any of them. The eventual success of gene therapy for genetic and acquired disorders depends on the development of better gene transfer vectors for sustained, long-term expression of foreign genes, as well as on a better understanding of the pathophysiology of human diseases. It is heartening to note that some of the gene therapy protocols have found other applications, such as genetic immunization or DNA vaccines, which is being heralded as the third vaccine revolution. Gene therapy is yet to become a dream come true, but the light can be seen at the end of the tunnel.
Abstract:
Site-specific geotechnical data are always random and variable in space. In the present study, a procedure for quantifying the variability in geotechnical characterization and design parameters is discussed using site-specific cone tip resistance data (qc) obtained from the static cone penetration test (SCPT). The parameters for spatial variability modeling of geotechnical properties, i.e., (i) the existing trend function in the in situ qc data; (ii) second-moment statistics, i.e., the mean, variance, and autocorrelation structure of the soil strength and stiffness parameters; and (iii) inputs from the spatial correlation analysis, are utilized in numerical modeling procedures using the finite difference code FLAC 5.0. The influence of considering spatially variable soil parameters on reliability-based geotechnical design is studied for two cases, i.e., (a) bearing capacity analysis of a shallow foundation resting on a clayey soil, and (b) analysis of the stability and deformation pattern of a cohesive-frictional soil slope. The study highlights the procedure for conducting a site-specific study using field test data such as SCPT in geotechnical analysis and demonstrates that a few additional computations involving soil variability provide better insight into the role of variability in design.
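A minimal sketch of the second-moment characterization mentioned above, using synthetic qc data (the site data and the FLAC 5.0 models are not reproduced): remove a linear depth trend from qc, then compute the mean, variance, and sample autocorrelation of the residuals.

# Sketch: detrend synthetic cone tip resistance q_c(z), then compute
# second-moment statistics (mean, variance, sample autocorrelation).
# Synthetic data only; not the site data or FLAC model of the study.
import numpy as np

rng = np.random.default_rng(1)
z = np.arange(0.0, 10.0, 0.1)                                           # depth (m)
qc = 2.0 + 0.5 * z + 0.04 * np.cumsum(rng.standard_normal(z.size))      # MPa

# (i) linear trend with depth
a1, a0 = np.polyfit(z, qc, 1)
resid = qc - (a0 + a1 * z)

# (ii) second-moment statistics of the off-trend component
print("mean, variance:", resid.mean(), resid.var())

# (iii) sample autocorrelation vs. lag
def sample_acf(r, max_lag):
    r = r - r.mean()
    c0 = np.dot(r, r) / r.size
    return [np.dot(r[:-k or None], r[k:]) / r.size / c0 for k in range(max_lag)]

acf = sample_acf(resid, 20)
print("autocorrelation at lags 0..3:", [round(v, 3) for v in acf[:4]])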
Abstract:
This note points out the fallacy in the method given by Sharma and Swarup, in their paper on the time-minimising transportation problem, for determining the set $S_{hk}$ of all nonbasic cells which, when introduced into the basis, would either eliminate a given basic cell (h, k) from the basis or reduce the amount $x_{hk}$.
Abstract:
We propose a self-regularized pseudo-time marching strategy for ill-posed, nonlinear inverse problems involving recovery of system parameters given partial and noisy measurements of system response. While various regularized Newton methods are popularly employed to solve these problems, the resulting solutions are known to depend sensitively on the noise intensity in the data and on the regularization parameters, an optimal choice of which remains a tricky issue. Through limited numerical experiments on a couple of parameter reconstruction problems, one involving the identification of a truss bridge and the other related to imaging soft-tissue organs for early detection of cancer, we demonstrate the superior features of the pseudo-time marching schemes.
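The scheme itself is not spelled out in the abstract; the sketch below shows the generic pseudo-time marching idea on a toy exponential-decay parameter-recovery problem (the model, step size, and stopping rule are all assumptions): the estimate is evolved along d(theta)/d(tau) = J^T (y - f(theta)) and the march is stopped early, which plays the role otherwise taken by an explicit regularization parameter.

# Pseudo-time marching sketch for a toy nonlinear inverse problem
# y = f(theta) + noise, with f(theta) = theta2 * exp(-theta1 * t).
# The march d(theta)/d(tau) = J^T (y - f(theta)) is integrated with
# explicit Euler; early stopping acts as the regularizer.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 2.0, 20)
theta_true = np.array([1.5, 2.0])

def f(theta):
    return theta[1] * np.exp(-theta[0] * t)

def jac(theta):
    e = np.exp(-theta[0] * t)
    return np.column_stack([-theta[1] * t * e, e])   # d f / d theta

y = f(theta_true) + 0.02 * rng.standard_normal(t.size)

theta = np.array([0.5, 0.5])                         # crude initial guess
dtau = 0.05
for step in range(2000):
    r = y - f(theta)
    theta = theta + dtau * jac(theta).T @ r
    if np.linalg.norm(r) < 0.02 * np.sqrt(t.size):   # discrepancy-type stop
        break

print("recovered parameters:", theta, "after", step + 1, "pseudo-time steps")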
Abstract:
The importance of neurochemistry in understanding the functional basis of the nervous system was emphasized. Attention was drawn to the role of lipids, particularly the sphingolipids, whose metabolic abnormalities lead to 'sphingolipidosis' in the brain, and to gangliosides, which show growth-promoting and neuritogenic properties. Several questions that remain to be answered in this area were enumerated. It was pointed out that neurons make a large number of proteins, an order of magnitude more than other cells, and several of these are yet to be characterized and their functional significance established. Myelination and synaptogenesis are two fundamental processes in brain development. Although much is known about myelin lipids and proteins, it is not known what signals the glial cell receives to initiate myelin synthesis around the axon. In fact, the process of myelination provides an excellent system for studying membrane biogenesis and cell-cell interaction. Great strides were made in the understanding of neurotransmitter receptors and their function in synaptic transmission, but how neurons make synapses with other specific neurons in a preprogrammed manner is not known and requires immediate study. In this context, it was stressed that developmental neurobiology of the human brain could be most profitably done in India. The importance and complexity of signal transduction mechanisms in the brain were explained and many fundamental questions that remain to be answered were discussed. In conclusion, several other areas of contemporary research interest in the nervous system were mentioned and it was suggested that a 'National Committee for Brain Research' be constituted to identify and intensify research programmes in this vital field.
Abstract:
We have proposed a general method for finding the exact analytical solution of the multi-channel curve crossing problem in the presence of delta-function couplings. We have analysed the case where a potential energy curve couples to a continuum (in energy) of potential energy curves.
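For orientation only (a generic form, not necessarily the exact model treated in the paper), such delta-coupled multi-channel problems are usually written as coupled time-independent Schrödinger equations in which channel n couples to channel m only at isolated points:

$\left[-\frac{\hbar^{2}}{2m}\frac{d^{2}}{dx^{2}} + V_{n}(x) - E\right]\psi_{n}(x) + \sum_{m \neq n} K_{nm}\,\delta(x - x_{nm})\,\psi_{m}(x) = 0,$

where the coupling strengths $K_{nm}$ act only at the points $x_{nm}$. Away from those points each channel obeys its uncoupled equation, and the couplings enter solely through jump conditions on the derivatives there, which is what makes an exact treatment tractable.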
Abstract:
Past studies that have compared LBB-stable discontinuous- and continuous-pressure finite element formulations on a variety of problems have concluded that both methods yield solutions of comparable accuracy, and that the choice of interpolation is dictated by which of the two is more efficient. In this work, we show that using discontinuous-pressure interpolations can yield inaccurate solutions at large times on a class of transient problems, while the continuous-pressure formulation yields solutions that are in good agreement with the analytical solution.
Abstract:
A considerable amount of work has been dedicated to the development of analytical solutions for the flow of chemical contaminants through soils. Most of the analytical solutions for complex transport problems are closed-form series solutions. The convergence of these solutions depends on the eigenvalues obtained from a corresponding transcendental equation. Thus, the difficulty of obtaining exact solutions from analytical models encourages the use of numerical solutions for parameter estimation, even though the latter models are computationally expensive. In this paper, a combination of two swarm-intelligence-based algorithms is used for accurate estimation of design transport parameters from the closed-form analytical solutions. Estimation of eigenvalues from a transcendental equation is treated as a multimodal discontinuous function optimization problem. The eigenvalues are estimated using an algorithm based on the glowworm swarm strategy. Parameter estimation for the inverse problem is handled using the standard PSO algorithm. Integration of these two algorithms enables accurate estimation of design parameters using closed-form analytical solutions. The present solver is applied to a real-world inverse problem in environmental engineering. The inverse model based on swarm intelligence techniques is validated and its accuracy in parameter estimation is shown. The proposed solver quickly estimates the design parameters with great precision.
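A minimal sketch of the inverse-estimation step using standard global-best PSO (the analytical model, data, and parameter bounds below are placeholders, and the glowworm-swarm eigenvalue search is not reproduced): the swarm minimizes the misfit between observed concentrations and a stand-in closed-form solution.

# Minimal standard global-best PSO sketch for the inverse-estimation step.
# The "analytical solution" and data are placeholders, not the paper's model;
# the glowworm-swarm eigenvalue step is not reproduced.
import numpy as np

rng = np.random.default_rng(3)

def model(params, t):
    """Placeholder closed-form solution c(t; D, R) = exp(-D t) / R."""
    D, R = params
    return np.exp(-D * t) / R

t_obs = np.linspace(0.1, 5.0, 25)
true = np.array([0.8, 2.0])
c_obs = model(true, t_obs) + 0.005 * rng.standard_normal(t_obs.size)

def misfit(params):
    return np.sum((model(params, t_obs) - c_obs) ** 2)

n_part, n_iter = 30, 200
lo, hi = np.array([0.01, 0.1]), np.array([5.0, 10.0])      # search bounds
x = lo + (hi - lo) * rng.random((n_part, 2))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), np.array([misfit(p) for p in x])
gbest = pbest[pbest_val.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5                                   # PSO coefficients
for _ in range(n_iter):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    vals = np.array([misfit(p) for p in x])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("estimated (D, R):", gbest, "true:", true)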
Abstract:
CTRU, a public key cryptosystem, was proposed by Gaborit, Ohler and Sole. It is an analogue of NTRU, with the ring of integers replaced by the ring of polynomials $\mathbb{F}_2[T]$. It attracted attention because the attacks based on either the LLL algorithm or the Chinese Remainder Theorem, which are most common against NTRU, do not apply to it. In this paper we present a polynomial-time algorithm that breaks CTRU for all recommended parameter choices that were derived to make CTRU secure against the Popov normal form attack. The paper shows that if the constraints for perfect decryption hold, then either the plaintext or the private key can be recovered by a polynomial-time linear algebra attack.
Abstract:
In this paper, we present an algebraic method to study and design spatial parallel manipulators that demonstrate isotropy in the force and moment distributions. We use the force and moment transformation matrices separately, and derive conditions for their isotropy individually as well as in combination. The isotropy conditions are derived in closed form in terms of the invariants of the quadratic forms associated with these matrices. The formulation is applied to a class of Stewart platform manipulators, and a multi-parameter family of isotropic manipulators is identified analytically. We show that it is impossible to obtain a spatially isotropic configuration within this family. We also compute the isotropic configurations of an existing manipulator and demonstrate a procedure for designing the manipulator for isotropy at a given configuration.
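As a small numerical companion to those closed-form conditions (the geometry below is an arbitrary example, not a configuration from the paper), isotropy of the force or moment transformation block H amounts to H H^T being a multiple of the identity, i.e. all singular values of H being equal; the sketch checks this for the 3 x 6 force and moment blocks of a six-legged platform.

# Numerical isotropy check for the force and moment transformation blocks of
# a six-legged platform. The geometry is an arbitrary example, not a design
# from the paper; isotropy of a block H means H @ H.T is proportional to I.
import numpy as np

def wrench_blocks(attach_pts, leg_dirs):
    """Force block: unit leg directions s_i; moment block: b_i x s_i."""
    s = leg_dirs / np.linalg.norm(leg_dirs, axis=1, keepdims=True)
    F = s.T                                        # 3 x 6 force block
    M = np.cross(attach_pts, s).T                  # 3 x 6 moment block
    return F, M

def isotropy_index(H):
    """1.0 means perfectly isotropic (all singular values equal)."""
    sv = np.linalg.svd(H, compute_uv=False)
    return sv.min() / sv.max()

# arbitrary example geometry: platform attachment points and leg directions
ang = np.deg2rad(np.arange(0, 360, 60))
b = np.column_stack([np.cos(ang), np.sin(ang), np.zeros(6)])              # attachments
d = np.column_stack([np.cos(ang + 0.5), np.sin(ang + 0.5), np.ones(6)])   # leg directions

F, M = wrench_blocks(b, d)
print("force  isotropy index:", round(isotropy_index(F), 3))
print("moment isotropy index:", round(isotropy_index(M), 3))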
Abstract:
"Extended Clifford algebras" are introduced as a means to obtain space-time block codes with low ML decoding complexity. Using left regular matrix representations of two specific classes of extended Clifford algebras, two systematic algebraic constructions of full-diversity Distributed Space-Time Codes (DSTCs) are provided for any number of relays that is a power of two. The left regular matrix representation is shown to naturally result in space-time codes meeting the additional constraints required for DSTCs. The DSTCs so constructed have the salient feature of reduced Maximum Likelihood (ML) decoding complexity. In particular, the ML decoding of these codes can be performed by applying the lattice decoder algorithm on a lattice of one-fourth the dimension required in general. Moreover, these codes have a uniform distribution of power among the relays and in time, leading to a low Peak-to-Average Power Ratio at the relays.