958 results for Extremal polynomial ultraspherical polynomials
A Legendre spectral element model for sloshing and acoustic analysis in nearly incompressible fluids
Abstract:
A new spectral finite element formulation is presented for modeling sloshing and acoustic waves in nearly incompressible fluids. The formulation uses Legendre polynomials to derive the finite element interpolation shape functions in the Lagrangian frame of reference. The formulated element uses the Gauss-Lobatto-Legendre quadrature scheme for integrating the volumetric stiffness and mass matrices, while the conventional Gauss-Legendre quadrature scheme is used on the rotational stiffness matrix to completely eliminate the zero-energy modes normally associated with the Lagrangian FE formulation. The numerical performance of the spectral element formulated here is examined by performing the inf-sup test on a standard rectangular rigid tank partially filled with liquid. The eigenvalues obtained from the formulated spectral element are compared with those of the conventional equally spaced node locations of the h-type Lagrangian finite element, and the predicted results show that these spectral elements are more accurate and give superior convergence. The efficiency and robustness of the formulated elements are demonstrated by solving a few standard problems involving free vibration and dynamic response analysis with undistorted and distorted spectral elements, and the obtained results are compared with available results in the published literature. (C) 2009 Elsevier Inc. All rights reserved.
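The pairing of Legendre shape functions with Gauss-Lobatto-Legendre (GLL) quadrature mentioned above rests on a standard recipe for the GLL points. Below is a minimal Python sketch (not the authors' code; the function name is mine) of how those nodes and weights are obtained.

```python
# Minimal sketch of Gauss-Lobatto-Legendre (GLL) nodes and weights on [-1, 1],
# the quadrature rule the abstract pairs with Legendre shape functions.
import numpy as np
from numpy.polynomial.legendre import Legendre

def gll_nodes_weights(n):
    """Return the n+1 GLL nodes and weights for polynomial order n."""
    # Interior nodes are the roots of P_n'(x); the endpoints are -1 and +1.
    Pn = Legendre.basis(n)
    nodes = np.concatenate(([-1.0], Pn.deriv().roots(), [1.0]))
    # Standard GLL weights: w_i = 2 / (n (n + 1) [P_n(x_i)]^2).
    weights = 2.0 / (n * (n + 1) * Pn(nodes) ** 2)
    return nodes, weights

# Example: the 6-point rule (n = 5) integrates x^4 on [-1, 1] exactly.
x, w = gll_nodes_weights(5)
print(np.dot(w, x ** 4))   # ~0.4
```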
Abstract:
In this paper we propose a novel family of kernels for multivariate time-series classification problems. Each time series is approximated by a linear combination of piecewise polynomial functions in a Reproducing Kernel Hilbert Space through a novel kernel interpolation technique. Using the associated kernel function, a large-margin classification formulation is proposed that can discriminate between two classes. The formulation leads to kernels between two multivariate time series that can be efficiently computed. The kernels have been successfully applied to writer-independent handwritten character recognition.
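As a hedged illustration of the general fit-then-kernel pattern described above (not the paper's RKHS interpolation kernel), one can summarize each multivariate time series by per-channel polynomial-fit coefficients and compare two series with a standard kernel on those coefficients; all names below are hypothetical.

```python
# Generic fit-then-kernel sketch for multivariate time series of possibly
# different lengths; illustrative only, not the paper's construction.
import numpy as np

def poly_features(series, degree=3):
    """series: (T, d) array -> flattened per-channel polyfit coefficients."""
    t = np.linspace(0.0, 1.0, series.shape[0])
    return np.concatenate([np.polyfit(t, series[:, j], degree)
                           for j in range(series.shape[1])])

def rbf_kernel(x, y, gamma=1.0):
    fx, fy = poly_features(x), poly_features(y)
    return np.exp(-gamma * np.sum((fx - fy) ** 2))

# Two toy 2-channel series of different lengths still yield a valid kernel value.
a = np.column_stack([np.sin(np.linspace(0, 3, 50)), np.cos(np.linspace(0, 3, 50))])
b = np.column_stack([np.sin(np.linspace(0, 3, 80)), np.cos(np.linspace(0, 3, 80))])
print(rbf_kernel(a, b))
```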
Abstract:
The projection construction has been used to construct semifields of odd characteristic using a field and a twisted semifield [Commutative semifields from projection mappings, Designs, Codes and Cryptography, 61 (2011), 187-196]. We generalize this idea to a projection construction that uses two twisted semifields to construct semifields of odd characteristic. Planar functions and semifields have a strong connection, so this also yields new planar functions.
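For context, the planarity property being preserved by such constructions: a function $f$ on a finite field of odd characteristic is planar if every nonzero difference map $x \mapsto f(x+a) - f(x)$ is a bijection. The classical worked example (not one of the new functions built in the paper) is $f(x) = x^2$, for which

$$f(x+a) - f(x) = 2ax + a^2$$

is a bijection for every $a \neq 0$ because $2a$ is invertible in odd characteristic; the associated presemifield is the field itself.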
Abstract:
Let G = (V,E) be a simple, finite, undirected graph. For S ⊆ V, let $\delta(S,G) = \{ (u,v) \in E : u \in S \mbox{ and } v \in V-S \}$ and $\phi(S,G) = \{ v \in V-S : \exists u \in S \mbox{ such that } (u,v) \in E \}$ be the edge and vertex boundary of S, respectively. Given an integer i, 1 ≤ i ≤ |V|, the edge and vertex isoperimetric values at i are defined as $b_e(i,G) = \min_{S \subseteq V,\, |S| = i} |\delta(S,G)|$ and $b_v(i,G) = \min_{S \subseteq V,\, |S| = i} |\phi(S,G)|$, respectively. The edge (vertex) isoperimetric problem is to determine the value of $b_e(i,G)$ ($b_v(i,G)$) for each i, 1 ≤ i ≤ |V|. If we impose the further restriction that the set S induce a connected subgraph of G, the corresponding variation of the isoperimetric problem is known as the connected isoperimetric problem, and the connected edge (vertex) isoperimetric values are defined analogously. It turns out that the connected edge isoperimetric and connected vertex isoperimetric values are equal at each i, 1 ≤ i ≤ |V|, if G is a tree. We therefore use the notation $b_c(i,T)$ to denote the connected edge (vertex) isoperimetric value of a tree T at i. Hofstadter introduced the interesting concept of meta-Fibonacci sequences in his famous book "Gödel, Escher, Bach: An Eternal Golden Braid". The sequence he introduced is known as the Hofstadter sequence, and most of the problems he raised regarding it are still open. Since then, mathematicians have studied many other closely related meta-Fibonacci sequences, such as the Tanny, Conway, and Conolly sequences. Let $T_2$ be the infinite complete binary tree. In this paper we relate the connected isoperimetric problem on $T_2$ to the Tanny sequence, which is defined by the recurrence relation a(i) = a(i − 1 − a(i − 1)) + a(i − 2 − a(i − 2)), a(0) = a(1) = a(2) = 1. In particular, we show that $b_c(i, T_2) = i + 2 − 2a(i)$ for each i ≥ 1. We also propose efficient polynomial-time algorithms to find the vertex isoperimetric value at i for graphs of bounded pathwidth and bounded treewidth.
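For concreteness, a small Python sketch (hypothetical helper, not from the paper) that generates the Tanny sequence from the stated recurrence and evaluates the quoted formula $b_c(i, T_2) = i + 2 - 2a(i)$.

```python
def tanny(n):
    """Return a[0..n] of the Tanny sequence a(i) = a(i-1-a(i-1)) + a(i-2-a(i-2)),
    with a(0) = a(1) = a(2) = 1."""
    a = [1, 1, 1]
    for i in range(3, n + 1):
        a.append(a[i - 1 - a[i - 1]] + a[i - 2 - a[i - 2]])
    return a

a = tanny(30)
# Connected isoperimetric values of the infinite complete binary tree T_2,
# using the formula quoted in the abstract.
print([i + 2 - 2 * a[i] for i in range(1, 16)])
```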
Abstract:
The problem of sensor-network-based distributed intrusion detection in the presence of clutter is considered. It is argued that sensing is best regarded as a local phenomenon in that only sensors in the immediate vicinity of an intruder are triggered. In such a setting, lack of knowledge of the intruder's location gives rise to correlated sensor readings. A signal-space viewpoint is introduced in which the noise-free sensor readings associated with intruder and clutter appear as surfaces $\mathcal{S_I}$ and $\mathcal{S_C}$, and the problem reduces to one of determining, in distributed fashion, whether the current noisy sensor reading is best classified as intruder or clutter. Two approaches to distributed detection are pursued. In the first, a decision surface separating $\mathcal{S_I}$ and $\mathcal{S_C}$ is identified using Neyman-Pearson criteria. Thereafter, the individual sensor nodes interactively exchange bits to determine whether the sensor readings lie on one side or the other of the decision surface. Bounds on the number of bits that need to be exchanged are derived, based on communication complexity (CC) theory. A lower bound derived for the two-party average-case CC of general functions is compared against the performance of a greedy algorithm. The average-case CC of the relevant greater-than (GT) function is characterized within two bits. In the second approach, each sensor node broadcasts a single bit arising from an appropriate two-level quantization of its own sensor reading, keeping in mind the fusion rule to be subsequently applied at a local fusion center. The optimality of a threshold test as a quantization rule is proved under simplifying assumptions. Finally, results from a QualNet simulation of the algorithms are presented, including intruder tracking using a naive polynomial-regression algorithm.
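A hedged sketch of the second approach described above, in its simplest form: each node sends one bit from a threshold test on its reading and a local fusion center applies a counting rule. The threshold, fusion rule, and data below are illustrative placeholders, not the optimized quantities derived in the paper.

```python
# Illustrative one-bit quantize-and-fuse scheme; parameters are placeholders.
import numpy as np

def node_bit(reading, tau):
    """One-bit threshold quantizer applied at each sensor node."""
    return int(reading > tau)

def fuse(bits, k):
    """Declare 'intruder' if at least k of the one-bit votes are 1."""
    return sum(bits) >= k

rng = np.random.default_rng(0)
readings = rng.normal(loc=1.0, scale=0.5, size=8)   # toy readings near an intruder
bits = [node_bit(r, tau=0.5) for r in readings]
print(bits, fuse(bits, k=4))
```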
Abstract:
In modern wireline and wireless communication systems, the Viterbi decoder is one of the most compute-intensive and essential elements. Each standard requires a different configuration of the Viterbi decoder, hence there is a need to design a flexible, reconfigurable Viterbi decoder that supports different configurations on a single platform. In this paper we present a reconfigurable Viterbi decoder which can be reconfigured for standards such as WCDMA, CDMA2000, IEEE 802.11, DAB, DVB, and GSM. Different parameters such as code rate, constraint length, generator polynomials, and truncation length can be configured to map any of the above-mentioned standards. Our design provides higher throughput and scalable power consumption in the various configurations of the reconfigurable Viterbi decoder. The power and throughput can also be optimized for different standards.
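The configurability described above can be illustrated in software. Below is a minimal, hedged Python sketch (not the paper's hardware architecture) of a hard-decision Viterbi decoder whose constraint length and generator polynomials, and hence code rate, are passed in as parameters.

```python
def conv_encode(msg_bits, polys, K):
    """Rate-1/len(polys) convolutional encoder; appends K-1 zero tail bits."""
    state, out = 0, []
    for b in list(msg_bits) + [0] * (K - 1):
        reg = (b << (K - 1)) | state                      # newest bit at the MSB
        out += [bin(reg & g).count("1") & 1 for g in polys]
        state = reg >> 1                                  # drop the oldest bit
    return out

def viterbi_decode(rx_bits, polys, K):
    """Hard-decision Viterbi decoder matching conv_encode's conventions."""
    n, n_states = len(polys), 1 << (K - 1)
    INF = float("inf")
    metric = [0.0] + [INF] * (n_states - 1)               # start in the all-zero state
    paths = [[] for _ in range(n_states)]
    for t in range(len(rx_bits) // n):
        rx = rx_bits[t * n:(t + 1) * n]
        new_metric, new_paths = [INF] * n_states, [None] * n_states
        for state in range(n_states):
            if metric[state] == INF:
                continue
            for b in (0, 1):
                reg = (b << (K - 1)) | state
                out = [bin(reg & g).count("1") & 1 for g in polys]
                nxt = reg >> 1
                m = metric[state] + sum(o != r for o, r in zip(out, rx))
                if m < new_metric[nxt]:
                    new_metric[nxt], new_paths[nxt] = m, paths[state] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(n_states), key=lambda s: metric[s])
    return paths[best]

# Example configuration: the widely used K = 7, rate-1/2 code with octal
# generators 171 and 133 (used, e.g., in IEEE 802.11).
msg = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
coded = conv_encode(msg, polys=(0o171, 0o133), K=7)
print(viterbi_decode(coded, polys=(0o171, 0o133), K=7)[:len(msg)] == msg)  # True
```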
Abstract:
The problem of constructing space-time (ST) block codes over a fixed, desired signal constellation is considered. In this situation, there is a tradeoff between the transmission rate as measured in constellation symbols per channel use and the transmit diversity gain achieved by the code. The transmit diversity is a measure of the rate of polynomial decay of the pairwise error probability of the code with increasing signal-to-noise ratio (SNR). In the setting of a quasi-static channel model, let $n_t$ denote the number of transmit antennas and T the block interval. For any $n_t \le T$, a unified construction of $n_t \times T$ ST codes is provided here, for a class of signal constellations that includes the familiar pulse-amplitude (PAM), quadrature-amplitude (QAM), and $2^K$-ary phase-shift-keying (PSK) modulations as special cases. The construction is optimal as measured by the rate-diversity tradeoff and can achieve any given integer point on the rate-diversity tradeoff curve. An estimate of the coding gain realized is given. Other results presented here include i) an extension of the optimal unified construction to the multiple fading block case, ii) a version of the optimal unified construction in which the underlying binary block codes are replaced by trellis codes, iii) the provision of a linear dispersion form for the underlying binary block codes, iv) a Gray-mapped version of the unified construction, and v) a generalization of the construction to the S-ary case corresponding to constellations of size $S^K$. Items ii) and iii) are aimed at simplifying the decoding of this class of ST codes.
Abstract:
This thesis consists of an introduction, four research articles, and an appendix. The thesis studies relations between two different approaches to the continuum limit of models of two-dimensional statistical mechanics at criticality. The approach of conformal field theory (CFT) can be thought of as the algebraic classification of some basic objects in these models; it has been successfully used by physicists since the 1980s. The other approach, Schramm-Loewner evolutions (SLEs), is a recently introduced set of mathematical methods for studying random curves or interfaces occurring in the continuum limit of the models. The first and second included articles argue, on the basis of statistical mechanics, what would be a plausible relation between SLEs and conformal field theory. The first article studies multiple SLEs: several random curves simultaneously in a domain. The proposed definition is compatible with a natural commutation requirement suggested by Dubédat. The curves of a multiple SLE may form different topological configurations, ``pure geometries''. We conjecture a relation between the topological configurations and the CFT concepts of conformal blocks and operator product expansions. Example applications of multiple SLEs include crossing probabilities for percolation and the Ising model. The second article studies SLE variants that represent models with boundary conditions implemented by primary fields. The best known of these, SLE(kappa, rho), is shown to take a simple form in terms of the Coulomb gas formalism of CFT. In the third article the space of local martingales for variants of SLE is shown to carry a representation of the Virasoro algebra. Finding this structure is guided by the relation of SLEs and CFTs in general, but the result is established in a straightforward fashion. This article, too, emphasizes multiple SLEs and proposes a possible way of treating pure geometries in terms of the Coulomb gas. The fourth article states results of applications of the Virasoro structure to the open questions of SLE reversibility and duality; proofs of the stated results are provided in the appendix. The objective is an indirect computation of certain polynomial expected values. Provided that these expected values exist, in generic cases they are shown to possess the desired properties, thus giving support for both reversibility and duality.
Abstract:
The widespread deployment of commercial-scale cellulosic ethanol currently hinges on developing and evaluating scalable processes whilst broadening feedstock options. This study investigates whole Eucalyptus grandis trees as a potential feedstock and demonstrates dilute acid pre-treatment (with steam explosion) followed by a pre-saccharification and simultaneous saccharification and fermentation (PSSF) process as a suitable, scalable strategy for the production of bioethanol. Biomass was pre-treated in dilute H2SO4 at laboratory scale (0.1 kg) and pilot scale (10 kg) to evaluate the effect of the combined severity factor (CSF) on pre-treatment effectiveness. Subsequently, pilot-scale pre-treated residues (15 wt.%) were converted to ethanol in a PSSF process at 2 L and 300 L scales. Good second-order polynomial correlations of CSF with hemicellulose removal and with glucan digestibility were recorded, with a minimum R² of 0.91. The laboratory-scale 72 h glucan digestibility and glucose yield were 68.0% and 51.3%, respectively, from biomass pre-treated at 190 °C/15 min/4.8 wt.% H2SO4. Pilot-scale pre-treatment (180 °C/15 min/2.4 wt.% H2SO4 followed by steam explosion) delivered higher glucan digestibility (71.8%) and glucose yield (63.6%). However, the ethanol yields using PSSF were calculated at 82.5 and 113 kg/ton of dry biomass for the pilot and laboratory scales, respectively. © 2016 Society of Chemical Industry and John Wiley & Sons, Ltd
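The combined severity factor referred to above is commonly computed with the Chum/Overend expression; the exact variant used by the authors is not stated in the abstract, so the sketch below is illustrative only.

```python
# Illustrative combined severity factor, assuming the standard form
# CSF = log10( t * exp((T - 100) / 14.75) ) - pH, with t in minutes and T in °C.
import math

def combined_severity_factor(t_min, temp_c, pH, t_ref=100.0, omega=14.75):
    """Combined severity factor for dilute-acid pretreatment (standard form)."""
    return math.log10(t_min * math.exp((temp_c - t_ref) / omega)) - pH

# Example with the pilot-scale conditions quoted above (180 °C, 15 min) and a
# hypothetical liquor pH of 1.0.
print(combined_severity_factor(15, 180.0, 1.0))
```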
Abstract:
The purpose of this article is to report the experience of design and testing of orifice-plate-based flow measuring systems for the evaluation of air leakages in components of air conditioning systems. Two of the flow measuring stations were designed with beta values of 0.405 and 0.418. The third was a dual-path unit with orifice plates of beta values 0.613 and 0.525. The flow rates covered by all four ranged from 4 to 94 l/s, and the Reynolds numbers ranged from 5600 to 76,000. The coefficients of discharge were evaluated and compared with the Stolz equation. Measured $C_d$ values are generally higher than those obtained from the equation, the deviations being larger in the low Reynolds number region. Further, it is observed that a second-degree polynomial is inadequate to relate the pressure drop and flow rate. The lower Reynolds number limits set by standards appear to be somewhat conservative.
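The Stolz equation mentioned above is the empirical orifice discharge-coefficient correlation of ISO 5167. A hedged Python sketch of its commonly cited form follows; the exact version and tap geometry used by the authors are not stated, so treat the coefficients as illustrative.

```python
# Commonly cited form of the Stolz correlation for the orifice discharge
# coefficient; L1 and L2p are the dimensionless tap distances (zero for
# corner taps). Illustrative only.
def stolz_cd(beta, Re_D, L1=0.0, L2p=0.0):
    return (0.5959
            + 0.0312 * beta**2.1
            - 0.1840 * beta**8
            + 0.0029 * beta**2.5 * (1e6 / Re_D) ** 0.75
            + 0.0900 * L1 * beta**4 / (1.0 - beta**4)
            - 0.0337 * L2p * beta**3)

# Example: one of the reported plates (beta = 0.405) near the low end of the
# tested Reynolds-number range.
print(stolz_cd(0.405, 5600))
```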
Abstract:
The objective is to present the formulation of the numerically integrated modified virtual crack closure integral (MVCCI) technique for concentrically and eccentrically stiffened panels for the computation of strain-energy release rate and stress intensity factor based on linear elastic fracture mechanics principles. Fracture analysis of cracked stiffened panels under combined tensile, bending, and shear loads has been conducted by employing the stiffened plate/shell finite element model MQL9S2. This model can be used to analyze plates with arbitrarily located concentric/eccentric stiffeners without increasing the total number of degrees of freedom of the plate element. Parametric studies on fracture analysis of stiffened plates under combined tensile and moment loads have been conducted. Based on the results of the parametric studies, polynomial curve fitting has been carried out to obtain best-fit equations corresponding to each of the stiffener positions. These equations can be used for the computation of the stress intensity factor for cracked stiffened plates subjected to tensile and moment loads for a given plate size, stiffener configuration, and stiffener position without conducting finite element analysis.
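When post-processing MVCCI results, the strain-energy release rate is typically converted to a stress intensity factor through the standard LEFM identity $K_I = \sqrt{G E'}$, with $E' = E$ in plane stress and $E' = E/(1-\nu^2)$ in plane strain. The sketch below illustrates only this textbook relation, not the paper's element formulation.

```python
# Standard LEFM conversion of a mode-I strain-energy release rate G to a
# stress intensity factor K_I; illustrative values only.
import math

def k_from_g(G, E, nu=0.3, plane_strain=False):
    E_eff = E / (1.0 - nu**2) if plane_strain else E
    return math.sqrt(G * E_eff)

# Example: G = 50 J/m^2 in an aluminium-like panel (E = 70 GPa), plane stress.
print(k_from_g(50.0, 70e9) / 1e6, "MPa*sqrt(m)")
```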
Abstract:
The possibility of applying two approximate methods for determining the salient features of the response of undamped non-linear spring-mass systems subjected to a step input is examined. The results obtained on the basis of these approximate methods are compared with the exact results that are available for some particular types of spring characteristics. The extension of the approximate methods to non-linear systems with general polynomial restoring-force characteristics is indicated.
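One exact benchmark of the kind mentioned above follows from an energy balance: for an undamped system $m\ddot{x} + f(x) = F_0$ starting from rest, the peak displacement $x_{max}$ satisfies $\int_0^{x_{max}} (F_0 - f(s))\,ds = 0$. A short illustrative Python sketch (function names are mine, not the paper's methods):

```python
# Peak step response of an undamped nonlinear spring-mass system via the
# energy balance F0*x_max = integral_0^{x_max} f(s) ds.
from scipy.integrate import quad
from scipy.optimize import brentq

def peak_displacement(f, F0, x_hi):
    g = lambda x: F0 * x - quad(f, 0.0, x)[0]
    return brentq(g, 1e-9, x_hi)   # assumes a single positive root in (0, x_hi)

# Linear spring f(x) = k x recovers the classical dynamic load factor of 2:
print(peak_displacement(lambda x: 4.0 * x, F0=1.0, x_hi=10.0))   # ~0.5 = 2*F0/k
# A hardening cubic spring overshoots less:
print(peak_displacement(lambda x: 4.0 * x + 10.0 * x**3, F0=1.0, x_hi=10.0))
```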
Abstract:
The paper deals with a linearization technique in non-linear oscillations for systems that are governed by second-order non-linear ordinary differential equations. The method is based on approximating the non-linear function by a linear function such that the error is least in the weighted mean-square sense. The method has been applied to cubic, sine, hyperbolic sine, and odd polynomial types of non-linearities, and the results obtained are more accurate than those given by existing linearization methods.
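For a restoring force f(x), weight w(x), and amplitude range [-A, A], minimizing the weighted mean-square error of the replacement f(x) ≈ kx gives $k = \int w f x\,dx \big/ \int w x^2\,dx$. A minimal illustrative sketch (the paper's specific weights are not reproduced here):

```python
# Equivalent linear stiffness by weighted mean-square linearization.
import numpy as np

def equivalent_stiffness(f, A, w=lambda x: 1.0, n=2001):
    x = np.linspace(-A, A, n)
    wx = np.vectorize(w)(x)
    # The grid spacing cancels in the ratio of the two discretized integrals.
    return np.sum(wx * f(x) * x) / np.sum(wx * x**2)

# Cubic (Duffing-type) nonlinearity f(x) = x + 0.5 x^3 over amplitude A = 1,
# uniform weight: k = 1 + 0.3 = 1.3.
print(equivalent_stiffness(lambda x: x + 0.5 * x**3, 1.0))   # ~1.3
```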
Abstract:
In recent years a large number of investigators have devoted their efforts to the study of flow and heat transfer in rarefied gases, using the BGK [1] model or the Boltzmann kinetic equation. The velocity moment method, which is based on an expansion of the distribution function as a series of orthogonal polynomials in velocity space, has been applied to the linearized problem of shear flow and heat transfer by Mott-Smith [2] and Wang Chang and Uhlenbeck [3]. Gross, Jackson and Ziering [4] have improved greatly upon this technique by expressing the distribution function in terms of half-range functions, and it is this feature that leads to the rapid convergence of the method. The full-range moments method [4] has been modified by Bhatnagar [5] and then applied to plane Couette flow using the BGK model. Bhatnagar and Srivastava [6] have also studied the heat transfer in plane Couette flow using the linearized BGK equation. On the other hand, the half-range moments method has been applied by Gross and Ziering [7] to heat transfer between parallel plates using the Boltzmann equation for hard-sphere molecules, and by Ziering [8] to shear and heat flow using the Maxwell molecular model. Along different lines, a moment method has been applied by Lees and Liu [9] to heat transfer in Couette flow using Maxwell's transfer equation rather than the Boltzmann equation for the distribution function. An iteration method has been developed by Willis [10] and applied to non-linear heat transfer problems using the BGK model, with the zeroth iteration being taken as the solution of the collisionless kinetic equation. Krook [11] has also used the moment method to formulate the equivalent continuum equations and has pointed out that if the effects of molecular collisions are described by the BGK model, exact numerical solutions of many rarefied gas-dynamic problems can be obtained. Recently, such numerical solutions have been obtained by Anderson [12] for the non-linear heat transfer in Couette flow.
Abstract:
The force constants of H2 and Li2 are evaluated by employing their extended Hartree-Fock wavefunctions and a polynomial fit of their force curves. It is suggested that, with incomplete multiconfiguration Hartree-Fock wavefunctions, force constants calculated from the energy derivatives are numerically more accurate than those obtained from the derivatives of the Hellmann-Feynman forces. It is observed that electrons relax during the nuclear vibrations in such a fashion as to facilitate the nuclear motions.
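A small sketch (with hypothetical data, not the paper's wavefunction results) of the energy-derivative route mentioned above: fit the computed energy curve with a polynomial and take its second derivative at the minimum as the force constant.

```python
# Force constant from a polynomial fit of a potential-energy curve:
# k = d^2 E / dR^2 evaluated at the equilibrium bond length R_e.
import numpy as np

# Hypothetical (R, E) samples of an energy curve near its minimum (toy data
# built from a quadratic with curvature 0.37 centred at R = 1.4).
R = np.linspace(1.2, 1.8, 7)
E = 0.5 * 0.37 * (R - 1.4) ** 2 - 1.17

p = np.poly1d(np.polyfit(R, E, 4))        # 4th-degree polynomial fit
fine = np.linspace(1.2, 1.8, 2001)
R_e = fine[np.argmin(p(fine))]            # crude minimum locator
k = p.deriv(2)(R_e)                       # force constant = curvature at R_e
print(R_e, k)                             # ~1.4 and ~0.37 for this toy curve
```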