45 results for Leibniz Algebras with Polynomial Identities


Relevance: 30.00%

Abstract:

We consider the problem of minimizing the total completion time on a single batch processing machine. The set of jobs to be scheduled can be partitioned into a number of families, where all jobs in the same family have the same processing time. The machine can process at most B jobs simultaneously as a batch, and the processing time of a batch is equal to the processing time of the longest job in the batch. We analyze the properties of an optimal schedule and develop a dynamic programming algorithm of polynomial time complexity when the number of job families is fixed. The research is motivated by the problem of scheduling burn-in ovens in the semiconductor industry.
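
As an illustration of the objective defined above, the following sketch (our own helper, not the paper's dynamic program) computes the total completion time of a given sequence of batches, using the stated rule that a batch's processing time equals that of its longest job.

    def total_completion_time(batches):
        """batches: a list of batches, each a list of job processing times
        (at most B jobs per batch). Returns the total completion time."""
        clock = 0.0
        total = 0.0
        for batch in batches:
            clock += max(batch)          # the batch runs as long as its longest job
            total += clock * len(batch)  # every job in the batch completes at `clock`
        return total

    # Example: three jobs of length 2 batched together, then two jobs of length 5.
    print(total_completion_time([[2, 2, 2], [5, 5]]))  # 2*3 + 7*2 = 20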

Relevance: 30.00%

Abstract:

We consider the problem of matching people to items, where each person ranks a subset of items in order of preference, possibly involving ties. There are several notions of optimality for how best to match a person to an item; in particular, popularity is a natural and appealing notion of optimality. A matching M* is popular if there is no matching M such that the number of people who prefer M to M* exceeds the number who prefer M* to M. However, popular matchings do not always provide an answer to the problem of determining an optimal matching, since there are simple instances that do not admit popular matchings. This motivates the following extension of the popular matchings problem: Given a graph G = (A ∪ B, E), where A is the set of people and B is the set of items, and a list <c_1, ..., c_|B|> of upper bounds on the number of copies of each item, does there exist <x_1, ..., x_|B|> such that, for each i, having x_i copies of the i-th item, where 1 <= x_i <= c_i, enables the resulting graph to admit a popular matching? In this paper we show that the above problem is NP-hard, even when each c_i is 1 or 2. We give a polynomial time algorithm for a variant of the above problem in which the total increase in copies is bounded by an integer k.
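
To make the popularity criterion above concrete, here is a small sketch (illustrative only; the data layout is ours) that counts how many people prefer one matching to another, which is exactly the comparison underlying the definition of a popular matching.

    def prefer_count(pref, M1, M2):
        """pref[a] maps an item to its rank for person a (lower is better);
        being unmatched is treated as worst. Returns the number of people
        who strictly prefer their item in M1 to their item in M2."""
        WORST = float("inf")
        return sum(
            1 for a in pref
            if pref[a].get(M1.get(a), WORST) < pref[a].get(M2.get(a), WORST)
        )

    def more_popular(pref, M, M_star):
        """True if M would defeat M_star in a head-to-head popularity vote."""
        return prefer_count(pref, M, M_star) > prefer_count(pref, M_star, M)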

Relevance: 30.00%

Abstract:

For N = 2^a transmit antennas, the maximum rate (in complex symbols per channel use) of all the Quasi-Orthogonal Designs (QODs) reported in the literature is a/2^(a-1). In this paper, we report double-symbol-decodable Space-Time Block Codes with rate (a-1)/2^(a-2) for N = 2^a transmit antennas. In particular, our codes for 8 and 16 transmit antennas offer rates 1 and 3/4 respectively, whereas the known QODs offer only 3/4 and 1/2 respectively. Our construction is based on representations of Clifford algebras and is applicable to any number of transmit antennas. We study the diversity sum and diversity product of our codes. We show that our diversity sum is larger than that of all known QODs, and hence our codes perform better than the comparable QODs at low SNRs for identical spectral efficiency. We provide simulation results for various spectral efficiencies.
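
The claimed rates can be checked directly; the snippet below (ours, purely for illustration) evaluates both formulas for 8 and 16 antennas.

    for a in (3, 4):                     # N = 2**a transmit antennas
        qod_rate = a / 2**(a - 1)        # best known QOD rate
        new_rate = (a - 1) / 2**(a - 2)  # rate of the codes reported here
        print(f"N = {2**a}: QOD rate = {qod_rate}, new rate = {new_rate}")
    # N = 8:  QOD rate = 0.75, new rate = 1.0
    # N = 16: QOD rate = 0.5, new rate = 0.75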

Relevance: 30.00%

Abstract:

This study aims to determine optimal locations of dual trailing-edge flaps and blade stiffness to achieve minimum hub vibration levels in a helicopter, with a low penalty in terms of required trailing-edge flap control power. An aeroelastic analysis based on finite elements in space and time is used in conjunction with an optimal control algorithm to determine the flap time history for vibration minimization. Using the aeroelastic analysis, it is found that the objective functions are highly nonlinear and polynomial response surface approximations cannot describe the objectives adequately. A neural network is then used for approximating the objective functions for optimization. Pareto-optimal points minimizing both helicopter vibration and flap power are obtained using the response surface and neural network metamodels. The two metamodels give useful improved designs, resulting in about 27% reduction in hub vibration and about 45% reduction in flap power. However, the design obtained using the response surface is less sensitive to small perturbations in the design variables.
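
As a rough sketch of how Pareto-optimal designs such as those mentioned above can be extracted from a set of evaluated candidates (our illustration, not the study's optimizer), a simple dominance filter over (hub vibration, flap power) pairs suffices:

    def pareto_front(points):
        """points: list of (hub_vibration, flap_power) tuples, both minimized.
        Returns the subset not dominated by any other point."""
        return [
            p for p in points
            if not any(q != p and q[0] <= p[0] and q[1] <= p[1] for q in points)
        ]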

Relevance: 30.00%

Abstract:

The initial motivation for this paper is to discuss a more concrete approach to an approximation theorem of Axler and Shields, which says that the uniform algebra on the closed unit disc D̄ generated by z and h, where h is a nowhere-holomorphic harmonic function on D that is continuous up to ∂D, equals C(D̄). The abstract tools used by Axler and Shields make harmonicity of h an essential condition for their result. We use the concepts of plurisubharmonicity and polynomial convexity to show that, in fact, the same conclusion is reached if h is replaced by h + R, where R is a non-harmonic perturbation whose Laplacian is "small" in a certain sense.

Relevance: 30.00%

Abstract:

Let G be an undirected graph with a positive real weight on each edge. It is shown that the number of minimum-weight cycles of G is bounded above by a polynomial in the number of edges of G. A similar bound holds if we wish to count the number of cycles with weight at most a constant multiple of the minimum weight of a cycle of G.

Relevance: 30.00%

Abstract:

We address the problem of local-polynomial modeling of smooth time-varying signals with unknown functional form, in the presence of additive noise. The problem is formulated in the time domain, and the polynomial coefficients are estimated in the pointwise minimum mean square error (PMMSE) sense. The choice of the window length for local modeling introduces a bias-variance tradeoff, which we solve optimally by using the intersection-of-confidence-intervals (ICI) technique. The combination of the local polynomial model and the ICI technique gives rise to an adaptive signal model equipped with a time-varying, PMMSE-optimal window length whose performance is superior to that obtained by using a fixed window length. We also evaluate the sensitivity of the ICI technique with respect to the confidence interval width. Simulation results on electrocardiogram (ECG) signals show that, at 0 dB signal-to-noise ratio (SNR), one can achieve about 12 dB improvement in SNR. Monte Carlo performance analysis shows that the performance is comparable to that of basic wavelet techniques. For 0 dB SNR, the adaptive window technique yields about 2-3 dB higher SNR than wavelet regression techniques, and for SNRs greater than 12 dB, the wavelet techniques yield about 2 dB higher SNR.
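
A minimal sketch of the local-polynomial estimate described above (the ICI window selection is not shown; the function and its parameters are our own illustration):

    import numpy as np

    def local_poly_estimate(y, n, half_window, degree=2):
        """Fit a polynomial of the given degree to the noisy samples y inside a
        symmetric window around index n and return the fitted value at n."""
        lo, hi = max(0, n - half_window), min(len(y), n + half_window + 1)
        t = np.arange(lo, hi) - n             # local time axis centred at n
        coeffs = np.polyfit(t, y[lo:hi], degree)
        return np.polyval(coeffs, 0.0)        # estimate of the signal at time n

Varying half_window trades bias against variance, which is the tradeoff the ICI technique resolves adaptively.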

Relevance: 30.00%

Abstract:

Web services are now a key ingredient of the software services offered by software enterprises. Many standardized web services are now available as commodity offerings from web service providers. An important problem for a web service requester is the web service composition problem, which involves selecting the right mix of web service offerings to execute an end-to-end business process. Web service offerings are now available in bundled form as composite web services and, more recently, volume discounts are also on offer, based on the number of executions of web services requested. In this paper, we develop efficient algorithms for the web service composition problem in the presence of composite web service offerings and volume discounts. We model this problem as a combinatorial auction with volume discounts. We first develop efficient polynomial time algorithms for the case where the end-to-end service involves a linear workflow of web services, and then for the case of a tree workflow of web services.
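
For the linear-workflow case, a dynamic program over the sequence of tasks conveys the flavour of the approach; the sketch below is our simplification (composite offerings are modelled as covering contiguous subsequences of tasks, and volume discounts are ignored), not the paper's algorithm.

    def min_cost_linear_workflow(num_tasks, offers):
        """offers: list of (start, end, cost), meaning the offering covers the
        contiguous tasks start..end-1 (0-based, end exclusive) at that cost.
        Returns the minimum total cost to cover tasks 0..num_tasks-1, or None."""
        INF = float("inf")
        best = [0.0] + [INF] * num_tasks
        for end in range(1, num_tasks + 1):
            for s, e, cost in offers:
                if e == end and best[s] + cost < best[end]:
                    best[end] = best[s] + cost
        return best[num_tasks] if best[num_tasks] < INF else None

    # Example: 3 tasks, three single-task offers plus a bundle covering tasks 0-1.
    print(min_cost_linear_workflow(3, [(0, 1, 5), (1, 2, 6), (2, 3, 4), (0, 2, 9)]))  # 13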

Relevance: 30.00%

Abstract:

A method of testing for parametric faults of analog circuits, based on a polynomial representation of the fault-free function of the circuit, is presented. The response of the circuit under test (CUT) is estimated as a polynomial in the applied input voltage at relevant frequencies apart from DC. Classification of the CUT is based on a comparison of the estimated polynomial coefficients with those of the fault-free circuit. The method needs very little augmentation of the circuit to make it testable, as only output parameters are used for classification. The procedure is shown to uncover several parametric faults causing smaller than 5% deviations from the nominal values. Fault diagnosis based upon the sensitivity of the polynomial coefficients at relevant frequencies is also proposed.
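
A rough sketch of the classification step described above (the tolerance, the names, and the use of a least-squares fit are our assumptions): fit a polynomial to the measured CUT response versus input voltage at one test frequency and compare the coefficients with those of the fault-free (golden) circuit.

    import numpy as np

    def classify_cut(v_in, v_out, golden_coeffs, degree, rel_tol=0.05):
        """Estimate the CUT's polynomial coefficients from (input, output) samples
        at one test frequency; flag the circuit as faulty if any coefficient
        deviates from the golden value by more than rel_tol (relative)."""
        coeffs = np.polyfit(v_in, v_out, degree)
        deviation = np.abs(coeffs - golden_coeffs) / np.maximum(np.abs(golden_coeffs), 1e-12)
        return "faulty" if np.any(deviation > rel_tol) else "fault-free"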

Relevance: 30.00%

Abstract:

A Space-Time Block Code (STBC) in K symbols (variables) is called a g-group decodable STBC if its maximum-likelihood decoding metric can be written as a sum of g terms such that each term is a function of a subset of the K variables and each variable appears in only one term. In this paper we provide a general structure of the weight matrices of multi-group decodable codes using Clifford algebras. Without assuming the number of variables in each group to be the same, a method of explicitly constructing the weight matrices of full-diversity, delay-optimal g-group decodable codes is presented for an arbitrary number of antennas. For the special case of N_t = 2^a we construct two subclasses of codes: (i) a class of 2a-group decodable codes with rate a/2^(a-1), which is, equivalently, a class of Single-Symbol Decodable codes, and (ii) a class of (2a-2)-group decodable codes with rate (a-1)/2^(a-2), i.e., a class of Double-Symbol Decodable codes. Simulation results show that the DSD codes of this paper perform better than previously known Quasi-Orthogonal Designs.

Relevance: 30.00%

Abstract:

The repeated or closely spaced eigenvalues and corresponding eigenvectors of a matrix are usually very sensitive to a perturbation of the matrix, which makes capturing the behavior of these eigenpairs very difficult. A similar difficulty is encountered in solving the random eigenvalue problem when a matrix with random elements has a set of clustered eigenvalues in its mean. In addition, the methods to solve the random eigenvalue problem often differ in characterizing the problem, which leads to different interpretations of the solution. Thus, the solutions obtained from different methods become mathematically incomparable. These two issues, the difficulty of solving and the non-unique characterization, are addressed here. A different approach is used: instead of tracking a few individual eigenpairs, the corresponding invariant subspace is tracked. The spectral stochastic finite element method is used for analysis, where the polynomial chaos expansion is used to represent the random eigenvalues and eigenvectors. However, the main concept of tracking the invariant subspace remains mostly independent of any such representation. The approach is successfully implemented in the response prediction of a system with repeated natural frequencies. It is found that tracking only an invariant subspace could be sufficient to build a modal-based reduced-order model of the system.
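
For reference, the polynomial chaos representation mentioned above takes the generic form (standard notation, not taken from the paper):

    \lambda(\theta) \approx \sum_{i=0}^{P} \lambda_i \, \Psi_i(\xi(\theta)), \qquad
    \phi(\theta) \approx \sum_{i=0}^{P} \phi_i \, \Psi_i(\xi(\theta)),

where the \Psi_i are orthogonal polynomials in the underlying random variables \xi(\theta) and the deterministic coefficients \lambda_i and \phi_i are to be determined; the invariant-subspace tracking described above is largely independent of this particular representation.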

Relevance: 30.00%

Abstract:

Let S be a smooth real surface in C^2 and let p ∈ S be a point at which the tangent plane is a complex line. How does one determine whether or not S is locally polynomially convex at such a p, i.e., at a CR singularity? Even when the order of contact of the tangent plane with S at p equals 2, no clean characterisation exists; difficulties are posed by parabolic points. Hence, we study non-parabolic CR singularities. We show that the presence or absence of Bishop discs around certain non-parabolic CR singularities is completely determined by a Maslov-type index. This result subsumes all known facts about Bishop discs around order-two, non-parabolic CR singularities. Sufficient conditions for Bishop discs have earlier been investigated at CR singularities having high order of contact with the tangent plane. These results relied upon a subharmonicity condition, which fails in many simple cases. Hence, we look beyond potential theory and refine certain ideas going back to Bishop.

Relevance: 30.00%

Abstract:

We propose a distribution-free approach to the study of random geometric graphs. The distribution of vertices follows a Poisson point process with intensity function nf(·), where n ∈ N and f is a probability density function on R^d. A vertex located at x connects via directed edges to other vertices that are within a cut-off distance r_n(x). We prove strong law results for (i) the critical cut-off function such that, almost surely, the graph does not contain any node with out-degree zero for sufficiently large n, and (ii) the maximum and minimum vertex degrees. We also provide a characterization of the cut-off function for which the number of nodes with out-degree zero converges in distribution to a Poisson random variable. We illustrate this result for a class of densities with compact support that have at most polynomial rates of decay to zero. Finally, we state a sufficient condition for an enhanced version of the above graph to be almost surely connected eventually.
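
To make the model concrete, the following sketch (our illustration; a constant cut-off is used for simplicity, whereas the paper allows a location-dependent r_n(x)) samples such a directed graph on the unit square and counts the nodes of out-degree zero.

    import numpy as np

    def count_out_degree_zero(n, cutoff, rng=np.random.default_rng(0)):
        """Vertices follow a Poisson point process of intensity n on [0,1]^2
        (i.e. uniform f); each vertex sends directed edges to all other vertices
        within distance `cutoff`. Returns the number of out-degree-zero nodes."""
        num_points = rng.poisson(n)
        pts = rng.uniform(size=(num_points, 2))
        dists = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        np.fill_diagonal(dists, np.inf)           # ignore self-distances
        out_degree = (dists <= cutoff).sum(axis=1)
        return int((out_degree == 0).sum())

    print(count_out_degree_zero(n=500, cutoff=0.05))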

Relevance: 30.00%

Abstract:

In this paper we discuss SU(N) Chern-Simons theories at level k with both fermionic and bosonic vector matter. In particular, we present an exact calculation of the free energy of the N = 2 supersymmetric model (with one chiral field) for all values of the 't Hooft coupling in the large N limit. This is done by using a generalization of the standard Hubbard-Stratonovich method, because the SUSY model contains higher order polynomial interactions.

Relevance: 30.00%

Abstract:

We consider a complex, additive, white Gaussian noise channel with flat fading. We study its diversity order versus transmission rate for some known power allocation schemes. The capacity region is divided into three regions. For one power allocation scheme, the diversity order is exponential throughout the capacity region. For the selective channel inversion (SCI) scheme, the diversity order is exponential in the low and high rate regions but polynomial in the mid rate region. For the fast fading case, we also provide a new upper bound on the block error probability and a power allocation scheme that minimizes it. The diversity order behaviour of this scheme is the same as for SCI, but it provides a lower BER than the other policies.