959 results for Zeros of orthogonal polynomials
Abstract:
Modern electric power systems present numerous operational challenges. In electric power distribution systems, fault location is one of the greatest of these challenges, owing to the extensive branching, the presence of long single-phase laterals, load dynamics and other inherent particularities. Among the obstacles encountered, the influence of the fault impedance is one of the largest, significantly affecting the application of traditional location methods, since the magnitude of the fault currents is similar to that of the load current. In this context, this thesis aimed to develop an intelligent system for locating high-impedance faults, based on the orthogonal-component decomposition technique for preprocessing the variables and on fuzzy inference to capture the nonlinearities of distribution systems with distributed generation. The data for training the intelligent system were obtained from computer simulations of a real feeder, considering a nonlinear model of the high-impedance fault. The resulting fuzzy system was able to estimate fault distances with a mean absolute error below 500 m and a maximum absolute error of about 1.5 km on a feeder approximately 18 km long. For most occurrences, these results correspond to an accuracy within ±10%.
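As a purely illustrative sketch of the inference stage (our own toy example; the feature names, membership functions and rule consequents are hypothetical and not those of the thesis, which derives them from orthogonal-component decomposition of simulated feeder data), a minimal Mamdani-style estimator in Python might look like this:

```python
# Minimal Mamdani-style sketch of a fuzzy distance estimator (illustrative only).
# Feature names, membership breakpoints and rule consequents are hypothetical.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def estimate_fault_distance(v_drop, i_residual):
    """Fuzzy estimate of fault distance (km) from two normalized features."""
    # Antecedent memberships (hypothetical partitions of the [0, 1] range).
    low_v, high_v = tri(v_drop, -0.5, 0.0, 0.6), tri(v_drop, 0.4, 1.0, 1.5)
    low_i, high_i = tri(i_residual, -0.5, 0.0, 0.6), tri(i_residual, 0.4, 1.0, 1.5)

    # Rules: (firing strength, crisp consequent distance in km).
    rules = [
        (min(high_v, high_i), 2.0),   # hypothetical rule: strong signature -> short distance
        (min(high_v, low_i), 8.0),
        (min(low_v, high_i), 12.0),
        (min(low_v, low_i), 17.0),    # hypothetical rule: weak signature -> long distance
    ]

    num = sum(w * d for w, d in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else None

print(estimate_fault_distance(0.8, 0.2))  # distance estimate in km
```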
Abstract:
In some circumstances, there may be no scientific model of the relationship between X and Y that can be specified in advance and indeed the objective of the investigation may be to provide a ‘curve of best fit’ for predictive purposes. In such an example, the fitting of successive polynomials may be the best approach. There are various strategies to decide on the polynomial of best fit depending on the objectives of the investigation.
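A minimal sketch of this strategy in Python (the synthetic data and candidate degrees are our own assumptions) fits successive polynomials and selects a degree by held-out prediction error:

```python
# Sketch of fitting successive polynomials and picking a degree by held-out error.
# The data and the candidate degrees are illustrative assumptions, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 60)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)  # synthetic (X, Y) data

# Hold out every third point to score predictive performance.
test = np.arange(x.size) % 3 == 0
x_tr, y_tr, x_te, y_te = x[~test], y[~test], x[test], y[test]

best = None
for deg in range(1, 9):                       # successive polynomial fits
    coef = np.polyfit(x_tr, y_tr, deg)        # least-squares fit of given degree
    rmse = np.sqrt(np.mean((np.polyval(coef, x_te) - y_te) ** 2))
    print(f"degree {deg}: held-out RMSE = {rmse:.3f}")
    if best is None or rmse < best[1]:
        best = (deg, rmse)

print("selected degree:", best[0])
```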
Abstract:
The application of orthogonal frequency-division multiplexing (OFDM) in an optical burst-switched system employing a single fast-switching sampled grating distributed Bragg reflector (SG-DBR) laser is demonstrated experimentally. The effect of filter profiles compatible with 50, 25, and 12.5 GHz wavelength-division multiplexing grids on the system is investigated, with system performance examined in terms of error vector magnitude per subcarrier for OFDM burst data beginning at various times after a switching event. Additionally, the placement of the OFDM training sequence within the data burst and its effect on the system is investigated.
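A hedged sketch of the reported metric, error vector magnitude per subcarrier, computed here on synthetic QPSK symbols rather than measured burst data:

```python
# Hedged sketch: error vector magnitude (EVM) per OFDM subcarrier, computed from
# received and reference QPSK symbols. The burst layout and symbol values are
# synthetic stand-ins for the measured data in the paper.
import numpy as np

rng = np.random.default_rng(1)
n_subcarriers, n_symbols = 64, 200

# Reference QPSK constellation points per OFDM symbol and subcarrier.
bits = rng.integers(0, 4, size=(n_symbols, n_subcarriers))
ref = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))

# "Received" symbols: reference plus additive noise (stand-in for the channel).
rx = ref + (rng.normal(scale=0.1, size=ref.shape)
            + 1j * rng.normal(scale=0.1, size=ref.shape))

# EVM per subcarrier: RMS error vector normalised by RMS reference magnitude.
evm = np.sqrt(np.mean(np.abs(rx - ref) ** 2, axis=0)
              / np.mean(np.abs(ref) ** 2, axis=0))
print("EVM (%) on first 5 subcarriers:", np.round(100 * evm[:5], 2))
```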
Abstract:
The following problem, suggested by Laguerre's Theorem (1884), remains open: characterize all real sequences $\{\mu_k\}_{k=0}^{\infty}$ which have the zero-diminishing property; that is, if $p(x)=\sum_{k=0}^{n} a_k x^k$ is any real polynomial, then $\sum_{k=0}^{n}\mu_k a_k x^k$ has no more real zeros than $p(x)$. In this paper this problem is solved under the additional assumption of a weak growth condition on the sequence $\{\mu_k\}_{k=0}^{\infty}$, namely $\lim_{n\to\infty}|\mu_n|^{1/n}<\infty$. More precisely, it is established that the real sequence $\{\mu_k\}_{k\ge 0}$ is a weakly increasing zero-diminishing sequence if and only if there exist $\sigma\in\{+1,-1\}$ and an entire function $\Phi(z)=b\,e^{az}\prod_{n\ge 1}(1+z/\alpha_n)$, with $a,b\in\mathbb{R}$, $b\neq 0$, $\alpha_n>0$ for all $n\ge 1$ and $\sum_{n\ge 1}1/\alpha_n<\infty$, such that $\mu_k=\sigma^k/\Phi(k)$ for all $k\ge 0$.
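A simple consistency check (our addition, not taken from the paper): with the empty product, $\Phi(z)=b\,e^{az}$ and $\sigma=+1$, the sequence is geometric, $\mu_k=b^{-1}e^{-ak}$, and then
\[
\sum_{k=0}^{n}\mu_k a_k x^k=\frac{1}{b}\sum_{k=0}^{n}a_k\bigl(e^{-a}x\bigr)^k=\frac{1}{b}\,p\bigl(e^{-a}x\bigr),
\]
which has exactly as many real zeros as $p(x)$, so this sequence is (trivially) zero-diminishing, consistent with the characterization above.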
Abstract:
This paper describes a method of signal preprocessing under active monitoring. Suppose we want to solve the inverse problem of obtaining the response of a medium to one powerful signal, which is equivalent to obtaining the transmission function of the medium, but do not have the opportunity to conduct such an experiment (it might be too expensive or harmful to the environment). In practice, the problem can be reduced to obtaining the transmission function of the medium. In this case we can conduct a series of experiments of relatively low power and superpose the response signals. However, this approach entails a considerable loss of information (especially in the high-frequency domain) due to fluctuations of the phase, the frequency and the starting time of each individual experiment. The preprocessing technique presented in this paper allows us to substantially restore the response of the medium and consequently to find a better estimate of the transmission function. The technique is based on expanding the initial signal into a system of orthogonal functions.
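A minimal sketch of the final step only, expanding a noisy record in a system of orthogonal functions (Legendre polynomials here) as a smoothed estimate; the test signal and truncation order are assumptions, and the paper's technique additionally compensates phase, frequency and start-time fluctuations:

```python
# Hedged sketch: represent a noisy superposed response in a truncated system of
# orthogonal functions (Legendre polynomials) and use the expansion as a smoothed
# estimate. The test signal and basis size are illustrative assumptions.
import numpy as np
from numpy.polynomial import legendre as L

rng = np.random.default_rng(2)
t = np.linspace(-1.0, 1.0, 500)
response = np.exp(-4 * t**2) * np.cos(8 * np.pi * t)          # stand-in medium response
stacked = response + rng.normal(scale=0.3, size=t.size)       # noisy superposed record

deg = 40                                                      # truncation order
coef = L.legfit(t, stacked, deg)                              # least-squares projection
restored = L.legval(t, coef)                                  # expansion evaluated on t

rmse = np.sqrt(np.mean((restored - response) ** 2))
print(f"RMSE of restored response vs. truth: {rmse:.3f}")
```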
Abstract:
Estimates Calculating Algorithms (ECA) have a long history of application to recognition problems; moreover, they have formed a basis for algebraic recognition theory. Yet the use of ECA polynomials has been limited to theoretical reasoning because of the complexity of their construction and optimization. A new recognition method, "AVO-polynom", based upon an ECA polynomial of simple structure, is described.
Abstract:
AMS Subj. Classification: 65D07, 65D30.
Abstract:
Mihail Konstantinov, Vesela Pasheva, Petko Petkov - Some numerical problems arising when the computer system MATLAB is used in teaching are considered: the evaluation of trigonometric functions, raising a matrix to a power, the spectral analysis of low-order integer matrices, and the computation of the roots of algebraic equations. The numerical difficulties encountered can be explained by the peculiarities of the binary floating-point arithmetic used.
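The paper works in MATLAB; the following Python/NumPy snippet (examples chosen by us, not taken from the paper) illustrates the same binary floating-point effects:

```python
# Illustrative Python/NumPy analogues of the floating-point effects described above
# (the paper itself works in MATLAB; these specific examples are our own).
import numpy as np

# 1. Trigonometric functions: the stored value of pi is already rounded, so sin(pi) != 0.
print(np.sin(np.pi))                          # about 1.22e-16 instead of 0

# 2. Matrix powers: entries of large integer powers exceed the 2**53 range in which
#    double precision is exact, so the computed result is rounded.
a = np.array([[2, 1], [1, 1]], dtype=float)
print(np.linalg.matrix_power(a, 90)[0, 0])

# 3. Roots of algebraic equations: the triple root of (x - 1)^3 comes back as a
#    cluster of nearby (possibly complex) values rather than exactly 1, 1, 1.
print(np.roots([1, -3, 3, -1]))
```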
Abstract:
In 1900 E. B. Van Vleck proposed a very efficient method to compute the Sturm sequence of a polynomial p(x) ∈ Z[x] by triangularizing one of Sylvester's matrices of p(x) and its derivative p′(x). That method works fine only for the case of complete sequences, provided no pivots take place. In 1917, A. J. Pell and R. L. Gordon pointed out this "weakness" in Van Vleck's theorem and rectified it, but did not extend his method so that it also works in the cases of: (a) complete Sturm sequences with pivot, and (b) incomplete Sturm sequences. Despite its importance, the Pell-Gordon Theorem for polynomials in Q[x] has been totally forgotten and, to our knowledge, is referenced here for the first time in the literature. In this paper we go over Van Vleck's theorem and method, modify slightly the formula of the Pell-Gordon Theorem and present a general triangularization method, called the Van Vleck-Pell-Gordon method, that correctly computes in Z[x] polynomial Sturm sequences, both complete and incomplete.
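For orientation, a sketch (our addition) of the classical construction of a Sturm sequence by repeated polynomial division in exact rational arithmetic; this is not the matrix-triangularization method of the paper:

```python
# Classical Sturm sequence via repeated polynomial remainders, in exact arithmetic.
# Reference sketch for orientation only; the Van Vleck-Pell-Gordon method instead
# triangularizes a Sylvester-type matrix to obtain the sequence in Z[x].
from fractions import Fraction

def polyrem(num, den):
    """Remainder of num / den; polynomials as coefficient lists, highest degree first."""
    num = [Fraction(c) for c in num]
    while len(num) >= len(den) and any(num):
        factor = num[0] / Fraction(den[0])
        for i, d in enumerate(den):
            num[i] -= factor * Fraction(d)
        num.pop(0)
    return num or [Fraction(0)]

def sturm_sequence(p):
    """Sturm sequence p0, p1, ... with p0 = p and p1 = p'."""
    dp = [c * (len(p) - 1 - i) for i, c in enumerate(p[:-1])]     # derivative
    seq = [[Fraction(c) for c in p], [Fraction(c) for c in dp]]
    while len(seq[-1]) > 1 or seq[-1][0] != 0:
        rem = [-c for c in polyrem(seq[-2], seq[-1])]             # negated remainder
        if all(c == 0 for c in rem):
            break
        seq.append(rem)
    return seq

# Example: p(x) = x^3 - 3x - 1, which has three real roots.
for q in sturm_sequence([1, 0, -3, -1]):
    print([str(c) for c in q])
```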
Abstract:
ACM Computing Classification System (1998): F.2.1, G.1.5, I.1.2.
Abstract:
In recent decades, analogue modelling has been used in geology to improve our knowledge of how geological structures nucleate, how they grow and what the main controlling factors in these processes are. The use of this tool in the oil industry, to support seismic interpretation and mainly to search for structural traps, helped disseminate it in the literature. Nowadays, physical modelling has a wide field of application, ranging from landslides to granite emplacement along shear zones. In this work, we use physical modelling to study the influence of mechanical stratification on the nucleation and development of faults and fractures in a context of orthogonal and conjugate oblique basins. To simulate a mechanical stratigraphy we used different materials with distinct physical properties, such as gypsum powder, glass beads, dry clay and quartz sand. Some experiments were run with a PIV (Particle Image Velocimetry) system, an instrument that tracks the movement of the particles at each deformation increment. Two series of experiments were studied: i) Series MO: we tested the development of normal faults in a basin orthogonal to the extension direction. Experiments were run taking into account changes in materials and strata thickness, and some included syntectonic sedimentation. We registered differences in the nucleation and growth of faults in layers with different rheological behavior. The gypsum powder layer behaves in a more competent manner, generating a great number of high-angle fractures. These fractures evolve into faults that exhibit a steeper dip than where they cross less competent layers, such as the quartz sand. This competent layer exhibits faulted blocks arranged in a typical domino style. Cataclastic breccias developed along the faults affecting the competent layers and showed different evolutionary histories, depending on the deforming stratigraphic sequence; ii) Series MOS2: normal faults were analyzed in conjugate sub-basins (oblique to the extension direction) developed in sequences with and without rheological contrast. In the experiments with rheological contrast, two important grabens developed along the faulted margins, differing from the sub-basins with mechanical stratigraphy. Both experiments developed oblique fault systems and, in the area of sub-basin intersection, the fault traces became strongly curved.
Abstract:
This research work aims to study the algebraic theory of monic matrix polynomials, including the definitions, concepts and properties of block eigenvalues, block eigenvectors and solvents of P(X). We investigate the main relations between the matrix polynomial and the Companion and Vandermonde matrices. We study the construction of matrix polynomials with prescribed solvents and the extension of the Power Method to compute block eigenvalues and solvents of P(X). Through the relationship between the dominant block eigenvalue of the Companion matrix and the dominant solvent of P(X), it is possible to obtain convergence of the algorithm to the dominant solvent of the matrix polynomial. We illustrate with numerical examples for different cases of convergence.
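A small illustration (our own, with arbitrary coefficient matrices) of the block Companion matrix of a monic matrix polynomial P(X) = X^2 + A1*X + A0, whose eigenvalues are the latent roots of P:

```python
# Hedged sketch: block Companion matrix of a monic matrix polynomial
# P(lambda) = lambda^2 I + A1 lambda + A0. Its eigenvalues are the latent roots,
# i.e. the values lambda with det(P(lambda)) = 0. A0, A1 are arbitrary examples.
import numpy as np

n = 2
A0 = np.array([[2.0, 1.0], [0.0, 3.0]])
A1 = np.array([[1.0, 0.0], [1.0, 1.0]])

# Block Companion matrix C = [[0, I], [-A0, -A1]] (size 2n x 2n).
C = np.block([[np.zeros((n, n)), np.eye(n)],
              [-A0,              -A1]])

for lam in np.linalg.eigvals(C):
    P_lam = lam**2 * np.eye(n) + lam * A1 + A0
    print("lambda =", np.round(lam, 4),
          "|det P(lambda)| =", f"{abs(np.linalg.det(P_lam)):.2e}")
```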
Abstract:
We consider Sklyanin algebras $S$ with 3 generators, that is, quadratic algebras over a field $\K$ with 3 generators $x,y,z$ subject to the 3 relations $pxy+qyx+rz^2=0$, $pyz+qzy+rx^2=0$ and $pzx+qxz+ry^2=0$, where $p,q,r\in\K$. This class of algebras has enjoyed much attention. In particular, using tools from algebraic geometry, Feigin and Odesskii \cite{odf}, and Artin, Tate and Van den Bergh, showed that if at least two of the parameters $p$, $q$ and $r$ are non-zero and at least two of the three numbers $p^3$, $q^3$ and $r^3$ are distinct, then $S$ is Artin-Schelter regular. More specifically, $S$ is Koszul and has the same Hilbert series as the algebra of commutative polynomials in 3 indeterminates (PHS). It has become commonly accepted that it is impossible to achieve the same objective by purely algebraic and combinatorial means such as the Groebner basis technique. The main purpose of this paper is to trace the combinatorial meaning of the properties of Sklyanin algebras, such as Koszulity, PBW, PHS and Calabi-Yau, and to give a new constructive proof of the above facts due to Artin, Tate and Van den Bergh. Further, we study a wider class of Sklyanin algebras, namely the situation when all parameters of the relations may be different. We call them generalized Sklyanin algebras. We classify up to isomorphism all generalized Sklyanin algebras with the same Hilbert series as commutative polynomials in 3 variables. We show that generalized Sklyanin algebras in general position have a Golod-Shafarevich Hilbert series (with the exception of the case of the field with two elements).
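For reference, a standard fact used implicitly above (our addition, not a result of the paper): the Hilbert series of the commutative polynomial ring in 3 variables, the PHS referred to in the abstract, is
\[
H_{\mathrm{PHS}}(t)=\frac{1}{(1-t)^3}=\sum_{n\ge 0}\binom{n+2}{2}t^n=1+3t+6t^2+10t^3+\cdots,
\]
so an algebra with PHS has graded components of dimensions $1,3,6,10,\dots$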
Abstract:
Applications are subject to a continuous evolution process with a profound impact on their underlying data model, hence requiring frequent updates to the applications' class structure and to the database structure as well. This twofold problem, schema evolution and instance adaptation, usually known as database evolution, is addressed in this thesis. Additionally, we address concurrency and error recovery problems with a novel meta-model and its aspect-oriented implementation. Modern object-oriented databases provide features that help programmers deal with object persistence, as well as related problems such as database evolution, concurrency and error handling. In most systems there are transparent mechanisms to address these problems; nonetheless, the database evolution problem still requires some human intervention, which consumes much of programmers' and database administrators' work effort. Earlier research has demonstrated that aspect-oriented programming (AOP) techniques enable the development of flexible and pluggable systems. In those earlier works, the schema evolution and instance adaptation problems were addressed as database management concerns. However, none of that research focused on orthogonal persistent systems. We argue that AOP techniques are well suited to address these problems in orthogonal persistent systems. Regarding concurrency and error recovery, earlier research showed that only syntactic obliviousness between the base program and aspects is possible. Our meta-model and framework follow an aspect-oriented approach focused on the object-oriented orthogonal persistent context. The proposed meta-model is characterized by its simplicity, in order to achieve efficient and transparent database evolution mechanisms. Our meta-model supports multiple versions of a class structure by applying a class versioning strategy, thus enabling bidirectional application compatibility among versions of each class structure. That is to say, the database structure can be updated while earlier applications continue to work, as do later applications that know only the updated class structure. The specific characteristics of orthogonal persistent systems, as well as a metadata enrichment strategy within the application's source code, complete the conception of the meta-model and have motivated our research work. To test the feasibility of the approach, a prototype was developed. Our prototype is a framework that mediates the interaction between applications and the database, providing them with orthogonal persistence mechanisms. These mechanisms are introduced into applications as an aspect, in the aspect-oriented sense. Objects do not require the extension of any superclass, the implementation of an interface, or a particular annotation. Parametric type classes are also handled correctly by our framework. However, classes that belong to the programming environment must not be handled as versionable, due to restrictions imposed by the Java Virtual Machine. Regarding concurrency support, the framework provides applications with a multithreaded environment that supports database transactions and error recovery. The framework keeps applications oblivious to the database evolution problem, as well as to persistence. Programmers can update the applications' class structure because the framework will produce a new version of it at the database metadata layer.
Using our XML-based pointcut/advice constructs, the framework's instance adaptation mechanism is extended, hence keeping the framework oblivious to this problem as well. The potential development gains provided by the prototype were benchmarked. In our case study, the results confirm that the mechanisms' transparency has positive repercussions on programmer productivity, simplifying the entire evolution process at the application and database levels. The meta-model itself was also benchmarked in terms of complexity and agility. Compared with other meta-models, it requires fewer meta-object modifications in each schema evolution step. Other types of tests were carried out in order to validate the robustness of the prototype and meta-model. To perform these tests, we used an OO7 small-size database, chosen for its data-model complexity. Since the developed prototype offers some features that were not observed in other known systems, performance benchmarks were not possible. However, the developed benchmark is now available for future performance comparisons with equivalent systems. In order to test our approach in a real-world scenario, we developed a proof-of-concept application. This application was initially developed without any persistence mechanisms; using our framework and minor changes to the application's source code, we added these mechanisms. Furthermore, we tested the application in a schema evolution scenario. This real-world experience with our framework showed that applications remain oblivious to persistence and database evolution. In this case study, our framework proved to be a useful tool for programmers and database administrators. Performance issues and the single Java Virtual Machine concurrency model are the major limitations found in the framework.
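The thesis targets Java and aspect-oriented techniques; as a language-neutral sketch of the class-versioning and instance-adaptation idea only (our own illustration, with hypothetical names), in Python:

```python
# Illustrative sketch (our own, in Python; the thesis works with Java and aspects):
# class-version metadata plus an instance-adaptation hook, so objects stored under
# an older class version can be read by applications using a newer one, and back.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class ClassVersion:
    name: str
    version: int
    fields: list

# Registries of class versions and of adapters between consecutive versions
# (these play the role of the database metadata layer in the meta-model).
versions: Dict[tuple, ClassVersion] = {}
adapters: Dict[tuple, Callable[[dict], dict]] = {}

def register_version(cv: ClassVersion):
    versions[(cv.name, cv.version)] = cv

def register_adapter(name: str, src: int, dst: int, fn: Callable[[dict], dict]):
    adapters[(name, src, dst)] = fn

def adapt(name: str, data: dict, stored: int, wanted: int) -> dict:
    """Walk version by version, applying the registered adapters."""
    step = 1 if wanted > stored else -1
    while stored != wanted:
        data = adapters[(name, stored, stored + step)](data)
        stored += step
    return data

# Example: Person v1 has 'name'; v2 splits it into 'first' and 'last'.
register_version(ClassVersion("Person", 1, ["name"]))
register_version(ClassVersion("Person", 2, ["first", "last"]))
register_adapter("Person", 1, 2,
                 lambda d: dict(zip(("first", "last"),
                                    (d["name"].split(" ", 1) + [""])[:2])))
register_adapter("Person", 2, 1,
                 lambda d: {"name": f'{d["first"]} {d["last"]}'.strip()})

old_record = {"name": "Ada Lovelace"}                   # object stored under v1
print(adapt("Person", old_record, stored=1, wanted=2))  # read by a v2 application
```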
Abstract:
The introduction of delays into ordinary or partial differential equation models is well known to facilitate the production of rich dynamics ranging from periodic solutions through to spatio-temporal chaos. In this paper we consider a class of scalar partial differential equations with a delayed threshold nonlinearity which admits exact solutions for equilibria, periodic orbits and travelling waves. Importantly we show how the spectra of periodic and travelling wave solutions can be determined in terms of the zeros of a complex analytic function. Using this as a computational tool to determine stability we show that delays can have very different effects on threshold systems with negative as opposed to positive feedback. Direct numerical simulations are used to confirm our bifurcation analysis, and to probe some of the rich behaviour possible for mixed feedback.
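As a toy illustration of delay-induced behaviour in a threshold system with negative feedback (our own scalar model, not the partial differential equations analysed in the paper), simulated by forward Euler:

```python
# Toy illustration (our own): a scalar equation with a delayed threshold nonlinearity,
#     u'(t) = -u(t) + kappa * H(h - u(t - tau)),
# integrated by forward Euler. With this negative-feedback sign the delay produces
# sustained relaxation oscillations around the threshold.
import numpy as np

kappa, h, tau = 1.0, 0.5, 1.0       # gain, threshold, delay (assumed values)
dt, t_end = 1e-3, 30.0
n, d = int(t_end / dt), int(tau / dt)

u = np.zeros(n + 1)
u[0] = 0.1                          # constant history u(t) = 0.1 for t <= 0
for k in range(n):
    delayed = u[k - d] if k >= d else u[0]
    feedback = kappa if h - delayed > 0 else 0.0        # Heaviside threshold
    u[k + 1] = u[k] + dt * (-u[k] + feedback)

t = np.linspace(0.0, t_end, n + 1)
# Report the oscillation by listing the times where u crosses the threshold h.
crossings = t[1:][np.diff(np.sign(u - h)) != 0]
print("threshold crossings at t ~", np.round(crossings[:6], 2))
```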