941 results for Bivariate Hermite polynomials
Abstract:
The present study defines certain measures of income inequality for truncated distributions and characterizes probability distributions using the functional form of these measures; it also extends some measures of inequality and stability to higher dimensions, characterizes bivariate models using these concepts, and estimates some measures of inequality using Bayesian techniques. The thesis defines certain measures of income inequality for truncated distributions and studies the effect of truncation upon these measures. An important measure used in reliability theory to assess the stability of a component is the residual entropy function; this concept can be used to advantage as a measure of inequality for truncated distributions. The geometric mean is a handy tool in the measurement of income inequality, and the geometric vitality function, being the geometric mean of the truncated random variable, can likewise be utilized to measure inequality of truncated distributions. The study also addresses the estimation of the Lorenz curve, the Gini index and the variance of logarithms for the Pareto distribution using Bayesian techniques.
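For the Pareto distribution the Gini index has the closed form G = 1/(2α − 1) for shape α > 1/2, and with known scale the Gamma prior is conjugate for α, so a posterior sample of the Gini index is easy to draw. The following sketch illustrates the idea (it is not the thesis's own procedure; the prior parameters and sample sizes are illustrative assumptions):

```python
import math
import random

def gini_posterior(data, x_m=1.0, a0=1.0, b0=1.0, draws=2000, rng=None):
    """Posterior draws of the Pareto Gini index G = 1/(2*alpha - 1).

    Pareto(alpha, x_m) likelihood with known scale x_m; a Gamma(a0, b0)
    prior on alpha is conjugate: the posterior is Gamma(a0 + n, b0 + T)
    with T = sum(log(x_i / x_m)).
    """
    rng = rng or random.Random(0)
    n = len(data)
    T = sum(math.log(x / x_m) for x in data)
    # random.gammavariate takes (shape, scale); the posterior rate is b0 + T
    return [1.0 / (2.0 * rng.gammavariate(a0 + n, 1.0 / (b0 + T)) - 1.0)
            for _ in range(draws)]

rng = random.Random(42)
alpha_true = 3.0                       # true Gini = 1/(2*3 - 1) = 0.2
data = [rng.paretovariate(alpha_true) for _ in range(5000)]
draws = gini_posterior(data)
print(sum(draws) / len(draws))         # posterior mean, close to 0.2
```

With 5000 observations the posterior concentrates tightly around the true shape, so the posterior mean of the Gini index lands near 0.2.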
Abstract:
The study deals with the distribution theory and applications of concomitants of order statistics from the Morgenstern family of bivariate distributions. The Morgenstern system comprises all cumulative distributions of the form F_{X,Y}(x,y) = F_X(x) F_Y(y)[1 + α(1 − F_X(x))(1 − F_Y(y))], −1 ≤ α ≤ 1. The system provides a very general expression for a bivariate distribution from which members can be derived by substituting expressions for any desired set of marginal distributions. The thesis opens with a brief description of the basic distribution theory and a quick review of the existing literature. Order statistics play a very important role in statistical theory and practice, and accordingly a remarkably large body of literature has been devoted to their study; they help to develop special methods of statistical inference that are valid for a broad class of distributions. The present study develops the general distribution theory of M_{k,[r:m]} from the Morgenstern family of distributions and discusses some applications in inference, in particular estimation of the parameter of the marginal variable Y in the Morgenstern-type uniform distributions.
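Sampling from a Morgenstern (FGM) distribution with uniform marginals is straightforward by the conditional-distribution method: given U = u, the conditional CDF of V is v[1 + α(1 − 2u)(1 − v)], a quadratic in v that inverts in closed form. A minimal sketch (the function names and the correlation check are illustrative, not from the study):

```python
import math
import random

def fgm_pair(alpha, rng):
    """One (U, V) draw from the FGM copula C(u,v) = uv[1 + a(1-u)(1-v)]."""
    u, t = rng.random(), rng.random()
    b = alpha * (1.0 - 2.0 * u)          # coefficient in the conditional CDF
    if abs(b) < 1e-12:
        return u, t                      # conditional is uniform at u = 1/2
    # Solve b*v^2 - (1+b)*v + t = 0 for the root lying in [0, 1]
    v = ((1.0 + b) - math.sqrt((1.0 + b) ** 2 - 4.0 * b * t)) / (2.0 * b)
    return u, v

rng = random.Random(1)
pairs = [fgm_pair(0.8, rng) for _ in range(20000)]
mu = sum(u for u, _ in pairs) / len(pairs)
mv = sum(v for _, v in pairs) / len(pairs)
cov = sum((u - mu) * (v - mv) for u, v in pairs) / len(pairs)
var_u = sum((u - mu) ** 2 for u, _ in pairs) / len(pairs)
var_v = sum((v - mv) ** 2 for _, v in pairs) / len(pairs)
corr = cov / math.sqrt(var_u * var_v)
print(round(corr, 3))        # for uniform FGM marginals, corr(U, V) = alpha/3
```

The sample correlation approaches α/3, the well-known ceiling on dependence that makes the FGM family suitable only for moderate association.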
Abstract:
A new procedure for the classification of lower-case English-language characters is presented in this work. The character image is binarised, and the binary image is further partitioned into sixteen smaller areas, called cells. Each cell is assigned a name depending upon the contour present in the cell and the occupancy of the image contour in the cell. A data-reduction procedure called filtering is adopted to eliminate undesirable redundant information, reducing complexity during further processing steps. The filtered data is fed into a primitive extractor, where extraction of primitives is done. Syntactic methods are employed for the classification of the character. A decision tree is used for the interaction of the various components in the scheme, like primitive extraction and character recognition. A character is recognized by the primitive-by-primitive construction of its description. Open-ended inventories are used for including variants of the characters and for adding new members to the general class. Computer implementation of the proposal is discussed at the end using handwritten character samples. Results are analyzed and suggestions for future studies are made. The advantages of the proposal are discussed in detail.
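The first two stages — binarising the image and naming each of the sixteen cells by how much of the contour falls in it — can be sketched as follows (the grid size, threshold and cell labels here are illustrative assumptions, not the scheme's actual alphabet of cell names):

```python
def binarise(image, threshold=128):
    """Grey-level image (list of rows of pixel values) -> 0/1 matrix;
    dark pixels (below threshold) become 1."""
    return [[1 if px < threshold else 0 for px in row] for row in image]

def cell_occupancy(binary, n=4):
    """Split the binary image into an n x n grid of cells and label each
    cell by contour occupancy: 'E' empty, 'L' lightly filled, 'H' heavy."""
    rows, cols = len(binary), len(binary[0])
    labels = []
    for ci in range(n):
        row_labels = []
        for cj in range(n):
            r0, r1 = ci * rows // n, (ci + 1) * rows // n
            c0, c1 = cj * cols // n, (cj + 1) * cols // n
            area = (r1 - r0) * (c1 - c0)
            filled = sum(binary[r][c]
                         for r in range(r0, r1) for c in range(c0, c1))
            frac = filled / area
            row_labels.append('E' if frac == 0 else 'L' if frac < 0.5 else 'H')
        labels.append(row_labels)
    return labels

# An 8x8 toy "image": a dark vertical stroke (value 0) in the leftmost columns
image = [[0, 0, 255, 255, 255, 255, 255, 255] for _ in range(8)]
print(cell_occupancy(binarise(image)))
```

The left column of cells comes out 'H' (the stroke) and the rest 'E', giving the kind of symbolic description the primitive extractor would consume.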
Abstract:
In this thesis, the concept of the reversed lack of memory property and its generalizations is studied. We generalize this property using operations other than addition; in particular, an associative binary operator "*" is considered. The univariate reversed lack of memory property is generalized using this binary operator, and a class of probability distributions which includes the Type 3 extreme value, power function, reflected Weibull and negative Pareto distributions is characterized (Asha and Rejeesh (2009)). We also define the almost reversed lack of memory property and consider distributions with reversed periodic hazard rate under the binary operation. Further, we give a bivariate extension of the generalized reversed lack of memory property and characterize a class of bivariate distributions which includes the characterized extension (CE) model of Roy (2002a) as well as the bivariate reflected Weibull and power function distributions. We prove the equivalence of local proportionality of the reversed hazard rate and the generalized reversed lack of memory property. The study of uncertainty is a subject of interest common to reliability, survival analysis, actuarial science, economics, business and many other fields. However, in many realistic situations, uncertainty is not necessarily related to the future but can also refer to the past. Recently, Di Crescenzo and Longobardi (2009) introduced a new measure of information called dynamic cumulative entropy, which is suitable for measuring information when uncertainty is related to the past; it is a dual of the cumulative residual entropy, which relates to the uncertainty of the future lifetime of a system. We redefine this measure on the whole real line and study its properties. We also discuss the implications of the generalized reversed lack of memory property for dynamic cumulative entropy and past entropy. Finally, we extend the idea of the reversed lack of memory property to the discrete set-up.
Here we investigate the discrete class of distributions characterized by the discrete reversed lack of memory property. The concept is extended to the bivariate case, and bivariate distributions characterized by this property are also presented. The implications of this property for the discrete reversed hazard rate, mean past life and discrete past entropy are also investigated.
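In one common formulation (stated here as a hedged reconstruction, since the abstract does not display the equations), the reversed lack of memory property and its generalization under an associative binary operation read:

```latex
% Reversed lack of memory property for a CDF F:
F(x + t) = F(x)\,F(t) \quad \text{for all admissible } x, t,
% and its generalization under an associative binary operation *:
F(x * t) = F(x)\,F(t).
% For example, taking x * t = xt (ordinary multiplication) on (0,1)
% forces F(x) = x^{\theta}, the power function distribution, one of
% the characterized families.
```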
Abstract:
This thesis is entitled "Modelling and Analysis of Recurrent Event Data with Multiple Causes". Survival data is a term used for data that measure the time to occurrence of an event; in survival studies this time is generally referred to as the lifetime. Recurrent event data are commonly encountered in longitudinal studies when individuals are followed to observe repeated occurrences of certain events. In many practical situations, individuals under study are exposed to failure due to more than one cause, and the eventual failure can be attributed to exactly one of these causes. The proposed model is useful in real-life situations for studying the effect of covariates on recurrences of certain events due to different causes. In Chapter 3, an additive hazards model for the gap-time distributions of recurrent event data with multiple causes is introduced, and parameter estimation and asymptotic properties are discussed. In Chapter 4, a shared frailty model for the analysis of bivariate competing risks data is presented, and estimation procedures for the shared gamma frailty model, without covariates and with covariates, using the EM algorithm are discussed. In Chapter 6, two nonparametric estimators for the bivariate survivor function of paired recurrent event data are developed; their asymptotic properties are studied, the proposed estimators are applied to a real-life data set, and simulation studies are carried out to assess their efficiency.
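A generic additive hazards formulation for cause-specific gap times (a standard form shown for orientation; the thesis's exact model may differ) is:

```latex
% Cause-specific hazard for cause j, given covariate vector Z:
\lambda_j(t \mid Z) = \lambda_{0j}(t) + \beta_j^{\top} Z,
% where \lambda_{0j} is an unspecified baseline hazard for cause j
% and \beta_j measures the additive effect of the covariates on the
% recurrence intensity due to cause j.
```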
Abstract:
This thesis is an outcome of the studies carried out by the author on the Equatorial Undercurrent and the Equatorial Jet, an interesting and unique phenomenon discovered recently in the Indian Ocean (Wyrtki, 1973). The main objective of the thesis is a detailed investigation of the seasonal, latitudinal and longitudinal variation of the Equatorial Undercurrent in the Indian Ocean, and also of the Equatorial Jet, through mapping the vertical distribution of the oceanographic properties across the equator along various longitudes for all the months of a year, between 5°N and 5°S, utilising the oceanographic data collected during the International Indian Ocean Expedition and subsequently in the equatorial Indian Ocean. As the distribution of the hydrographic properties gives only a qualitative identification of the Undercurrent, a novel technique of computing the zonal flux through the bivariate distribution of salinity and thermosteric anomaly, introduced by Montgomery and Stroup (1962), is adopted in order to obtain a quantitative picture of the variation of the Equatorial Undercurrent and the Equatorial Jet. Finally, an attempt is made to give a plausible explanation of the features observed.
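The Montgomery and Stroup technique amounts to partitioning the water column into bivariate (salinity, thermosteric anomaly) classes and summing the zonal volume transport in each class. A schematic sketch (the class widths and the synthetic section data are illustrative assumptions):

```python
from collections import defaultdict

def zonal_flux_by_class(samples, ds=0.5, danom=20.0):
    """Accumulate zonal transport (velocity * cell area) in bivariate
    (salinity, thermosteric anomaly) classes.

    samples: iterable of (salinity, thermosteric anomaly,
             zonal velocity [m/s], cell area [m^2]).
    Returns {(salinity_class, anomaly_class): transport [m^3/s]}.
    """
    flux = defaultdict(float)
    for s, anom, u, area in samples:
        key = (int(s // ds), int(anom // danom))   # bivariate class indices
        flux[key] += u * area
    return dict(flux)

# Tiny synthetic "section": two grid cells in a salty eastward core,
# one fresher westward cell.
section = [
    (35.2, 120.0,  0.8, 1.0e6),
    (35.3, 125.0,  0.6, 1.0e6),
    (34.1, 180.0, -0.2, 1.0e6),
]
flux = zonal_flux_by_class(section)
print(flux)
```

Cells falling in the same (salinity, anomaly) class pool their transport, so eastward cores such as the Undercurrent and the Jet stand out as classes with large positive flux.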
Abstract:
The present work is organized into six chapters. The bivariate extension of the Burr system is the subject matter of Chapter II: the author introduces a general structure for the family in two dimensions and presents some properties of such a system; some new distributions, which are bivariate extensions of the univariate distributions in Burr (1942), are also presented there. Chapter III concentrates on characterization problems for different forms of the bivariate Burr system. A detailed study of the distributional properties of each member of the Burr system has not been undertaken in the literature; with this aim in mind, Chapter IV discusses two forms of the bivariate Burr III distribution. In Chapter V the author considers the type XII, type II and type IX distributions. The work concludes with Chapter VI by pointing out the multivariate extension of the Burr system; in this chapter the concept of multivariate reversed hazard rates, as both scalar and vector quantities, is also introduced.
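For orientation, the univariate Burr (1942) system collects the distribution functions generated by a single differential equation, the best-known member being the Burr XII law (shown here as a standard reference form, not an equation reproduced from the thesis):

```latex
% Burr's generating differential equation for a CDF y = F(x):
\frac{dy}{dx} = y(1 - y)\,g(x, y),
% different choices of g yield the twelve Burr types.  The Burr XII member:
F(x) = 1 - \left(1 + x^{c}\right)^{-k}, \qquad x > 0,\; c, k > 0.
```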
Abstract:
Recently, cumulative residual entropy (CRE) has been found to be a new measure of information that parallels Shannon's entropy (see Rao et al. [Cumulative residual entropy: A new measure of information, IEEE Trans. Inform. Theory 50(6) (2004), pp. 1220–1228] and Asadi and Zohrevand [On the dynamic cumulative residual entropy, J. Stat. Plann. Inference 137 (2007), pp. 1931–1941]). Motivated by this finding, in this paper we introduce a generalized measure of it, namely cumulative residual Renyi's entropy, and study its properties. We also examine it in relation to some applied problems such as weighted and equilibrium models. Finally, we extend this measure to the bivariate set-up and prove certain characterizing relationships to identify different bivariate lifetime models.
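Rao et al.'s cumulative residual entropy replaces the density in Shannon's entropy by the survival function, CRE(X) = −∫ F̄(x) log F̄(x) dx; for an exponential lifetime with rate λ this evaluates to 1/λ. A quick numerical check (the integration grid and the example distribution are illustrative):

```python
import math

def cre(survival, upper, step=1e-3):
    """Numerical CRE: -integral of Fbar(x) * log(Fbar(x)) over [0, upper],
    by the midpoint rule."""
    total, x = 0.0, step / 2.0
    while x < upper:
        fb = survival(x)
        if fb > 0.0:
            total -= fb * math.log(fb) * step
        x += step
    return total

lam = 2.0
value = cre(lambda x: math.exp(-lam * x), upper=20.0)
print(round(value, 4))        # analytic value for Exp(lam): 1/lam = 0.5
```

The Renyi-type generalization studied in the paper replaces the integrand by a power of the survival function; the same numerical scaffold applies.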
Abstract:
In this paper, we examine the relationships between the log odds rate and various reliability measures such as the hazard rate and the reversed hazard rate in the context of repairable systems. We also prove characterization theorems for some families of distributions, viz. the Burr, Pearson and log exponential models. We discuss the properties and applications of the log odds rate in weighted models. Further, we extend the concept to the bivariate set-up and study its properties.
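In the usual formulation, the log odds function of a lifetime X with CDF F and survival function F̄ is LO(x) = log(F(x)/F̄(x)), and differentiating links its rate to the two hazard measures (a standard identity, not reproduced from the paper):

```latex
\mathrm{LOR}(x) = \frac{d}{dx}\,\log\frac{F(x)}{\bar F(x)}
              = \frac{f(x)}{F(x)\,\bar F(x)}
              = h(x) + \tilde r(x),
% where h(x) = f(x)/\bar F(x) is the hazard rate and
% \tilde r(x) = f(x)/F(x) is the reversed hazard rate.
```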
Abstract:
Recently, the reciprocal subtangent has been used as a useful tool to describe the behaviour of a density curve. Motivated by this, in the present article we extend the concept to weighted models. Characterization results are proved for several models, viz. the gamma, Rayleigh, equilibrium, residual lifetime and proportional hazards models. An identity under the weighted distribution is also obtained when the reciprocal subtangent takes the form of a general class of distributions. Finally, extensions of the reciprocal subtangent for weighted models in the bivariate and multivariate cases are introduced and some useful results are proved.
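For reference, the reciprocal subtangent of a density curve f is commonly taken as the negative logarithmic derivative of the density (stated here as a hedged reconstruction, since the abstract does not display the definition):

```latex
\eta(x) = -\frac{f'(x)}{f(x)} = -\frac{d}{dx}\,\log f(x);
% e.g. for the exponential density f(x) = \lambda e^{-\lambda x},
% \eta(x) = \lambda, a constant.
```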
Abstract:
A recurrent iterated function system (RIFS) is a generalization of an IFS and provides non-self-affine fractal sets which are closer to natural objects. In general, its attractor is not a continuous surface in R^3. A recurrent fractal interpolation surface (RFIS) is an attractor of an RIFS which is the graph of a bivariate continuous interpolation function. We introduce a general method of generating recurrent interpolation surfaces which are attractors of RIFSs for any data set on a grid.
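The defining feature of a recurrent IFS is that which map may be applied next depends on which was applied last, governed by a directed transition structure. A minimal planar chaos-game sketch (the maps and transition table are toy assumptions, not the paper's surface construction):

```python
import random

# Three contractive affine maps of the unit square (a Sierpinski-like IFS)
MAPS = [
    lambda p: (0.5 * p[0],       0.5 * p[1]),
    lambda p: (0.5 * p[0] + 0.5, 0.5 * p[1]),
    lambda p: (0.5 * p[0],       0.5 * p[1] + 0.5),
]
# ALLOWED[i] lists which maps may follow map i (the "recurrent" structure);
# allowing every transition would recover an ordinary IFS.
ALLOWED = {0: [0, 1], 1: [1, 2], 2: [0, 2]}

def rifs_orbit(n, burn_in=20, rng=None):
    """Chaos game for the recurrent IFS: after a burn-in, the orbit
    accumulates on (a random cover of) the attractor."""
    rng = rng or random.Random(0)
    p, last = (0.5, 0.5), 0
    pts = []
    for k in range(n + burn_in):
        last = rng.choice(ALLOWED[last])
        p = MAPS[last](p)
        if k >= burn_in:
            pts.append(p)
    return pts

pts = rifs_orbit(5000)
print(len(pts), all(0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 for x, y in pts))
```

Since every map contracts the unit square into itself, the orbit stays inside it; restricting the transitions carves out a non-self-affine subset, which is what makes RIFS attractors flexible enough for interpolation surfaces.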
Abstract:
It is well known that the Stickelberger–Swan theorem is very important for determining the reducibility of polynomials over a binary field. Using this theorem, the parity of the number of irreducible factors has been determined for several kinds of polynomials over a binary field, for instance trinomials, tetranomials, self-reciprocal polynomials and so on. We discuss this problem for type II pentanomials, namely x^m + x^{n+2} + x^{n+1} + x^n + 1 in F_2[x]. Such pentanomials can be used for efficient implementation of multiplication in finite fields of characteristic two. Based on the computation of the discriminant of these pentanomials with integer coefficients, we characterize the parity of the number of irreducible factors over F_2 and establish necessary conditions for the existence of irreducible pentanomials of this kind.
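The quantity at stake — the parity of the number of irreducible factors over F_2 — can be checked directly for small degrees by brute-force trial division, representing polynomials over F_2 as bit masks (an illustrative check, not the discriminant-based method of the paper):

```python
def pdeg(a):
    """Degree of a GF(2)[x] polynomial stored as a bit mask."""
    return a.bit_length() - 1

def pdivmod(a, b):
    """Quotient and remainder in GF(2)[x] (XOR long division)."""
    q = 0
    while a.bit_length() >= b.bit_length():
        s = a.bit_length() - b.bit_length()
        q |= 1 << s
        a ^= b << s
    return q, a

def factor_count(f):
    """Number of irreducible factors of f (with multiplicity) over GF(2),
    by trial division in increasing degree, like integer factorization."""
    count, p = 0, 2                     # p = 0b10 is the polynomial x
    while pdeg(f) > 0:
        if 2 * pdeg(p) > pdeg(f):       # no divisor of degree <= deg(f)/2:
            count += 1                  # the remaining f is irreducible
            break
        q, r = pdivmod(f, p)
        if r == 0:
            f, count = q, count + 1     # keep p to pick up multiplicities
        else:
            p += 1
    return count

def type2_pentanomial(m, n):
    """x^m + x^{n+2} + x^{n+1} + x^n + 1 as a bit mask (requires m > n + 2)."""
    return (1 << m) | (0b111 << n) | 1

# m = 4, n = 1: x^4 + x^3 + x^2 + x + 1 is irreducible over F_2
print(factor_count(type2_pentanomial(4, 1)))   # 1 factor, so parity is odd
```

The paper's contribution is to read this parity off the discriminant without factoring, which scales to the large m used in finite-field multipliers.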
Abstract:
The thesis first reviews the essential facts about skew polynomials, focusing on shift and q-shift operators in characteristic zero. All concepts and algorithms needed for arithmetic with these objects are given in the first chapter. Some of the data needed to determine solutions can be read off the Newton polygon, a geometric figure associated with the operators. The derivation of these connections is the subject of the second chapter; in this form it is new, in particular for the q-shift case. The third chapter deals with the determination of polynomial and rational solutions of these operators, essentially following the presentation of Mark van Hoeij. The most interesting case for the factorization of (q-)shift operators is that of the so-called (q-)hypergeometric solutions, which correspond directly to right factors of first order. In the fourth chapter, the van Hoeij algorithm is carried over from the shift to the q-shift case; in addition, a substantial improvement of the q-Petkovsek algorithm is derived using the data of the Newton polygon. The fifth chapter is devoted to the computation of general factors: to this end the adjoint operator is first introduced, which allows the computation of left factors, and an algorithm for computing right factors of arbitrary order is then presented, although for higher orders this is impractical in actual use. In almost all of the algorithms presented, solving linear systems over rational function fields arises as an intermediate step, a task for which most computer algebra systems offer no satisfactory solution. For this reason, the last chapter presents an algorithm for this problem based on evaluation and interpolation, which clearly outperforms the standard algorithms in all systems tested.
All of the algorithms in the thesis are implemented in a MuPAD package that accompanies the thesis and allows convenient handling of the objects involved. With this package, many problems can now be solved in MuPAD for which no functions previously existed.
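The evaluation-and-interpolation idea behind the final chapter can be illustrated on the simplest case: the determinant of a matrix with polynomial entries is itself a polynomial of bounded degree, so it can be computed by evaluating at enough points and interpolating (a Python sketch with exact rationals; the thesis's MuPAD implementation handles full linear system solving over rational function fields):

```python
from fractions import Fraction

def det_numeric(m):
    """Exact determinant of a matrix of Fractions (Gaussian elimination)."""
    m = [row[:] for row in m]
    n, sign, det = len(m), 1, Fraction(1)
    for i in range(n):
        piv = next((r for r in range(i, n) if m[r][i] != 0), None)
        if piv is None:
            return Fraction(0)
        if piv != i:
            m[i], m[piv] = m[piv], m[i]
            sign = -sign
        det *= m[i][i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            m[r] = [a - f * b for a, b in zip(m[r], m[i])]
    return det * sign

def poly_mul(a, b):
    res = [Fraction(0)] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            res[i + j] += x * y
    return res

def poly_eval(coeffs, x):
    return sum(Fraction(c) * x ** k for k, c in enumerate(coeffs))

def det_by_interpolation(poly_matrix, deg_bound):
    """Determinant of a matrix of polynomials (coefficient lists, low
    degree first) via evaluation at deg_bound + 1 points followed by
    Lagrange interpolation."""
    xs = [Fraction(k) for k in range(deg_bound + 1)]
    ys = [det_numeric([[poly_eval(p, x) for p in row] for row in poly_matrix])
          for x in xs]
    coeffs = [Fraction(0)] * (deg_bound + 1)
    for xj, yj in zip(xs, ys):
        num, denom = [Fraction(1)], Fraction(1)
        for xk in xs:
            if xk != xj:
                num = poly_mul(num, [-xk, Fraction(1)])
                denom *= xj - xk
        for i, c in enumerate(num):
            coeffs[i] += yj / denom * c
    return coeffs

# det [[x, 1], [1, x]] = x^2 - 1
A = [[[0, 1], [1]], [[1], [0, 1]]]
print(det_by_interpolation(A, 2))   # coefficients of x^2 - 1, low degree first
```

Each evaluation reduces the problem to exact arithmetic over the rationals, which is why this approach beats manipulating rational function entries symbolically throughout.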