116 results for Monotonicity
Abstract:
Background: The evaluation of associations between genotypes and diseases in a case-control framework plays an important role in genetic epidemiology. This paper focuses on the evaluation of the homogeneity of both genotypic and allelic frequencies. The traditional test used to check allelic homogeneity is known to be valid only under Hardy-Weinberg equilibrium, a property that may not hold in practice. Results: We first describe the flaws of the traditional (chi-squared) tests for both allelic and genotypic homogeneity. Besides the known problem of the allelic procedure, we show that whenever these tests are used, an incoherence may arise: sometimes the genotypic homogeneity hypothesis is not rejected, but the allelic hypothesis is. As we argue, this is logically impossible. Some recently proposed methods implicitly rely on the assumption that this does not happen. In an attempt to correct this incoherence, we describe an alternative frequentist approach that is appropriate even when Hardy-Weinberg equilibrium does not hold. It is then shown that the problem remains and is intrinsic to frequentist procedures. Finally, we introduce the Full Bayesian Significance Test to test both hypotheses and prove that the incoherence cannot happen with these new tests. To illustrate this, all five tests are applied to real and simulated datasets. Using power analysis, we show that the Bayesian method is comparable in power to the frequentist one and has the advantage of being coherent. Conclusions: Contrary to more traditional approaches, the Full Bayesian Significance Test for association studies provides a simple, coherent and powerful tool for detecting associations.
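To make the two traditional procedures concrete, here is a minimal Python sketch, with hypothetical genotype counts, of the genotypic and allelic chi-squared tests the paper critiques; the Full Bayesian Significance Test itself is not implemented here.

import numpy as np
from scipy.stats import chi2_contingency

# Genotype counts (AA, Aa, aa) for cases and controls -- hypothetical numbers.
genotypes = np.array([[30, 50, 20],   # cases
                      [25, 50, 25]])  # controls

# Genotypic homogeneity: chi-squared test on the 2x3 genotype table.
chi2_g, p_g, _, _ = chi2_contingency(genotypes)

# Allelic homogeneity: collapse genotypes to allele counts (each AA carries
# two A alleles, each Aa one of each), giving a 2x2 table; this test is only
# valid under Hardy-Weinberg equilibrium.
alleles = np.column_stack([2 * genotypes[:, 0] + genotypes[:, 1],
                           2 * genotypes[:, 2] + genotypes[:, 1]])
chi2_a, p_a, _, _ = chi2_contingency(alleles)

print(f"genotypic: chi2 = {chi2_g:.3f}, p = {p_g:.3f}")
print(f"allelic:   chi2 = {chi2_a:.3f}, p = {p_a:.3f}")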
Abstract:
In electrical impedance tomography, one tries to recover the conductivity inside a physical body from boundary measurements of current and voltage. In many practically important situations, the investigated object has known background conductivity but it is contaminated by inhomogeneities. The factorization method of Andreas Kirsch provides a tool for locating such inclusions. Earlier, it has been shown that under suitable regularity conditions positive (or negative) inhomogeneities can be characterized by the factorization technique if the conductivity or one of its higher normal derivatives jumps on the boundaries of the inclusions. In this work, we use a monotonicity argument to generalize these results: We show that the factorization method provides a characterization of an open inclusion (modulo its boundary) if each point inside the inhomogeneity has an open neighbourhood where the perturbation of the conductivity is strictly positive (or negative) definite. In particular, we do not assume any regularity of the inclusion boundary or set any conditions on the behaviour of the perturbed conductivity at the inclusion boundary. Our theoretical findings are verified by two-dimensional numerical experiments.
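For orientation, the kind of monotonicity relation that drives such arguments can be stated, under standard assumptions, for the Neumann-to-Dirichlet map $\Lambda(\sigma)$: a pointwise ordering of conductivities reverses the ordering of the boundary maps in the sense of quadratic forms,
\[
  \sigma_1 \le \sigma_2 \ \text{a.e. in } \Omega
  \quad \Longrightarrow \quad
  \langle g, \Lambda(\sigma_2)\, g \rangle \;\le\; \langle g, \Lambda(\sigma_1)\, g \rangle
  \quad \text{for all boundary current densities } g .
\]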
Abstract:
The study of dielectric properties concerns the storage and dissipation of electric and magnetic energy in materials. Dielectrics are important for explaining various phenomena in Solid-State Physics and in the Physics of Biological Materials. Indeed, during the last two centuries, many scientists have tried to explain and model dielectric relaxation. Starting from the Kohlrausch model and passing through the ideal Debye one, they arrived at more complex models that try to explain the experimentally observed distributions of relaxation times, including the classical ones (Cole-Cole, Davidson-Cole and Havriliak-Negami) and the more recent ones (Hilfer, Jonscher, Weron, etc.). The purpose of this thesis is to discuss a variety of models, carrying out the analysis in both the frequency and the time domain. Particular attention is devoted to the three classical models, which are studied using a transcendental function known as the Mittag-Leffler function. We highlight that one of the most important properties of this function, its complete monotonicity, is an essential property for the physical acceptability and realizability of the models.
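For reference, the function in question is given by the standard series
\[
  E_\alpha(z) \;=\; \sum_{k=0}^{\infty} \frac{z^k}{\Gamma(\alpha k + 1)}, \qquad \alpha > 0 ,
\]
and a function $f$ on $(0,\infty)$ is completely monotone if $(-1)^n f^{(n)}(t) \ge 0$ for all $n \ge 0$ and $t > 0$. A classical result of Pollard states that $t \mapsto E_\alpha(-t)$ is completely monotone exactly when $0 < \alpha \le 1$.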
Abstract:
We investigate the interplay of smoothness and monotonicity assumptions when estimating a density from a sample of observations. The nonparametric maximum likelihood estimator of a decreasing density on the positive half line attains a rate of convergence at a fixed point if the density has a negative derivative. The same rate is obtained by a kernel estimator, but the limit distributions are different. If the density is both differentiable and known to be monotone, then a third estimator is obtained by isotonization of a kernel estimator. We show that this again attains the rate of convergence and compare the limit distributions of the three types of estimators. It is shown that both isotonization and smoothing lead to a more concentrated limit distribution, and we study the dependence on the proportionality constant in the bandwidth. We also show that isotonization does not change the limit behavior of a kernel estimator with a larger bandwidth, in the case that the density is known to have more than one derivative.
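As a concrete illustration of the third estimator, a minimal Python sketch that isotonizes a kernel density estimate; the sample, grid and bandwidth are illustrative choices, not the paper's.

import numpy as np
from scipy.stats import gaussian_kde
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
sample = rng.exponential(scale=1.0, size=500)  # a decreasing density on (0, inf)

grid = np.linspace(0.01, 5.0, 200)
f_kernel = gaussian_kde(sample, bw_method=0.3)(grid)  # kernel estimator

# Isotonization: least-squares projection of the kernel estimate onto the set
# of decreasing functions (pool-adjacent-violators algorithm).
f_isotonized = IsotonicRegression(increasing=False).fit_transform(grid, f_kernel)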
Abstract:
Four papers, written in collaboration with the author’s graduate school advisor, are presented. In the first paper, uniform and non-uniform Berry-Esseen (BE) bounds on the convergence to normality of a general class of nonlinear statistics are provided; novel applications to specific statistics, including the non-central Student’s, Pearson’s, and the non-central Hotelling’s, are also stated. In the second paper, a BE bound on the rate of convergence of the F-statistic used in testing hypotheses from a general linear model is given. The third paper considers the asymptotic relative efficiency (ARE) between the Pearson, Spearman, and Kendall correlation statistics; conditions sufficient to ensure that the Spearman and Kendall statistics are equally (asymptotically) efficient are provided, and several models are considered which illustrate the use of such conditions. Lastly, the fourth paper proves that, in the bivariate normal model, the ARE between any of these correlation statistics possesses certain monotonicity properties; quadratic lower and upper bounds on the ARE are stated as direct applications of such monotonicity patterns.
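For context on the bivariate normal model treated in the fourth paper, recall the classical population-level relations linking the three correlation measures:
\[
  \tau \;=\; \frac{2}{\pi}\arcsin(\rho), \qquad
  \rho_S \;=\; \frac{6}{\pi}\arcsin\!\left(\frac{\rho}{2}\right),
\]
where $\rho$ is the Pearson correlation, $\tau$ Kendall's tau and $\rho_S$ Spearman's rho.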
Abstract:
We consider collective decision problems given by a profile of single-peaked preferences defined over the real line and a set of pure public facilities to be located on the line. In this context, Bochet and Gordon (2012) provide a large class of priority rules based on efficiency, object-population monotonicity and sovereignty. Each such rule is described by a fixed priority ordering among interest groups. We show that any priority rule which treats agents symmetrically (anonymity), respects some form of coherence across collective decision problems (reinforcement), and depends only on peak information (peak-only) is a weighted majoritarian rule. Each such rule defines priorities based on the relative size of the interest groups and specific weights attached to locations. We give an explicit account of the richness of this class of rules.
Abstract:
Fossil pollen data from stratigraphic cores are irregularly spaced in time due to non-linear age-depth relations. Moreover, their marginal distributions may vary over time. We address these features in a nonparametric regression model with errors that are monotone transformations of a latent continuous-time Gaussian process Z(T). Although Z(T) is unobserved, monotonicity allows it to be recovered under suitable regularity conditions, facilitating further computations such as estimation of the long-memory parameter and the Hermite coefficients. The estimation of Z(T) itself involves estimation of the marginal distribution function of the regression errors. These issues are addressed in proposing a plug-in algorithm for optimal bandwidth selection and the construction of confidence bands for the trend function. Some high-resolution time series of pollen records from Lago di Origlio in Switzerland, which go back ca. 20,000 years, are used to illustrate the methods.
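A minimal sketch of the recovery step described above, assuming the errors are a monotone increasing transform G(Z) of a latent variable with standard Gaussian margin, so that Z = Phi^{-1}(F(e)) with F the marginal CDF of the errors; the data and transform here are simulated stand-ins.

import numpy as np
from scipy.stats import norm, rankdata

def recover_latent_gaussian(residuals):
    """Estimate the latent Gaussian values from residuals via the empirical marginal CDF."""
    n = len(residuals)
    # Empirical CDF evaluated at each residual; rank/(n+1) avoids exact 0 and 1.
    F_hat = rankdata(residuals) / (n + 1)
    return norm.ppf(F_hat)

# Example on simulated data.
rng = np.random.default_rng(1)
z = rng.standard_normal(300)   # stand-in for the latent Gaussian values
residuals = np.expm1(z)        # a monotone transform of Z
z_hat = recover_latent_gaussian(residuals)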
Abstract:
We derive multiscale statistics for deconvolution in order to detect qualitative features of the unknown density. An important example covered within this framework is to test for local monotonicity on all scales simultaneously. We investigate the moderately ill-posed setting, where the Fourier transform of the error density in the deconvolution model is of polynomial decay. For multiscale testing, we consider a calibration motivated by the modulus of continuity of Brownian motion. We investigate the performance of our results from both the theoretical and the simulation-based point of view. A major consequence of our work is that detecting qualitative features of a density in a deconvolution problem is a doable task, even though the minimax rates for pointwise estimation are very slow.
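Here "moderately ill-posed" refers to the standard polynomial-decay condition on the Fourier transform of the error density $f_\varepsilon$,
\[
  |\mathcal{F} f_\varepsilon(\omega)| \;\asymp\; |\omega|^{-\beta}
  \quad \text{as } |\omega| \to \infty, \qquad \beta > 0 ,
\]
in contrast to the severely ill-posed case of exponentially decaying Fourier transforms (e.g. Gaussian errors).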
Abstract:
We consider the problem of nonparametric estimation of a concave regression function F. We show that the supremum distance between the least squares estimator and F on a compact interval is typically of order $(\log(n)/n)^{2/5}$. This entails rates of convergence for the estimator's derivative. Moreover, we discuss the impact of additional constraints on F such as monotonicity and pointwise bounds. We then apply these results to the analysis of current status data, where the distribution function of the event times is assumed to be concave.
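A minimal sketch of the shape-constrained least-squares fit on a grid of design points, using cvxpy as an assumed dependency; the data and the commented extra constraints are illustrative.

import cvxpy as cp
import numpy as np

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0.0, 1.0, 100))
y = np.sqrt(x) + rng.normal(scale=0.1, size=100)  # concave truth plus noise

f = cp.Variable(100)
# Concavity: slopes between consecutive design points are decreasing.
slopes = cp.multiply(f[1:] - f[:-1], 1.0 / np.diff(x))
constraints = [slopes[1:] <= slopes[:-1]]
# Additional constraints discussed in the abstract, e.g. monotonicity and
# pointwise bounds, would be added as: constraints += [slopes >= 0, f <= 1].
problem = cp.Problem(cp.Minimize(cp.sum_squares(y - f)), constraints)
problem.solve()
f_hat = f.value  # the fitted concave least-squares estimator at the design points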
Abstract:
Let $Y_i = f(x_i) + E_i$ $(1 \le i \le n)$ with given covariates $x_1 < x_2 < \cdots < x_n$, an unknown regression function $f$ and independent random errors $E_i$ with median zero. It is shown how to apply several linear rank test statistics simultaneously in order to test monotonicity of $f$ in various regions and to identify its local extrema.
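In the spirit of this approach, a minimal sketch that applies a rank statistic for trend (Spearman's, as one illustrative choice of linear rank statistic) on consecutive windows of the design; the windowing is arbitrary and the simultaneous multiple-testing calibration of the paper is omitted.

import numpy as np
from scipy.stats import spearmanr

def local_monotonicity_scan(x, y, window=20):
    """Rank test for trend on consecutive windows; returns (start, end, rho, p) per window."""
    results = []
    for i in range(0, len(x) - window + 1, window):
        rho, p = spearmanr(x[i:i + window], y[i:i + window])
        results.append((x[i], x[i + window - 1], rho, p))
    return results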
Abstract:
The Work Limitations Questionnaire (WLQ) is used to determine the amount of work loss and productivity loss that stem from certain health conditions, including rheumatoid arthritis and cancer. The questionnaire is currently scored using methodology from Classical Test Theory. Item Response Theory (IRT), on the other hand, is based on analyzing responses at the item level. This study aimed to determine the validity of using IRT to analyze data from the WLQ. Item responses from 572 employed adults with dysthymia, major depressive disorder (MDD), double depressive disorder (both dysthymia and MDD), rheumatoid arthritis, and from healthy individuals were used to determine the validity of IRT (Adler et al., 2006).

PARSCALE, IRT software from Scientific Software International, Inc., was used to calculate estimates of work limitations based on item responses from the WLQ. These estimates, also known as ability estimates, were then correlated with the raw score estimates calculated from the sum of all item responses. Concurrent validity, which holds that a new measurement is valid if its correlation with an established valid measurement is at least .90, was used to judge the validity of IRT methodology for the WLQ. Ability estimates from IRT were found to be fairly highly correlated with the raw scores from the WLQ (above .80). However, the only subscale with a correlation high enough for IRT to be considered valid was the time management subscale (r = .90). The other subscales (mental/interpersonal, physical, and output) did not produce valid IRT ability estimates.

These lower-than-expected correlations can be explained in part by the outliers found in the sample. In addition, acquiescent responding (AR) bias, caused by the tendency of people to respond the same way to every question on a questionnaire, and the multidimensionality of the questionnaire (the WLQ is composed of four dimensions and thus four different latent variables) probably had a major impact on the IRT estimates. Furthermore, it is possible that the mental/interpersonal dimension violated the monotonicity assumption of IRT, causing PARSCALE to fail to run for these estimates; this assumption needs to be checked for the mental/interpersonal dimension. The use of multidimensional IRT methods would most likely remove the AR bias and increase the validity of using IRT to analyze data from the WLQ.
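The concurrent-validity criterion used in the study reduces to a simple correlation check; a minimal sketch follows, where theta_hat would come from IRT software (PARSCALE in the study) and is here just a placeholder argument.

import numpy as np

def concurrent_validity(theta_hat, raw_scores, criterion=0.90):
    """Correlate IRT ability estimates with raw sum scores; valid if r >= criterion."""
    r = np.corrcoef(theta_hat, raw_scores)[0, 1]
    return r, r >= criterion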
Abstract:
In the present work, four main problems have been addressed within the framework of non-linear elasticity based on representative constitutive models, namely problems related to loss-of-stability phenomena associated with boundary value problems for fibre-reinforced materials. Each of the considered problems is formulated and analysed separately in different chapters. We first start with the analysis of discontinuous deformation gradients for a transversely isotropic material under plane deformation. In particular, the material model is an augmented neo-Hookean base with a simple unidirectional reinforcement characterised by a single parameter. The solution of this problem is related to material instabilities and is associated with a shear-band-type failure mode. The loss of ellipticity of the governing differential equations is a necessary condition for the existence of these material instabilities.
The second problem involves a detailed analysis of the combined non-linear extension, inflation and torsion of a thick-walled circular cylindrical tube, where it has been found that the aforementioned deformation is controllable only for certain preferred directions of transverse isotropy. Numerical results are illustrated to explain the elastic behaviour of the tube for the admissible preferred directions under the considered deformation. The third problem deals with the analysis of a doubly fibre-reinforced thick-walled circular cylindrical tube undergoing pure azimuthal shear for a special class of reinforcing models where multiple non-smooth solutions emerge. The associated instability phenomena are found to occur prior to the point where the nominal stress tensor changes monotonicity in a particular direction. It has also been shown that the loss of ellipticity condition that arises from the equilibrium equation and the condition $W'' = 0$ (the vanishing of the second derivative of the strain-energy function with respect to the deformation) are equivalent necessary conditions for the emergence of multiple solutions for the considered material. Finally, a detailed analysis on the basis of the loss of ellipticity of the governing differential equations for a combined helical, axial and radial elastic deformation of a fibre-reinforced circular cylindrical tube is carried out.
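As orientation for the single-parameter reinforcement mentioned in the first problem, a common form of such a model (the exact choice in the thesis may differ) is a neo-Hookean base augmented by the standard reinforcing term:
\[
  W \;=\; \frac{\mu}{2}\,(I_1 - 3) \;+\; \frac{\mu\,\gamma}{2}\,(I_4 - 1)^2 ,
  \qquad I_4 = \mathbf{a}_0 \cdot (\mathbf{C}\,\mathbf{a}_0),
\]
where $\mu$ is the shear modulus, $\gamma > 0$ the single reinforcement parameter, $\mathbf{C}$ the right Cauchy-Green deformation tensor and $\mathbf{a}_0$ the referential fibre direction.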
Abstract:
In this paper, we axiomatically introduce fuzzy multi-measures on bounded lattices. In particular, we distinguish four different types of fuzzy set multi-measures on a universe X, considering both the usual and the inverse real-number ordering of this lattice, and increasing or decreasing monotonicity with respect to the number of arguments. We provide results from which families of measures satisfying the applicable conditions in each case can be derived.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
The estimated parameters of output distance functions frequently violate the monotonicity, quasi-convexity and convexity constraints implied by economic theory, leading to estimated elasticities and shadow prices that are incorrectly signed, and ultimately to perverse conclusions concerning the effects of input and output changes on productivity growth and relative efficiency levels. We show how a Bayesian approach can be used to impose these constraints on the parameters of a translog output distance function. Implementing the approach involves the use of a Gibbs sampler with data augmentation. A Metropolis-Hastings algorithm is also used within the Gibbs sampler to simulate observations from truncated pdfs. Our methods are developed for the case where panel data are available and technical inefficiency effects are assumed to be time-invariant. Two models, a fixed effects model and a random effects model, are developed and applied to panel data on 17 European railways. We observe significant changes in estimated elasticities and shadow price ratios when the regularity restrictions are imposed.
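A minimal sketch of the Metropolis-Hastings-within-Gibbs idea for sampling a full conditional restricted to a constraint region; the target density and the region here are illustrative stand-ins, not the paper's translog distance-function conditionals.

import numpy as np

rng = np.random.default_rng(3)

def mh_truncated_normal_step(theta, mean, sd, in_region, scale=0.1):
    """One random-walk MH step targeting N(mean, sd^2) restricted to a region."""
    proposal = theta + rng.normal(scale=scale)
    if not in_region(proposal):
        return theta  # proposals outside the constraint region are rejected
    log_ratio = (-(proposal - mean) ** 2 + (theta - mean) ** 2) / (2 * sd ** 2)
    return proposal if np.log(rng.uniform()) < log_ratio else theta

# Example: positivity as a stand-in for a regularity (e.g. monotonicity) constraint.
theta = 0.5
draws = []
for _ in range(5000):
    theta = mh_truncated_normal_step(theta, mean=0.0, sd=1.0,
                                     in_region=lambda t: t > 0.0)
    draws.append(theta)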