939 results for Transform statistics


Relevance:

20.00%

Publisher:

Abstract:

This paper presents the application of wavelet processing to handwritten character recognition. To attain a high recognition rate, robust feature extractors and powerful classifiers that are invariant to the variability of human writing are needed. The proposed scheme consists of two stages: a feature extraction stage based on the Haar wavelet transform, and a classification stage that uses a support vector machine classifier. Experimental results show that the proposed method is effective.
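The feature extraction stage can be sketched in code. The following is a minimal illustration only, assuming a single-level 2D Haar decomposition with an averaging normalization; the function name, the toy input, and the choice of the LL subband as the feature vector are ours, not the paper's (the classification stage would feed such features to an SVM classifier).

```python
# Hypothetical sketch of Haar-based feature extraction for character
# images; names and the toy 4x4 "image" are illustrative, not from the paper.
import numpy as np

def haar2d(img):
    """One level of a 2D Haar wavelet decomposition.

    Returns the four subbands (LL, LH, HL, HH); the LL (approximation)
    subband is a common choice of feature vector.
    """
    img = np.asarray(img, dtype=float)
    # Transform rows: pairwise averages (low-pass) and differences (high-pass).
    lo = (img[:, 0::2] + img[:, 1::2]) / 2.0
    hi = (img[:, 0::2] - img[:, 1::2]) / 2.0
    # Transform columns of each half.
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return ll, lh, hl, hh

# A constant image puts all of its energy in the LL subband.
ll, lh, hl, hh = haar2d(np.ones((4, 4)))
print(ll)                # 2x2 block of ones
print(np.abs(hh).max())  # 0.0
```

The orthonormal Haar transform instead scales by 1/sqrt(2); the averaging variant above is a common, equivalent choice for feature extraction.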

Relevance:

20.00%

Publisher:

Abstract:

Partial moments are used extensively in actuarial science for the analysis of risks. Since the first-order partial moment gives the expected loss in a stop-loss treaty with infinite cover as a function of the priority, it is referred to as the stop-loss transform. In the present work, we discuss distributional and geometric properties of the first- and second-order partial moments defined in terms of the quantile function. Relationships of the scaled stop-loss transform curve with the Lorenz, Gini, Bonferroni and Leinkuhler curves are developed.
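In standard actuarial notation (our summary, not quoted from the paper), the stop-loss transform of a risk X with distribution function F and quantile function Q = F^{-1} is:

```latex
% First-order partial moment (stop-loss transform) at priority t:
\pi(t) = E\big[(X - t)_+\big]
       = \int_t^\infty (x - t)\, dF(x)
       = \int_t^\infty \bar{F}(x)\, dx,
\qquad \bar{F}(x) = 1 - F(x).
% Evaluated at a quantile t = Q(u), it can be written in terms of the
% quantile function alone, which is the setting the abstract works in:
\pi\big(Q(u)\big) = \int_u^1 \big(Q(p) - Q(u)\big)\, dp, \qquad 0 < u < 1.
```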

Relevance:

20.00%

Publisher:

Abstract:

This paper compares the most common digital signal processing methods of exon prediction in eukaryotes, and also proposes a technique for noise suppression in exon prediction. The specimen used here, which has relevance in medical research, has been taken from the public genomic database GenBank. Exon prediction has been carried out using the following digital signal processing methods: the binary method, the EIIP (electron-ion interaction pseudopotential) method, and filter methods. Under the filter methods, two filter designs, and two approaches using these designs, have been tried. The discrete wavelet transform has been used for de-noising the exon plots. Results of exon prediction based on the methods mentioned above which give values closest to those found in the NCBI database are presented, along with the exon plot de-noised using the discrete wavelet transform. The authors' alterations to the established methods improve the performance of the exon prediction algorithms, and the discrete wavelet transform is shown to be an effective de-noising tool for use with exon prediction algorithms.
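As an illustrative sketch of the binary method mentioned above: each base is mapped to four 0/1 indicator sequences, and exon-like regions show a peak at period 3 (frequency index N/3) in the combined DFT power spectrum. The toy sequence and function names below are ours; the paper additionally compares EIIP and filter methods.

```python
# Sketch (not the paper's code) of period-3 detection via binary
# indicator sequences and the DFT.
import numpy as np

def period3_power(seq):
    """Total DFT power of the four binary indicator sequences at index N/3."""
    n = len(seq)
    k = n // 3  # index of the period-3 frequency component
    total = 0.0
    for base in "ACGT":
        indicator = np.array([1.0 if b == base else 0.0 for b in seq])
        total += np.abs(np.fft.fft(indicator)[k]) ** 2
    return total

# A strongly 3-periodic sequence scores far higher than a flat one.
coding_like = "ATG" * 30   # perfect period-3 repetition, length 90
print(period3_power(coding_like) > period3_power("A" * 90))  # True
```

In practice the statistic is computed in a sliding window along the sequence, producing the exon plots that the discrete wavelet transform then de-noises.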

Relevance:

20.00%

Publisher:

Abstract:

The problem of using information available from one variable X to make inference about another Y is classical in many physical and social sciences. In statistics this is often done via regression analysis, where the mean response is used to model the data. One stipulates the model Y = µ(X) + ɛ. Here µ(x) is the mean response at the predictor variable value X = x, and ɛ = Y - µ(X) is the error. In classical regression analysis both (X, Y) are observable, and one then proceeds to make inference about the mean response function µ(X). In practice there are numerous examples where X is not available, but a variable Z is observed which provides an estimate of X. As an example, consider the herbicide study of Rudemo et al. [3], in which a nominal measured amount Z of herbicide was applied to a plant, but the actual amount X absorbed by the plant is unobservable. As another example, from Wang [5], an epidemiologist studies the severity of a lung disease, Y, among the residents of a city in relation to the amount of certain air pollutants. The amount of the air pollutants Z can be measured at certain observation stations in the city, but the actual exposure of the residents to the pollutants, X, is unobservable and may vary randomly from the Z-values. In both cases X = Z + error. This is the so-called Berkson measurement error model. In the more classical measurement error model one observes an unbiased estimator W of X and stipulates the relation W = X + error. An example of this model occurs when assessing the effect of nutrition X on a disease: measuring nutrition intake precisely within 24 hours is almost impossible. There are many similar examples in agricultural and medical studies; see, e.g., Carroll, Ruppert and Stefanski [1] and Fuller [2], among others.
In this talk we shall address the question of fitting a parametric model to the regression function µ(X) in the Berkson measurement error model: Y = µ(X) + ɛ, X = Z + η, where η and ɛ are random errors with E(ɛ) = 0, X and η are d-dimensional, and Z is the observable d-dimensional random variable.
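In display form, the two models contrasted in the abstract are (same symbols as the text, restated for legibility):

```latex
% Berkson measurement error model: the true predictor X is unobserved;
% the observed Z satisfies
Y = \mu(X) + \varepsilon, \qquad X = Z + \eta, \qquad E(\varepsilon) = 0,
% with X, \eta d-dimensional and Z the observable d-dimensional r.v.
%
% Classical measurement error model: one observes an unbiased surrogate W,
Y = \mu(X) + \varepsilon, \qquad W = X + \text{error}.
```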

Relevance:

20.00%

Publisher:

Abstract:

Humans distinguish materials such as metal, plastic, and paper effortlessly at a glance. Traditional computer vision systems cannot solve this problem at all. Recognizing surface reflectance properties from a single photograph is difficult because the observed image depends heavily on the amount of light incident from every direction. A mirrored sphere, for example, produces a different image in every environment. To make matters worse, two surfaces with different reflectance properties could produce identical images. The mirrored sphere simply reflects its surroundings, so in the right artificial setting, it could mimic the appearance of a matte ping-pong ball. Yet, humans possess an intuitive sense of what materials typically "look like" in the real world. This thesis develops computational algorithms with a similar ability to recognize reflectance properties from photographs under unknown, real-world illumination conditions. Real-world illumination is complex, with light typically incident on a surface from every direction. We find, however, that real-world illumination patterns are not arbitrary. They exhibit highly predictable spatial structure, which we describe largely in the wavelet domain. Although they differ in several respects from typical photographs, illumination patterns share much of the regularity described in the natural image statistics literature. These properties of real-world illumination lead to predictable image statistics for a surface with given reflectance properties. We construct a system that classifies a surface according to its reflectance from a single photograph under unknown illumination. Our algorithm learns relationships between surface reflectance and certain statistics computed from the observed image. Like the human visual system, we solve the otherwise underconstrained inverse problem of reflectance estimation by taking advantage of the statistical regularity of illumination.
For surfaces with homogeneous reflectance properties and known geometry, our system rivals human performance.
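The core idea of summarizing an image by statistics of its wavelet coefficients can be sketched as follows. The function name, the use of kurtosis, and the synthetic toy signals are our illustrative choices, not the thesis' actual feature set.

```python
# Sketch: reflectance-relevant image statistics computed from
# Haar-style detail coefficients. Toy 1-D signals stand in for images.
import numpy as np

def detail_kurtosis(signal):
    """Kurtosis of Haar-style detail coefficients (pairwise differences)."""
    s = np.asarray(signal, dtype=float)
    d = (s[0::2] - s[1::2]) / np.sqrt(2.0)
    d = d - d.mean()
    return (d ** 4).mean() / (d ** 2).mean() ** 2

# Sparse, edge-like structure (a few sharp highlights, as on a shiny
# surface) gives heavy-tailed detail coefficients -> high kurtosis;
# a smoothly varying signal gives low-kurtosis details.
sparse = np.zeros(256); sparse[::32] = 1.0        # occasional sharp spikes
smooth = np.sin(np.linspace(0, 4 * np.pi, 256))   # gentle variation
print(detail_kurtosis(sparse) > detail_kurtosis(smooth))  # True
```

A classifier trained on such statistics, rather than on raw pixels, is what lets the inverse problem be solved despite unknown illumination.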

Relevance:

20.00%

Publisher:

Abstract:

One of the disadvantages of old age is that there is more past than future: this, however, may be turned into an advantage if the wealth of experience and, hopefully, wisdom gained in the past can be reflected upon and throw some light on possible future trends. To an extent, then, this talk is necessarily personal, certainly nostalgic, but also self-critical and inquisitive about our understanding of the discipline of statistics. A number of almost philosophical themes will run through the talk: the search for appropriate modelling in relation to the real problem envisaged, emphasis on sensible balances between simplicity and complexity, the relative roles of theory and practice, the nature of communication of inferential ideas to the statistical layman, and the inter-related roles of teaching, consultation and research. A list of keywords might be: identification of sample space and its mathematical structure, choices between transform and stay, the role of parametric modelling, the role of a sample space metric, the underused hypothesis lattice, the nature of compositional change, particularly in relation to the modelling of processes. While the main theme will be relevance to compositional data analysis, we shall point to substantial implications for general multivariate analysis arising from experience of the development of compositional data analysis…

Relevance:

20.00%

Publisher:

Abstract:

Lean is common sense and good business sense. As organizations grow and become more successful, they begin to lose insight into the basic truths of what made them successful. Organizations have to deal with more and more issues that may not have anything to do with directly providing products or services to their customers. Lean is a holistic management approach that brings the focus of the organization back to providing value to the customer. In August 2002, Mrs. Darleen Druyun, the Principal Deputy to the Assistant Secretary of the Air Force for Acquisition and government co-chairperson of the Lean Aerospace Initiative (LAI), decided it was time for Air Force acquisitions to embrace the concepts of lean. At her request, the LAI Executive Board developed a concept and methodology to introduce lean into the Air Force’s acquisition culture and processes. This was the birth of the “Lean Now” initiative. An enterprise-wide approach was used, involving Air Force System Program Offices (SPOs), aerospace industry, and several Department of Defense agencies. The aim of Lean Now was to focus on the process interfaces between these “enterprise” stakeholders to eliminate barriers that impede progress. Any best practices developed would be institutionalized throughout the Air Force and the Department of Defense (DoD). The industry members of LAI agreed to help accelerate the government-industry transformation by donating lean Subject Matter Experts (SMEs) to mentor, train, and facilitate the lean events of each enterprise. Currently, the industry SMEs and the Massachusetts Institute of Technology are working together to help the Air Force develop its own lean infrastructure of training courses and Air Force lean SMEs. The first Lean Now programs were the F/A-22, Global Hawk, and F-16. Each program focused on specific acquisition processes.
The F/A-22 focused on the Test and Evaluation process; the Global Hawk focused on Evolutionary Acquisitions; and the F-16 focused on improving the Contract Closeout process. Through lean, each enterprise made many significant improvements. The F/A-22 was able to reduce its Operational Flight Plan (OFP) Preparation and Load process time from 2 to 3 months down to 7 hours. The Global Hawk developed a new production plan that increases the annual production of its Integrated Sensor Suite from 3 per year to 6 per year. The F-16 enterprise generated, and is working on, 12 initiatives that could result in a contract closeout cycle time reduction of 3 to 7 years. Each enterprise continues to generate more lean initiatives that focus on other areas and processes within their respective enterprises.

Relevance:

20.00%

Publisher:

Abstract:

Which test for your data? The SPSS Statistics Coach can help.

Relevance:

20.00%

Publisher:

Abstract:

Exercises and solutions for an introductory statistics course for MSc students. Diagrams for the questions are collected in the support.zip file as .eps files.

Relevance:

20.00%

Publisher:

Abstract:

Relates to the following software for analysing Blackboard statistics: http://www.edshare.soton.ac.uk/11134/
Supporting material for the following podcast: http://youtu.be/yHxCzjiYBoU

Relevance:

20.00%

Publisher:

Abstract:

Over the years, seeking to take part in globalization and economic integration between countries, Colombia has sought to establish strategically suitable trade agreements in order to transform its domestic economy, expand internationally, and trade in products in which it is not specialized. Accordingly, Colombia signed the Free Trade Agreement with the European Union in June 2012, which entered into force in August 2013; since then the situation of the dairy sector has been critical, as it lacks the level of competitiveness needed to guarantee the survival of small and medium-sized milk producers. A diagnosis of the Colombian dairy sector gives a clearer view of the main characteristics of the country's production systems and reveals the sector's weaknesses: productivity is low, production costs are high, and prices tend upward, putting national products at a disadvantage against the large international supply that floods the domestic market with lower prices and better quality. It also shows that a large share of milk producers are informal, which makes it difficult to consolidate information about the sector and turn it into knowledge for plans of change and transformation to improve the situation. On the other side, acting as the main competitor in this case study, stands the European Union. Its diagnosis places this geographic area in an undoubtedly privileged position: low prices, better quality, high productivity, among other characteristics that make the EU a real threat to small and medium-sized Colombian producers.
The overproduction seen in this economic and political association creates the need to explore other markets for that output, and the Free Trade Agreement with Colombia is one option for matching that supply with unsatisfied demand. In order to understand in depth the implications of the FTA with the European Union and how the agreement affects small and medium-sized milk producers in Colombia, it was necessary to address some key points of the negotiation and, from these, determine whether the treaty is really an opportunity for the informal farmer who produces milk for self-consumption and/or for sale in villages and towns, or whether, on the contrary, it is a platform that harms the 450 thousand families who live from this trade and who lack the levels of competitiveness required to face foreign competition as strong as the European Union. Finally, a SWOT analysis makes it possible to create coherent and viable strategies by relating Colombia's strengths and weaknesses to the opportunities and threats involved in having a free trade agreement with the European Union. These strategies help improve the sector's competitiveness so as to guarantee the longer-term survival of small and medium-sized milk producers; in this way it will be possible not only to take advantage of this trade agreement, but also to conquer other international markets with a better-quality product.