898 results for 340402 Econometric and Statistical Methods


Relevance:

100.00%

Publisher:

Abstract:

This project applies computer-vision image-processing techniques, built around an omnidirectional vision system, to agricultural mobile robots (AMR) for trajectory navigation and localization problems. To carry out this task, computational methods based on the JSEG algorithm were used to classify and characterize such problems, together with Artificial Neural Networks (ANN) for pattern recognition. Simulations and performance analyses of the JSEG image-segmentation technique were carried out on Matlab/Octave platforms, along with a customized back-propagation algorithm and statistical methods in a Simulink environment. These procedures made it possible to classify and characterize the HSV color-space segments and to recognize patterns, with reasonably accurate results.
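The pipeline described above can be sketched in miniature. JSEG itself has no standard library implementation, so the example below is a simplified stand-in under stated assumptions: synthetic mean-HSV feature vectors for two hypothetical segment classes ("crop-like" vs. "soil-like", invented for illustration) are classified by a small back-propagation network in NumPy.

```python
import numpy as np

# Minimal stand-in for the study's pipeline: classify synthetic HSV feature
# vectors (mean hue, saturation, value of a segment) with a small
# back-propagation network. JSEG segmentation itself is not reproduced here,
# and the class prototypes below are assumptions of this sketch.
rng = np.random.default_rng(0)

X = np.vstack([
    rng.normal([0.30, 0.7, 0.5], 0.05, size=(50, 3)),  # "crop-like" HSV means
    rng.normal([0.08, 0.4, 0.4], 0.05, size=(50, 3)),  # "soil-like" HSV means
])
y = np.array([0] * 50 + [1] * 50)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer, trained by plain gradient descent (back-propagation).
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
t = y.reshape(-1, 1).astype(float)
for _ in range(2000):
    h = sigmoid(X @ W1 + b1)          # forward pass, hidden layer
    p = sigmoid(h @ W2 + b2)          # forward pass, output
    dp = (p - t) / len(X)             # gradient of mean cross-entropy loss
    dh = (dp @ W2.T) * h * (1 - h)    # back-propagated hidden-layer gradient
    W2 -= h.T @ dp;  b2 -= dp.sum(0)
    W1 -= X.T @ dh;  b1 -= dh.sum(0)

accuracy = ((p > 0.5).ravel() == y).mean()
```

The network size, learning rate and feature prototypes are all assumptions of the sketch, not values from the study.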

Relevance:

100.00%

Publisher:

Abstract:

Head and neck tumors are a major health concern worldwide, due to their high incidence and mortality rates, particularly in developing countries. In Brazil, this type of cancer is commonly diagnosed, and studies suggest that it may be the leading cause of mortality in the country. The increase in life expectancy worldwide, as well as environmental and behavioral factors, is related to carcinogenesis. An understanding of basic epidemiology and statistical methods is therefore critical in order to promote early diagnosis and cancer prevention. Cancer patients with an indication for a prosthesis were selected from the medical records of the Oral Oncology Center, School of Dentistry, São Paulo State University (UNESP), Araçatuba, between 1991 and 2010. The following variables were recorded: gender, age, type and location of the lesion, radiation dose and dental prosthesis. The majority of the patients were male (74.15%) and >60 years of age (53.37%). Tumors were most commonly located in the floor of the mouth (11.1%) and squamous cell carcinoma was the most prevalent type (72.8%). This study provides the profiles of patients who attended the Oral Oncology Center, and the results may aid in the creation of cancer prevention programs.

Relevance:

100.00%

Publisher:

Abstract:

Abstract. Background: Several mathematical and statistical methods have been proposed in recent years to analyze microarray data. Most of those methods involve complicated formulas and software implementations that require advanced programming skills. Researchers from other areas may experience difficulties when attempting to use those methods in their research. Here we present a user-friendly toolbox that allows large-scale gene expression analysis to be carried out by biomedical researchers with limited programming skills. Results: We introduce a user-friendly toolbox called GEDI (Gene Expression Data Interpreter), an extensible, open-source, and freely available tool that we believe will be useful to a wide range of laboratories and to researchers with no background in mathematics or computer science, allowing them to analyze their own data with both classical and advanced approaches developed and recently published by Fujita et al. Conclusion: GEDI is an integrated user-friendly viewer that combines the state-of-the-art SVR, DVAR and SVAR algorithms previously developed by us. It facilitates the application of SVR, DVAR and SVAR beyond the mathematical formulas in the corresponding publications, and helps users better understand the results through the available visualizations. Both running the statistical methods and visualizing the results are carried out within the graphical user interface, rendering these algorithms accessible to the broad community of researchers in molecular biology.
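The SVR, DVAR and SVAR methods bundled in GEDI are specific published algorithms; as a rough, hedged illustration of the model family they belong to (vector autoregression on expression time series), the sketch below fits a first-order VAR by ordinary least squares to synthetic two-gene data. The transition matrix and noise level are assumptions of this sketch, not GEDI's actual procedure.

```python
import numpy as np

# Toy illustration only: fit a first-order vector autoregression,
# x_{t+1} = A x_t + noise, by ordinary least squares on a synthetic
# two-gene expression series. The "true" A below is an assumption.
rng = np.random.default_rng(1)
A_true = np.array([[0.8, 0.0],
                   [0.5, 0.6]])   # gene 1 drives gene 2, gene 2 does not drive gene 1
T = 500
X = np.zeros((T, 2))
for t in range(T - 1):
    X[t + 1] = A_true @ X[t] + rng.normal(0, 0.1, 2)

# OLS estimate: solve min ||X[1:] - X[:-1] @ A.T||^2 for A.
A_hat = np.linalg.lstsq(X[:-1], X[1:], rcond=None)[0].T
```

In network terms, a near-zero entry of `A_hat` (here the top-right one) suggests the absence of a directed influence between the corresponding genes.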

Relevance:

100.00%

Publisher:

Abstract:

The objective of this study was to develop a criteria catalogue serving as a guideline for authors to improve the quality of reporting of experiments in basic research in homeopathy. A Delphi process was initiated, comprising three rounds of adjusting and phrasing plus two consensus conferences. European researchers who had published experimental work within the last five years were involved. The resulting checklist for authors provides a catalogue of 23 criteria. The “Introduction” should focus on the underlying hypotheses and the homeopathic principle investigated, and state whether the experiments are exploratory or confirmatory. “Materials and methods” should comprise information on the object of investigation, experimental setup, parameters, intervention and statistical methods. A more detailed description of the homeopathic substances is required, covering, for example, manufacture, dilution method and starting point of dilution. A further result of the Delphi process is to raise scientists' awareness of reporting blinding, allocation, replication, quality control and system performance controls. The “Results” section should provide the exact number of treated units per setting included in each analysis, and state missing samples and dropouts. Results presented in tables and figures are as important as appropriate measures of effect size, uncertainty and probability. The “Discussion” should offer more than a general interpretation of the results in the context of current evidence, also addressing limitations and an appraisal of the aptitude of the chosen experimental model. Authors of homeopathic basic-research publications are encouraged to apply our checklist when preparing their manuscripts. Feedback on the applicability, strengths and limitations of the list is encouraged to enable future revisions.

Relevance:

100.00%

Publisher:

Abstract:

This article is a detailed review of published scientific studies on the determination of rare earth elements (REEs) in the soil-plant system. The studies have been carried out mainly in European and Asian countries. It should be noted that research in Latin American countries is very scarce; however, there is growing interest in analyzing the contribution of these elements to soil and plants, owing to the application of fertilizers containing high doses of these elements in their composition. Various sampling, experimental and analytical techniques have been employed for the determination of REEs. Nevertheless, the handling of the data is considered to have been statistically incorrect. This article addresses: (i) general aspects of the REEs; (ii) an analysis of the available literature, covering 37 articles in total, to identify the most widely used sampling and analysis methodologies, pointing out aspects still considered deficient; (iii) two examples of the application of statistical techniques (confidence interval of the mean, and significance tests based on Fisher's F and Student's t) using data reported in two articles. For the first article analyzed, the results showed that: (a) no statistical methodology had been applied to evaluate data quality; (b) when statistics were applied, systematic differences were found between the values determined in the laboratory and the certified values. For the second article, significance tests showed that the means of Ce and Eu (the two elements taken as examples) in plants differ significantly from one site to another.
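The two statistical techniques the review demonstrates (confidence interval of the mean, and Fisher F / Student t significance tests) can be reproduced with SciPy. The concentrations below are hypothetical stand-ins, not the data of the articles analyzed.

```python
import numpy as np
from scipy import stats

# Hypothetical replicate lab determinations of Ce (mg/kg) vs. a certified value.
lab_ce = np.array([23.1, 22.8, 23.5, 23.0, 23.3])
certified = 22.0

# 95 % confidence interval of the mean; a certified value outside the interval
# suggests a systematic difference between lab and certificate.
mean = lab_ce.mean()
sem = stats.sem(lab_ce)
ci_low, ci_high = stats.t.interval(0.95, df=len(lab_ce) - 1, loc=mean, scale=sem)

# Equivalent one-sample Student t test against the certified value.
t_stat, p_value = stats.ttest_1samp(lab_ce, certified)

# Two hypothetical sites: F ratio of variances, then a two-sample t test
# on the means (as done for Ce and Eu between sites in the second article).
site_a = np.array([23.1, 22.8, 23.5, 23.0, 23.3])
site_b = np.array([25.2, 24.8, 25.5, 25.1, 24.9])
f_ratio = site_a.var(ddof=1) / site_b.var(ddof=1)
t2, p2 = stats.ttest_ind(site_a, site_b, equal_var=True)
```

With these invented numbers the certified value falls outside the confidence interval and both t tests reject equality, mirroring the kind of conclusion the review draws.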

Relevance:

100.00%

Publisher:

Abstract:

The accumulation of solid material in reservoirs, river channels and coastal areas makes the mechanical extraction of these materials by suction increasingly common, so it is important to study the performance of suction extraction by analyzing the shape of the nozzles and the flow parameters, including the pump. This thesis studies, using experimental equipment, the effectiveness of different solids-removal devices (nozzles of various shapes and variable-speed pumps). The experimental rig was developed at the Hydraulics Laboratory of the E.T.S.I. de Caminos, Canales y Puertos of the Universidad Politécnica de Madrid. It includes a submerged bed of different types of sediment, solids-extraction nozzles and a variable-speed pump, as well as an element for separating the water from the extracted solids. The basic parameters analyzed are the total liquid flow pumped, the solid flow extracted, the diameter of the suction pipe, the shape and cross-section of the extraction nozzle, and the speed and efficiency of the variable-speed pump. The measurements obtained on the experimental rig have been studied by means of dimensional analysis and statistical methods. From this study a new formulation has been developed that relates the extracted solid flow to the pipe and nozzle diameters, the pumped liquid flow and the rotational speed of the pump. Likewise, from a practical point of view, the influence of the nozzle shape on the solids-extraction capacity has been analyzed with all other parameters held equal, so that the most appropriate nozzle shape can be recommended.
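A common way to turn dimensional analysis plus measurements into a formulation of the kind the thesis derives is to postulate a power law and fit its exponents by log-linear least squares. The sketch below does this on synthetic data; the variables, ranges and exponents are assumptions for illustration, not the thesis' actual formulation.

```python
import numpy as np

# Assumed power-law model: Qs = k * Ql^a * d^b * n^c, where Qs is the solid
# flow extracted, Ql the pumped liquid flow, d the nozzle diameter and n the
# pump speed. All values below are synthetic.
rng = np.random.default_rng(2)
N = 200
Ql = rng.uniform(1.0, 5.0, N)      # liquid flow, arbitrary units
d = rng.uniform(0.05, 0.15, N)     # nozzle diameter, m
n = rng.uniform(500, 1500, N)      # pump speed, rpm
k, a, b, c = 0.3, 1.2, 0.8, 0.5    # hypothetical "true" coefficients
Qs = k * Ql**a * d**b * n**c * np.exp(rng.normal(0, 0.02, N))  # noisy data

# Linearise by taking logs: log Qs = log k + a log Ql + b log d + c log n,
# then solve by ordinary least squares.
M = np.column_stack([np.ones(N), np.log(Ql), np.log(d), np.log(n)])
coef, *_ = np.linalg.lstsq(M, np.log(Qs), rcond=None)
log_k_hat, a_hat, b_hat, c_hat = coef
```

The fitted exponents recover the assumed ones, which is the property that makes this regression form convenient for dimensional-analysis studies.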

Relevance:

100.00%

Publisher:

Abstract:

Processes for the standardization of test methods and laboratory accreditation are now common practice, as they allow the procedures carried out by professionals in a technological sector to be evaluated and ensure a minimum quality in the final results. For an acoustics laboratory to obtain and maintain accreditation, it must actively participate in intercomparison exercises, which are used to assure the quality of the methods employed. The drawback of these exercises is their high cost, which is sometimes unaffordable for laboratories, forcing them to renounce accreditation. This Final Degree Project focuses on the development of a Virtual Laboratory, implemented as a software tool, for carrying out non-attendance intercomparison exercises, thereby extending the e-comparison concept and laying the groundwork for such remote exercises to eventually replace those currently carried out in person. The report first gives a short introduction presenting the evolution and importance of acoustic quality procedures in today's society. It then discusses the international standards on which the project rests, namely the ISO 145-5 standard, as well as the mathematical methods used in its implementation: the statistical methods for uncertainty propagation specified by the JCGM (Joint Committee for Guides in Metrology). Next, it describes the structure of the project, both the type of programming used in its development and the calculation methodology adopted to ensure that all the functionalities required in this type of test are correctly implemented. Subsequently, a statistical validation is carried out, based on comparing data generated by the program, processed using Monte Carlo simulation, against analytical calculations, to verify that the program works as intended in the theoretical study phase. The program is also tested as a laboratory technician would use it, evaluating the measurement uncertainty by the traditional method and comparing the data obtained with the expected results. Finally, the conclusions drawn from the development and testing of the Virtual Laboratory are discussed, and future lines of research related to the e-comparison concept and improvements to the Virtual Laboratory are proposed.
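The JCGM Monte Carlo method for uncertainty propagation that the Virtual Laboratory implements can be illustrated in a few lines: sample the inputs from their assigned distributions, push each sample through the measurement model, and summarize the output distribution. The model below (a simple level difference with invented input uncertainties) is an assumption of this sketch, not the ISO 145-5 measurement model.

```python
import numpy as np

# Monte Carlo propagation of distributions (in the spirit of JCGM 101):
# measurand D = L1 - L2, with normally distributed input uncertainties.
# The levels and standard uncertainties below are hypothetical.
rng = np.random.default_rng(3)
N = 200_000
L1 = rng.normal(78.0, 0.5, N)   # source-room level, dB, assumed u = 0.5 dB
L2 = rng.normal(45.0, 0.7, N)   # receiving-room level, dB, assumed u = 0.7 dB
D = L1 - L2                     # measurand evaluated for every sample

estimate = D.mean()                          # best estimate of the measurand
u = D.std(ddof=1)                            # standard uncertainty
low, high = np.percentile(D, [2.5, 97.5])    # 95 % coverage interval
```

For this linear model the Monte Carlo result matches the analytical propagation, u = sqrt(0.5² + 0.7²) ≈ 0.86 dB, which is exactly the kind of cross-check the project's statistical validation performs.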

Relevance:

100.00%

Publisher:

Abstract:

Dynamic phenomena can endanger the integrity of aerospace structures, and engineers have developed different strategies to analyze them. One of the major problems in engineering is how to approach a structural dynamics problem. This thesis presents various dynamic phenomena and proposes methods to estimate or simulate their behavior through deterministic and random parametric analysis. The cases range from simple problems with few degrees of freedom, used to develop the different strategies and tools, to highly dynamic phenomena involving nonlinear behavior, damage and failure. The first examples cover a wide range of dynamic phenomena, such as the vibration analysis of mass elements, including impacts and contacts, and the analysis of a beam under harmonic load, to which random parameters are added to represent a lack of knowledge or uncertainty about them. The thesis introduces concepts and applies different methods, such as the finite element method (FEM), solved with both implicit and explicit schemes, and parametric and statistical analysis using the Monte Carlo technique. Once the tools and analysis strategies are in place, more complex phenomena are studied, such as low-velocity impact on composite materials, where the aim is to evaluate the residual strength and hence the damage tolerance of the structure; such an event can be caused by a dropped tool, hail or debris on the runway. Another phenomenon analyzed also occurs at an airport: the collision with a frangible device, which must break under certain loads and yet withstand others. Finally, the whole methodology is applied to simulating and analyzing a possible in-flight incident, the blade-loss phenomenon of a turboprop. This is a very particular event in which the structure must withstand complex and exceptional loads, with which the aircraft must be able to complete the flight successfully. The analysis includes nonlinear behavior, damage and several types of failure, and seeks to identify the key parameters in the failure sequence. The event is analyzed using the more usual deterministic structural analyses and also with techniques such as the Monte Carlo method, which makes it possible to study uncertainties in the parameters through random variables. Among the parameters studied are the size of the lost blade, the propeller rotational speed and the angular position at which the failure occurs, and the stiffness and strength of the engine mounts; even the structural damping of the system is taken into account. These different analysis strategies yield valuable and interesting results that have been the subject of several publications.
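The deterministic-then-Monte-Carlo strategy the thesis applies can be illustrated on the simplest possible dynamic quantity, the natural frequency of a one-degree-of-freedom system, f = sqrt(k/m) / 2π. The stiffness, mass and scatter levels below are invented for the sketch and unrelated to the turboprop analysis.

```python
import numpy as np

# Deterministic analysis first, then Monte Carlo with random parameters.
rng = np.random.default_rng(4)

k_nom, m_nom = 2.0e6, 50.0                    # stiffness N/m, mass kg (hypothetical)
f_det = np.sqrt(k_nom / m_nom) / (2 * np.pi)  # deterministic natural frequency, Hz

# Monte Carlo: assumed 10 % scatter on stiffness, 2 % on mass.
k = rng.normal(k_nom, 0.10 * k_nom, 100_000)
m = rng.normal(m_nom, 0.02 * m_nom, 100_000)
f = np.sqrt(k / m) / (2 * np.pi)              # frequency for every random sample

f_mean, f_std = f.mean(), f.std(ddof=1)       # statistics of the response
```

The same pattern scales up directly: replace the closed-form frequency with a full FEM solve per sample, and the output statistics quantify how parameter uncertainty propagates into the structural response.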

Relevance:

100.00%

Publisher:

Abstract:

On cover: QC manual.

Relevance:

100.00%

Publisher:

Abstract:

Solving many scientific problems requires effective regression and/or classification models for large high-dimensional datasets. Experts in these problem domains (e.g. biologists, chemists, financial analysts) have insights that can be helpful in developing powerful models, but they need a modelling framework that helps them use these insights. Data visualisation is an effective technique for presenting data and eliciting feedback from the experts. A single global regression model can rarely capture the full behavioural variability of a huge multi-dimensional dataset. Instead, local regression models, each focused on a separate area of the input space, often work better, since the behaviour of different areas may vary. Classical local models such as Mixture of Experts segment the input space automatically, which is not always effective and also lacks the involvement of the domain experts needed to guide a meaningful segmentation of the input space. In this paper we address this issue by allowing domain experts to segment the input space interactively using data visualisation. The segmentation obtained is then used to develop effective local regression models.
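The idea can be sketched as follows: a domain expert, looking at a visualisation, chooses a split of the input space (here simply x = 0, standing in for an interactively drawn segmentation), and a separate linear model is fitted per segment. The piecewise data are synthetic, and the sketch is an illustration of the principle, not the paper's framework.

```python
import numpy as np

# Synthetic data with a structural break at x = 0: two linear regimes.
rng = np.random.default_rng(5)
x = rng.uniform(-2, 2, 400)
y = np.where(x < 0, 1.0 + 0.5 * x, 1.0 + 3.0 * x) + rng.normal(0, 0.05, 400)

def fit_line(xs, ys):
    """Least-squares slope and intercept."""
    A = np.column_stack([xs, np.ones_like(xs)])
    return np.linalg.lstsq(A, ys, rcond=None)[0]

def sse(xs, ys, coef):
    """Sum of squared errors of a fitted line on (xs, ys)."""
    A = np.column_stack([xs, np.ones_like(xs)])
    return float(((A @ coef - ys) ** 2).sum())

# One global model vs. one local model per expert-chosen segment (split at 0).
global_coef = fit_line(x, y)
left, right = x < 0, x >= 0
local_sse = (sse(x[left], y[left], fit_line(x[left], y[left]))
             + sse(x[right], y[right], fit_line(x[right], y[right])))
global_sse = sse(x, y, global_coef)
```

When the segmentation matches the underlying regimes, the combined error of the local models is far below that of the single global fit, which is the motivation for letting experts draw the segmentation.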

Relevance:

100.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62P10, 92C20

Relevance:

100.00%

Publisher:

Abstract:

In the early 1990s, documentalists began to take an interest in applying mathematics and statistics to bibliographic units. F. J. Coles and Nellie B. Eales carried out the first study in 1917, on a group of document titles whose analysis considered the country of origin (White, p. 35). In 1923, E. Wyndham Hulme became the first person to use the term "bibliographic statistics", and he proposed using statistical methods to obtain parameters for understanding the process of written communication and the nature and course of the development of a discipline. To that end he began by counting a number of documents and analyzing various facets of the written communication employed in them (Ferrante, p. 201). In a paper written in 1969, Alan Pritchard proposed the term bibliometrics to replace Hulme's "bibliographic statistics", arguing that the latter term was ambiguous, not very descriptive, and liable to be confused with pure statistics or statistics of bibliographies. He defined bibliometrics as the application of mathematics and statistical methods to books and other documents (p. 348-349), and the term has been in use ever since.

Relevance:

100.00%

Publisher:

Abstract:

The main objective of this research is to evaluate the developmental profile, and detect possible deviations indicating high biological, social and/or environmental risk, of Indigenous children under three years of age who attend the Child Centers of Good Living in the Indigenous community of Salasaca, Tungurahua. The study was conducted with 90 children attending these centers. The Bayley Scales of Infant Development III (BSID-III) were applied, a standardized international test used in scientific research on child development that assesses developmental status and identifies deficits in children from 0 to 42 months of age. Two questionnaires were applied, one to the Technical Assistants in Child Development and one to the staff of the daily-care Child Centers. A further questionnaire was given to the mothers at these centers to establish their level of knowledge of factors that may influence child development in early childhood, as well as the parenting patterns characteristic of the Indigenous community of Salasaca, Tungurahua. Analytical, synthetic and statistical methods were used for the theoretical framework of this work and for the interpretation of the results. The scope of this research is socio-educational; it benefits society in general, and in particular children in infancy and the Indigenous community of Salasaca...

Relevance:

100.00%

Publisher:

Abstract:

The discovery of protein variation is an important strategy in disease diagnosis within the biological sciences. The current benchmark for elucidating information from multiple biological variables is the so-called "omics" disciplines of the biological sciences. Such variability is uncovered through multivariable data-mining techniques, which fall into two primary categories: machine learning strategies and statistics-based approaches. Proteomic studies typically produce hundreds or thousands of variables, p, per observation, n, depending on the analytical platform or method employed to generate the data. Many classification methods are limited by an n ≪ p constraint and therefore require pre-treatment to reduce the dimensionality prior to classification. Recently, machine learning techniques have gained popularity in the field for their ability to classify unknown samples successfully. One limitation of such methods is the lack of a functional model allowing meaningful interpretation of the results in terms of the features used for classification. This problem might be solved using a statistical model-based approach, in which the importance of each individual protein is explicit and the proteins are combined into a readily interpretable classification rule, without relying on a black-box approach. Here we incorporate the statistical dimension-reduction techniques Partial Least Squares (PLS) and Principal Components Analysis (PCA), followed by both statistical and machine learning classification methods, and compare them to a popular machine learning technique, Support Vector Machines (SVM). Both PLS and SVM demonstrate strong utility for proteomic classification problems.
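A minimal version of the dimension-reduce-then-classify workflow, with synthetic data in the n ≪ p regime: PCA by SVD, followed by a simple classifier in the score space. The nearest-centroid rule here is only a stand-in for the SVM and PLS-based classifiers compared in the study, and all the data are invented.

```python
import numpy as np

# Synthetic "proteomic" data: 60 samples, 200 variables (n << p), with a
# class difference confined to the first 5 variables. All values are made up.
rng = np.random.default_rng(6)
n, p = 60, 200
y = np.array([0] * 30 + [1] * 30)
X = rng.normal(0, 1, (n, p))
X[y == 1, :5] += 4.0            # disease signal in 5 "proteins" only

# Dimension reduction: PCA via SVD of the centred data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T          # first two principal-component scores

# Classification in the reduced space: nearest-centroid rule
# (a simple stand-in for SVM or PLS-DA).
c0 = scores[y == 0].mean(axis=0)
c1 = scores[y == 1].mean(axis=0)
pred = (np.linalg.norm(scores - c1, axis=1)
        < np.linalg.norm(scores - c0, axis=1)).astype(int)
accuracy = (pred == y).mean()
```

Because the between-class separation dominates the leading components, two PC scores suffice here; with weaker signals, supervised reduction such as PLS typically retains the discriminant direction better than PCA.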