9 results for Standard Mouse Neutralization Test
at Universidad Polit
Abstract:
The present study analyzed the differences in throwing distance with heavy and light medicine balls and in throwing velocity between handball players of different competitive and professional levels. Likewise, the relationship between three throwing tests of progressive specificity was analyzed: throwing with a heavy medicine ball (TH), throwing with a light medicine ball (TL) and throwing velocity (TV). For this purpose, sixty-five professional (P), semi-professional (S) and non-professional (N) players were evaluated.
Abstract:
Photovoltaic modules based on thin film technology are gaining importance in the photovoltaic market, and module installers and plant owners have increasingly begun to request methods of performing module quality control. These modules pose additional problems for measuring power under standard test conditions (STC), beyond the problems caused by the temperature of the module and the ambient variables. The main difficulty is that the modules' power output may vary depending both on the amount of time they have been exposed to the sun during recent hours and on their history of sunlight exposure. In order to assess the current state of a module, it is necessary to know its sunlight exposure history. Thus, an easy-to-perform testing method that ensures the repeatability of the measurements of the generated power is needed. This paper examines different tests performed on commercial thin film PV modules of CIS, a-Si and CdTe technologies in order to find the best way to obtain such measurements. A method for obtaining indoor measurements of these technologies that takes into account periods of sunlight exposure is proposed. Special attention is paid to CdTe as a fast-growing technology in the market.
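The practical difficulty described above is that the measured power must be referred back to STC while irradiance and module temperature at the time of measurement differ from the reference values. The following is a minimal sketch in Python of a common first-order irradiance and temperature correction; the function name and the power temperature coefficient value are illustrative assumptions and are not taken from the paper, whose proposed procedure additionally accounts for sunlight exposure history.

    # Minimal sketch: translate a measured PV power value to STC (1000 W/m2, 25 degC).
    # This first-order correction is a common approximation; the paper's actual indoor
    # procedure is more involved. gamma (power temperature coefficient, in 1/degC) is
    # module-specific; the value used below is only an assumption.

    G_STC = 1000.0   # reference irradiance, W/m2
    T_STC = 25.0     # reference module temperature, degC

    def power_at_stc(p_meas, g_meas, t_module, gamma=-0.0035):
        """Scale a measured power to STC irradiance and temperature (simplified)."""
        p_g = p_meas * (G_STC / g_meas)                       # linear irradiance scaling
        return p_g / (1.0 + gamma * (t_module - T_STC))       # temperature correction

    # Example: 78 W measured at 850 W/m2 and 48 degC module temperature.
    print(round(power_at_stc(78.0, 850.0, 48.0), 1))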
Abstract:
Thin film photovoltaic (TF) modules have gained importance in the photovoltaic (PV) market, and new PV plants increasingly use TF technologies. In order to have a reliable sample of a PV module population, a large number of modules must be measured. There is a wide variety of materials used in TF technology: some of these modules are made of amorphous or microcrystalline silicon, while others are made of CIS or CdTe. Not all of these materials respond in the same way under standard test conditions (STC) power measurements. The power output of the modules may vary depending on both the extent and the history of their sunlight exposure. Thus, a testing method adapted to each TF technology is necessary, and this test must guarantee the repeatability of the measurements of generated power. This paper shows the responses of different commercial TF PV modules to sunlight exposure. Several test procedures were performed in order to find the best methodology for obtaining measurements of TF PV modules at STC in the easiest way. A methodology for indoor measurements adapted to these technologies is described.
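Since the power output of a TF module drifts with its recent sunlight exposure, a repeatable STC measurement implicitly requires checking that the module has stabilised before the reading is recorded. Below is a minimal sketch in Python of such a stabilisation check (consecutive readings agreeing within a small tolerance); the 2% threshold, window size and function name are illustrative assumptions rather than the criteria adopted in the paper.

    # Minimal sketch: decide whether a sequence of maximum-power measurements taken
    # after successive light-soaking intervals can be considered stable. A common style
    # of criterion is that the spread of the last few readings stays within a small
    # fraction of their mean; the 2% tolerance here is only an assumption.

    def is_stabilised(p_max_readings, window=3, tolerance=0.02):
        """Return True if the last `window` readings agree within `tolerance`."""
        if len(p_max_readings) < window:
            return False
        recent = p_max_readings[-window:]
        spread = max(recent) - min(recent)
        mean = sum(recent) / len(recent)
        return spread / mean <= tolerance

    # Example: module power (W) measured after repeated exposure periods.
    readings = [71.2, 73.8, 74.9, 75.1, 75.0]
    print(is_stabilised(readings))   # True: last three readings agree within 2%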
Abstract:
Underground coal mine explosions generally arise from the ignition of a methane/air mixture. Such an explosion can also trigger a subsequent coal dust explosion. Traditionally, these explosions have been fought by eliminating one or several of the factors needed for the explosion to take place. Although several preventive measures are taken to prevent explosions, other measures should be considered to reduce their effects or even to extinguish the flame front. Unlike other protection methods, which remove one or two of the explosion triangle elements (the ignition source, the oxidizing agent and the fuel), explosion barriers act on all of them: they reduce the quantity of coal in suspension, cool the flame front, and the steam generated by vaporization displaces the oxygen present in the flame. Passive water barriers are autonomous protection systems against explosions that reduce the effects of methane and/or flammable dust explosions to a satisfactory safety level. The barriers are activated by the pressure wave generated by the explosion, which destroys the barrier troughs and produces a uniform dispersion of the extinguishing agent throughout the gallery section in a quantity sufficient to extinguish the explosion flame. Full-scale tests have been carried out in the Polish Barbara experimental mine at the GIG Central Mining Institute in order to determine the requirements and the optimal installation conditions of these devices for small-section galleries, which are very frequent in Spanish coal mines. The full-scale test results have been analyzed to understand the timing and development of the explosion, in order to assess the use of water barriers in the typical small cross-section Spanish galleries. Several arrangements of water barriers have been designed and tested to verify the effectiveness of the explosion suppression in each case. The results obtained demonstrate the efficiency of the water barriers in stopping the flame front even with smaller amounts of water than those established by the European standard. According to the tests performed, water barrier activation times are between 0.52 s and 0.78 s and flame propagation speeds are between 75 m/s and 80 m/s. The maximum pressures (Pmax) obtained in the full-scale tests varied between 0.2 bar and 1.8 bar. Passive barriers protect effectively against the spread of the flame, but they cannot safeguard the stretch of gallery between the ignition source and the first row of water troughs or bags; moreover, even downstream of the barrier the pressure may remain high after the flame front has been extinguished.
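The activation times and flame speeds reported above give a rough sense of how far the flame front can travel before the first troughs discharge, which is what constrains where a barrier can usefully be placed. The following back-of-envelope sketch in Python uses only the figures quoted in the abstract and is purely illustrative; it is not part of the paper's analysis.

    # Back-of-envelope: distance covered by the flame front during barrier activation,
    # using the ranges reported in the abstract (activation 0.52-0.78 s, flame speed
    # 75-80 m/s). Purely illustrative; gallery-specific design rules are not modelled.

    activation_times = (0.52, 0.78)   # s
    flame_speeds = (75.0, 80.0)       # m/s

    d_min = activation_times[0] * flame_speeds[0]
    d_max = activation_times[1] * flame_speeds[1]
    print(f"Flame travel during activation: {d_min:.0f} m to {d_max:.0f} m")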
Abstract:
The International Standard ISO 140-5 on field measurements of airborne sound insulation of façades establishes that the directivity of the measurement loudspeaker should be such that the variation in the local direct sound pressure level (ΔSPL) on the sample is ΔSPL < 5 dB (or ΔSPL < 10 dB for large façades). This condition is usually not easy to accomplish, nor is it easy to verify whether the loudspeaker produces such a uniform level. Direct sound pressure levels on the ISO standard façade essentially depend on the distance and directivity of the loudspeaker used. This paper presents a comprehensive analysis of the test geometry for measuring sound insulation and explains how the loudspeaker directivity, combined with distance, affects the acoustic level distribution on the façade. The first sections of the paper are focused on analysing the measurement geometry and its influence on the direct acoustic level variations on the façade. The most favourable and least favourable positions to minimise these direct acoustic level differences are found, and the angles covered by the façade in the reference system of the loudspeaker are also determined. Then, the maximum dimensions of the façade that meet the conditions of the ISO 140-5 standard are obtained for the ideal omnidirectional sound source and for the piston radiating in an infinite baffle, which is chosen as the typical radiation pattern for loudspeakers. Finally, a complete study of the behaviour of different loudspeaker radiation models (such as those usually utilised in ISO 140-5 measurements) is performed, comparing their radiation maps on the façade in order to determine their maximum dimensions and the most appropriate radiation configurations.
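Since ΔSPL on the façade is governed jointly by the source-to-façade distance and the loudspeaker directivity, the effect can be estimated numerically. Below is a minimal sketch in Python that compares an ideal omnidirectional source with a piston in an infinite baffle over a rectangular façade, with the loudspeaker aimed at the façade centre at 45° incidence as prescribed by ISO 140-5; the façade size, distance, frequency and piston radius are illustrative assumptions, not values from the paper.

    # Minimal sketch: estimate the direct-level variation (delta SPL) on a facade for
    # an ideal omnidirectional source and for a baffled-piston model, as a function of
    # loudspeaker distance and directivity. Geometry, frequency and piston radius are
    # illustrative assumptions; ISO 140-5 asks for delta SPL < 5 dB on the sample.

    import numpy as np
    from scipy.special import j1

    def piston_directivity(theta, k, a):
        """|2 J1(ka sin(theta)) / (ka sin(theta))| for a piston in an infinite baffle."""
        x = k * a * np.sin(theta)
        d = np.ones_like(x)
        nz = np.abs(x) > 1e-9
        d[nz] = np.abs(2.0 * j1(x[nz]) / x[nz])
        return d

    def delta_spl(width=4.0, height=3.0, distance=7.0, freq=2000.0, a=0.1, piston=True):
        """Max-min direct SPL (dB) over a facade grid, source aimed at the centre
        with 45 degree incidence in the horizontal plane."""
        c = 343.0
        k = 2.0 * np.pi * freq / c
        # Facade grid in its own plane (x across, z up), centred at the origin.
        x, z = np.meshgrid(np.linspace(-width / 2, width / 2, 61),
                           np.linspace(-height / 2, height / 2, 61))
        # Source position: 45 deg incidence on the centre, 'distance' from the centre.
        src = np.array([-distance * np.sin(np.pi / 4), -distance * np.cos(np.pi / 4), 0.0])
        # Vector from the source to each facade point (facade lies in the y = 0 plane).
        vx, vy, vz = x - src[0], 0.0 - src[1], z - src[2]
        r = np.sqrt(vx**2 + vy**2 + vz**2)
        # Angle between the source axis (towards the facade centre) and each point.
        axis = -src / np.linalg.norm(src)
        cos_t = (vx * axis[0] + vy * axis[1] + vz * axis[2]) / r
        theta = np.arccos(np.clip(cos_t, -1.0, 1.0))
        level = -20.0 * np.log10(r)                          # spherical spreading
        if piston:
            level += 20.0 * np.log10(piston_directivity(theta, k, a) + 1e-12)
        return float(level.max() - level.min())

    print(round(delta_spl(piston=False), 1), "dB (omnidirectional)")
    print(round(delta_spl(piston=True), 1), "dB (baffled piston)")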
Abstract:
Testing is nowadays the most widely used technique to validate software and assess its quality. It is integrated into all practical software development methodologies and plays a crucial role in the success of any software project. From the smallest units of code to the most complex components, their integration into a software system and later deployment, all pieces of a software product must be tested thoroughly before it can be released. The main limitation of software testing is that it remains a mostly manual task, representing a large fraction of the total development cost. In this scenario, test automation is paramount to alleviate such high costs. Test case generation (TCG) is the process of automatically generating test inputs that achieve high coverage of the system under test. Among a wide variety of approaches to TCG, this thesis focuses on structural (white-box) TCG, where one of the most successful enabling techniques is symbolic execution. In symbolic execution, the program under test is executed with its input arguments being symbolic expressions rather than concrete values. This thesis relies on a previously developed constraint-based TCG framework for imperative object-oriented programs (e.g., Java), in which the imperative program under test is first translated into an equivalent constraint logic program, and then this translated program is symbolically executed by relying on the standard evaluation mechanisms of Constraint Logic Programming (CLP), extended with special treatment for dynamically allocated data structures. Improving the scalability and efficiency of symbolic execution constitutes a major challenge. It is well known that symbolic execution quickly becomes impractical due to the large number of paths that must be explored and the size of the constraints that must be handled. Moreover, symbolic execution-based TCG tends to produce an unnecessarily large number of test cases when applied to medium-sized or large programs. The contributions of this dissertation can be summarized as follows. (1) A compositional approach to CLP-based TCG is developed which overcomes the inter-procedural path explosion by separately analyzing each component (method) in a program under test, storing the results as method summaries and incrementally reusing them to obtain whole-program results. A similar compositional strategy that relies on program specialization (partial evaluation) is also developed for the state-of-the-art symbolic execution tool Symbolic PathFinder (SPF). (2) Resource-driven TCG is proposed as a methodology that uses resource consumption information to drive symbolic execution towards those parts of the program under test that comply with a user-provided resource policy, avoiding the exploration of those parts of the program that violate such a policy. (3) A generic methodology to guide symbolic execution towards the most interesting parts of a program is proposed, which uses abstractions as oracles to steer symbolic execution through those parts of the program under test that interest the programmer/tester most. (4) A new heap-constraint solver is proposed, which efficiently handles heap-related constraints and aliasing of references during symbolic execution and greatly outperforms the state-of-the-art standard technique known as lazy initialization. (5) All of the techniques above have been implemented in the PET system (and some of them in the SPF tool). Experimental evaluation has confirmed that they considerably improve the scalability and efficiency of symbolic execution and TCG.
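As described above, symbolic execution runs the program on symbolic inputs, collects a path condition per execution path, and derives a concrete test input for each feasible path. The following is a minimal toy sketch in Python of that idea; it is not the CLP-based framework, the PET system or Symbolic PathFinder, and the example function and the brute-force "solver" are purely illustrative.

    # Minimal sketch of symbolic-execution-style test case generation for a toy
    # function. Each execution path of `classify` is described by its path condition;
    # a concrete input satisfying each condition is then searched for in a small
    # domain. This only illustrates the idea, not the CLP-based framework or the
    # tools (PET, SPF) described in the abstract.

    def classify(x, y):
        if x > 0:
            if y > x:
                return "y dominates"
            return "x dominates"
        return "non-positive x"

    # Path conditions collected as if the inputs were symbolic.
    paths = [
        ("y dominates",    lambda x, y: x > 0 and y > x),
        ("x dominates",    lambda x, y: x > 0 and not (y > x)),
        ("non-positive x", lambda x, y: not (x > 0)),
    ]

    # Naive "constraint solving": find one witness per path in a bounded domain.
    domain = range(-3, 4)
    for label, condition in paths:
        witness = next(((x, y) for x in domain for y in domain if condition(x, y)), None)
        print(f"path '{label}': test input {witness} -> {classify(*witness)}")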
Abstract:
For safety barriers, the load-bearing capacity of the glass when subjected to soft-body impact should be verified. The soft-body pendulum test has become a testing standard to classify safety glass plates. However, the classification of safety glass does not consider the structural behavior when one sheet of a laminated glass plate is broken; in situations where the replacement of the plate may not be urgent, this structural behavior should be evaluated. The main objective of this paper is to present the structural behavior of laminated glass plates through modal tests and human impact tests, including the post-fracture behavior for the laminated cases. Good reproducibility and repeatability are obtained. Two main aspects of the structural behavior can be observed: the increase of the rupture load for laminated plates after the failure of the first sheet, and some similarities with the behavior of a tempered monolithic plate of equivalent thickness.
Abstract:
This document details the planning and development of a package that complies with the S4 programming standard of the R language. The package consists of a set of methods and classes for generating multiple-choice exams and their solutions from an xls file, which plays the role of a database. The proposed design is object-oriented and develops a set of classes that represent the contents of a multiple-choice assessment: statements, questions and answers. These classes have been grouped so that the user has a complete and clear view of them, placing together data storage and functions with similar tasks. A simple prototype has been implemented with the basic functions needed to generate the tests. In addition, the documentation required to create the package has been generated: every method has a help page that can be consulted from a terminal running R, and this documentation includes execution examples for each method.
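For illustration only, the sketch below reproduces the same idea in Python (the described package is an R S4 package, so none of these class or function names belong to it): a small in-memory question bank replaces the xls file, and an exam plus its answer key are generated with shuffled answer choices.

    # Minimal sketch (Python analogue; the described package is an R S4 package).
    # A small question bank is used to generate a randomised exam and its answer key.
    # Class and function names are illustrative, not those of the R package.

    import random
    from dataclasses import dataclass

    @dataclass
    class Question:
        statement: str
        choices: list          # possible answers
        correct: int           # index into `choices`

    def generate_exam(bank, n_questions, seed=None):
        """Pick questions, shuffle their choices, and return (exam, answer_key)."""
        rng = random.Random(seed)
        selected = rng.sample(bank, n_questions)
        exam, key = [], []
        for q in selected:
            order = rng.sample(range(len(q.choices)), len(q.choices))
            exam.append((q.statement, [q.choices[i] for i in order]))
            key.append(order.index(q.correct))   # new position of the correct answer
        return exam, key

    bank = [
        Question("2 + 2 = ?", ["3", "4", "5"], correct=1),
        Question("Capital of Spain?", ["Madrid", "Lisbon", "Paris"], correct=0),
        Question("R's OOP system used here?", ["S4", "S3", "R6"], correct=0),
    ]
    exam, key = generate_exam(bank, n_questions=2, seed=42)
    for (statement, choices), answer in zip(exam, key):
        print(statement, choices, "-> correct:", choices[answer])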
Abstract:
A study of image coding based on the HEVC (High Efficiency Video Coding) standard is presented. The project focuses on the hybrid encoder and, in particular, on the inverse cosine transform, which is applied both in the encoder and in the decoder. The need to encode video arises from the appearance of image sequences as digital signals. The main problem with video is the amount of bits produced by the encoding process: as image quality increases, the amount of information to be encoded grows exponentially. The use of transforms in digital image processing has increased over the years, and the inverse cosine transform has become the most widely used method in the field of image and video coding; its advantages make it possible to obtain high compression ratios at very low cost. Transform theory has improved image processing: in transform coding, an image is divided into blocks and each block is mapped to a set of coefficients. This coding exploits the statistical dependencies of the images to reduce the amount of data. The project reviews the evolution of the different video coding standards over the years, and analyses the hybrid encoder and the HEVC standard in greater depth.
The final objective of this final degree project is the implementation of the core of a specific processor for executing the inverse cosine transform in a video decoder compatible with the HEVC standard. This objective is reached by following a series of stages in which requirements are progressively added. This approach allows the hardware designer to acquire deeper experience and knowledge of the final architecture.
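The core operation such a processor executes is the inverse transform of a block of coefficients. The sketch below, in Python, shows the separable row/column structure of a 2-D inverse DCT that a hardware core exploits; note that HEVC actually specifies finite-precision integer approximations of the DCT with intermediate scaling and shifts, which this floating-point illustration does not reproduce.

    # Minimal sketch: separable 2-D inverse DCT of an NxN coefficient block.
    # HEVC's real inverse transform uses integer-approximated basis matrices and
    # bit-shift scaling between the two 1-D stages; this floating-point version only
    # illustrates the separable row/column structure exploited by a hardware core.

    import numpy as np

    def dct_matrix(n):
        """Orthonormal DCT-II basis matrix C (rows are basis vectors)."""
        c = np.zeros((n, n))
        for k in range(n):
            scale = np.sqrt(1.0 / n) if k == 0 else np.sqrt(2.0 / n)
            for i in range(n):
                c[k, i] = scale * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
        return c

    def inverse_dct_2d(coeffs):
        """Residual block = C^T . coeffs . C (column stage, then row stage)."""
        c = dct_matrix(coeffs.shape[0])
        return c.T @ coeffs @ c

    # Example: a 4x4 block with a DC value and one horizontal AC coefficient.
    coeffs = np.zeros((4, 4))
    coeffs[0, 0] = 32.0   # DC
    coeffs[0, 1] = 8.0    # one horizontal AC coefficient
    block = inverse_dct_2d(coeffs)
    print(np.round(block, 2))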