14 results for Post and core technique

at Universidad Politécnica de Madrid


Relevância:

100.00%

Publicador:

Resumo:

Oxygen 1s excitation and ionization processes in the CO2 molecule have been studied with dispersed and non-dispersed fluorescence spectroscopy as well as with the vacuum ultraviolet (VUV) photon–photoion coincidence technique. The intensity of the neutral O emission line at 845 nm shows particular sensitivity to core-to-Rydberg excitations and core–valence double excitations, while shape resonances are suppressed. In contrast, the partial fluorescence yield in the wavelength window 300–650 nm and the excitation functions of selected O+ and C+ emission lines in the wavelength range 400–500 nm display all of the absorption features. The relative intensity of ionic emission in the visible range increases towards higher photon energies, which is attributed to O 1s shake-off photoionization. VUV photon–photoion coincidence spectra reveal major contributions from the C+ and O+ ions and a minor contribution from C2+. No conclusive changes in the intensity ratios among the different ions are observed above the O 1s threshold. The line shape of the VUV–O+ coincidence peak in the mass spectrum carries some information on the initial core excitation.


Resumo:

Individuals who use wheelchairs as their main means of mobility have a high incidence (73%) of shoulder pain (SP) owing to overuse and the repetitive propulsion movement. There are numerous diagnostic methods for the detection of shoulder pathologies; however, the literature notes the need for a reliable, noninvasive test and suggests thermography as a suitable technique for joint pain evaluation. Infrared thermography (IRT) provides information about physiological processes by studying skin temperature (Tsk) distributions. Due to the high correlation of skin temperature between both sides of the body, thermal asymmetries between contralateral flanks are an indicator of underlying pathologies or physical dysfunctions. The reliability of infrared thermography has been studied in healthy subjects, but no studies have analyzed the reliability of IRT in wheelchair users (WCUs). The special characteristics of people with disabilities (sweating and thermoregulation problems, blood distribution or medication) make it necessary to study the factors affecting the application of IRT in WCUs. Discrepant reports exist on the benefits of, or damage resulting from, physical exercise and its relationship to shoulder overuse injuries in WCUs. Recent studies report that overhead sports increase the risk of rotator cuff tears in wheelchair patients with paraplegia.
Since there is no agreement in the literature, the thermographic profile of wheelchair athletes and nonathletes and its relation to shoulder pain should also be analysed. To date, infrared thermographic studies during exercise have been carried out only with able-bodied populations. Understanding the thermographic response to wheelchair exercise in relation to shoulder pain will offer insight into the development of shoulder pain, which is necessary for appropriate interventions. The first study presented in this thesis demonstrates that the reliability of IRT in WCUs varies depending on the areas of the body that are analyzed. Moreover, it corroborates that IRT is a noninvasive and noncontact technique for measuring Tsk, with which research on WCUs can advance. The second study provides a thermal profile of WCUs. Nonathletic subjects presented higher side-to-side skin temperature differences (ΔTsk) than athletes, and both had greater ΔTsk than the able-bodied results published in the literature. Nonathletes also showed larger Wheelchair Users Shoulder Pain Index (WUSPI) scores than athletes. The shoulder region of interest (ROI) was the area with the highest ΔTsk of the regions measured. The analysis of the athletes' Tsk showed that some ROIs are related to shoulder pain. These findings help to understand the thermal map in WCUs. Finally, the third study evaluated the thermal response of WCUs to exercise. There were significant differences in Tsk between the pre-test and 10 minutes post-test in 12 ROIs, and between the post-test and 10 minutes post-test in most of the ROIs. These differences were attenuated when ΔTsk was compared before and after exercise. Skin temperature tended to decrease immediately after the test, followed by a significant increase 10 minutes after completing the exercise.
The ΔTsk versus shoulder pain analysis yielded significant inverse relationships in 5 of the 26 ROIs. No significant correlations were found between propulsion variables and the results of the WUSPI questionnaire. All kinematic variables were significantly correlated with the temperature asymmetries in multiple ROIs. These results indicate that high-performance wheelchair athletes exhibit a heat production capacity similar to that of the able-bodied population; however, they presented a thermal pattern more characteristic of prolonged exercise than of brief exercise. This work contributes to the understanding of temperature changes in wheelchair athletes during exercise and provides relevant information for sports and rehabilitation programs.
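The side-to-side comparison at the heart of this study can be sketched in a few lines: the asymmetry metric is simply the absolute difference between contralateral skin temperatures for each region of interest. The ROI names, temperatures and the screening threshold below are invented placeholders for illustration, not values from the study.

```python
PAIRED_ROIS = {
    # ROI name: (left Tsk, right Tsk) in degrees C -- illustrative values only
    "shoulder": (33.1, 34.0),
    "elbow":    (32.4, 32.6),
    "wrist":    (31.8, 31.9),
}

def asymmetries(rois, threshold=0.5):
    """Absolute side-to-side difference per ROI, plus ROIs at or above threshold."""
    dtsk = {name: round(abs(left - right), 2) for name, (left, right) in rois.items()}
    flagged = [name for name, d in dtsk.items() if d >= threshold]
    return dtsk, flagged

dtsk, flagged = asymmetries(PAIRED_ROIS)
print(dtsk)     # {'shoulder': 0.9, 'elbow': 0.2, 'wrist': 0.1}
print(flagged)  # ['shoulder']
```

The hypothetical 0.5 °C cut-off only illustrates how an asymmetry map could be screened; the study relates ΔTsk to pain scores statistically rather than by a fixed threshold.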


Resumo:

The emerging use of real-time 3D-based multimedia applications imposes strict quality of service (QoS) requirements on both access and core networks. These requirements and their impact on providing end-to-end 3D videoconferencing services have been studied within the Spanish-funded VISION project, where different scenarios were implemented showing an agile stereoscopic video call that might be offered to the general public in the near future. In view of the requirements, we designed an integrated access and core converged network architecture which provides the requested QoS to end-to-end IP sessions. Novel functional blocks are proposed to control core optical networks, the functionality of the standard ones is redefined, and the signaling is improved to better meet the requirements of future multimedia services. An experimental test-bed to assess the feasibility of the solution was also deployed. In this test-bed, the set-up and release of end-to-end sessions meeting specific QoS requirements are shown, and the impact of QoS degradation is quantified in terms of user-perceived quality degradation. In addition, scalability results show that the proposed signaling architecture is able to cope with a large number of requests while introducing almost negligible delay.


Resumo:

Wireless communication is the transfer of information from one place to another without using wires. From the earliest times, humans have felt the need to develop techniques for remote communication; from this need arose smoke signals, communication by reflecting sunlight in mirrors, and so on. Today, telecommunications relies on electronic devices such as the telephone, television, radio and computer. Radio and television are used for one-way communication, while the telephone and computer allow two-way communication. Wireless networks offer almost unlimited mobility: the network can be accessed almost anywhere, anytime, whereas wired networks restrict services to fixed areas. Demand for wireless services is growing very fast; everybody wants broadband services anywhere, anytime. WiMAX (Worldwide Interoperability for Microwave Access), a broadband wireless technology based on IEEE 802.16-2004 and IEEE 802.16e-2005, appears to meet this demand. WiMAX is a system that allows wireless data transmission in areas with a radius of up to 48 km. It is designed as a wireless alternative to ADSL and as a way to connect nodes in wireless metropolitan area networks. Unlike wireless systems that are limited in most cases to about 100 meters, it provides greater coverage and more bandwidth. WiMAX promises to achieve high data transmission rates over large areas with a large number of users. As an alternative to common broadband access networks such as DSL or Wi-Fi, it can quickly bring broadband access to rural areas and developing regions around the world. This paper is a study of WiMAX technology and its market situation. First, the paper explains the technical aspects of WiMAX, giving an overview of the WiMAX standards, the physical layer, the MAC layer and the WiMAX network architecture.
Second, the paper addresses the market, providing an overview of the development and deployment of WiMAX; to conclude, the future development trend of WiMAX is discussed.

ABSTRACT: Wireless communication means the transfer of information from one place to another without a physical medium such as a cable. Going back to the beginnings of human existence, we see that humans have always felt the need to develop techniques for communicating with their fellows at a distance. From that need arose techniques as ancient as communication by smoke signals or by the reflection of sunlight in mirrors, among others. Human curiosity and the need to communicate at a distance led Alexander Graham Bell to invent the telephone in 1876. The appearance of a device that allowed communication at a distance while hearing the voice of the person one wished to talk to was a revolution not only technologically but also socially, since besides enabling long-distance communication it solved the problem of communicating in "real time". Since that invention, communication technology has advanced significantly, most notably in wireless communications. The first call from a mobile terminal was made in 1973, although it was not until 1983 that such terminals were marketed, which changed society's habits and customs. Since the appearance of the first mobile phone, market growth has been exponential, driving an unprecedented demand for new applications integrated into mobile devices to satisfy the needs that society generates day after day.

After long-distance wireless calls were achieved, the next step was the creation of SMS (Short Message Service), another revolution that also lowered the user's cost of communicating. But the great challenge for the mobile communications industry arose with the appearance of the Internet. Everyone felt the need to connect to that great database that is the Internet anywhere and at any time. The first Internet connections from mobile devices were made through WAP (Wireless Application Protocol), until the appearance of GPRS allowed connection via the TCP/IP protocol. From these connections other technologies have emerged, such as EDGE and HSDPA, which allowed, and still allow, Internet connection from mobile devices. Today, demand for wireless network services is growing rapidly and exponentially; everyone wants broadband services anywhere, at any time. This document analyzes WiMAX (Worldwide Interoperability for Microwave Access), a broadband technology based on the IEEE 802.16 standard and created to serve the emerging demand for broadband: from a technological point of view, giving a view of the technical side of the technology, and from the market point of view, analyzing the deployment and development of the technology from a business perspective. WiMAX allows wireless data transmission in areas with a radius of up to 48 km and is designed as a wireless alternative to ADSL and as a way to connect wireless network nodes in metropolitan areas.

Unlike existing wireless systems, which are mostly limited to a few hundred meters, WiMAX offers greater coverage and more bandwidth to support new applications, as well as high data transmission rates over large areas with a large number of users. It is an alternative to broadband access networks such as DSL or Wi-Fi that can quickly bring broadband access to places such as rural areas or developing regions all over the world. There are two WiMAX technologies: fixed WiMAX (based on the IEEE 802.16d-2004 standard) and mobile WiMAX (based on the IEEE 802.16e-2005 standard). The fixed technology is designed for point-to-multipoint communications, while the mobile one is designed for multipoint-to-multipoint communications. Mobile WiMAX is based on OFDM technology, which offers advantages in terms of latency, spectral efficiency and advanced antenna support. OFDM modulation is very robust against multipath, which is very common in broadcast channels, against fading due to weather conditions, and against RF interference. Once WiMAX had been created, with the right characteristics to meet market demand, the next step had to be taken: convincing the telecommunications industry that this technology really is the solution, so that it backs its introduction into the broadband market for wireless networks. This is where the market study carried out in this document comes into play.

WiMAX faces a demanding market in which, besides meeting the technical demand, it must offer economic profitability to the mobile communications industry, and more specifically to the mobile operators, who within the telecommunications sector must ultimately trust the technology to serve their users; in the end, all users want is for their mobile device to satisfy their needs, regardless of the technology used to access the wireless broadband network. Perhaps the greatest problem WiMAX has faced has been the world's economic situation. WiMAX started out at one of the worst moments, but even so it presents itself as a technology capable of helping the world move forward in these hard times. Finally, one of the current debates in the mobile communications sector is analyzed: WiMAX vs. LTE. As the document shows, neither technology will truly defeat the other; rather, both will be able to coexist and work together.


Resumo:

The study of materials, especially biological ones, by non-destructive means is acquiring growing importance in both scientific and industrial applications. The economic advantages of non-destructive methods are many. There are numerous physical procedures capable of extracting detailed information from the surface of wood with little or no prior treatment and minimal intrusion into the material. Among the various methods, optical and acoustic techniques stand out for their great versatility, relative simplicity and low cost. Starting from the application of simple physical principles of direct surface measurement, and through the development of the most suitable decision algorithms based on statistics, this thesis aims to establish simple and essentially minimum-cost technological solutions for possible application in determining the species and the surface defects of each wood sample, while trying, as far as possible, not to alter its working geometry. The three analyses developed are the following. The first optical method uses the properties of the light scattered by the wood surface when it is illuminated by a diffuse laser. This scattering produces a luminous speckle whose statistical properties allow very precise properties of both the microscopic and the macroscopic structure of the wood to be extracted. The analysis of the spectral properties of the scattered laser light generates more or less regular patterns, related to the anatomical structure, composition, processing and surface texture of the wood under study, that reveal characteristics of the material or of the quality of the processes it has undergone. The use of this type of laser also makes it possible to monitor industrial processes in real time and at a distance, without interfering with other sensors.
The second optical technique employs the statistical and mathematical study of the properties of digital images of the wood surface obtained with a high-resolution scanner. After isolating the most relevant details of the images, various automatic classification algorithms generate databases of the wood species to which the images belong, together with the error margins of those classifications. A fundamental part of the classification tools is based on the precise study of the color bands of the various woods. Finally, a number of acoustic techniques, such as the analysis of pulses from acoustic impact, complement and refine the results obtained with the optical methods described, identifying surface and deep structures in the wood as well as pathologies or deformations, aspects of special utility when wood is used in structures. The usefulness of these techniques is more than proven in industry, even though their application has not spread sufficiently owing to high costs and a lack of standardization of the processes, which means that each analysis is not comparable with its theoretical market equivalent. At present, much research effort tends to take for granted that distinguishing between species is a recognition mechanism proper to human beings, and concentrates the technology on the definition of physical parameters (elastic moduli, electrical or acoustic conductivity, etc.), using very expensive devices that are in many cases complex to apply in the field.

Abstract

The study of materials, especially biological ones, by non-destructive techniques is becoming increasingly important in both scientific and industrial applications.
The economic advantages of non-destructive methods are multiple and clear, given the related costs and resources. There are many physical processes capable of extracting detailed information from the wood surface with little or no previous treatment and minimal intrusion into the material. Among the various methods, acoustic and optical techniques stand out for their great versatility, relative simplicity and low cost. This thesis aims to establish, from the application of simple principles of physics and direct surface measurement, and through the development of the most appropriate decision algorithms based on statistics, simple, minimum-cost technological solutions for possible application in determining the species and the surface defects of each wood sample. Achieving reasonable accuracy without altering the sample's working location or properties is the main objective. There are three different work lines.

Empirical characterization of wood surfaces by means of iterative autocorrelation of laser speckle patterns: A simple and inexpensive method for the qualitative characterization of wood surfaces is presented. It is based on the iterative autocorrelation of laser speckle patterns produced by diffuse laser illumination of the wood surfaces. The method exploits the high spatial frequency content of speckle images. A similar approach with raw conventional photographs taken with ordinary light would be very difficult. A few iterations of the algorithm are necessary, typically three or four, in order to visualize the most important periodic features of the surface. The processed patterns help in the study of surface parameters, in the design of new scattering models and in the classification of wood species.

Fractal-based image enhancement techniques inspired by differential interference contrast microscopy: Differential interference contrast microscopy is a very powerful optical technique for microscopic imaging. Inspired by the physics of this type of microscope, we have developed a series of image processing algorithms aimed at the magnification, noise reduction, contrast enhancement and tissue analysis of biological samples. These algorithms use fractal convolution schemes which provide fast and accurate results with a performance comparable to the best present image enhancement algorithms. These techniques can be used as post-processing tools for advanced microscopy or as a means to improve the performance of less expensive visualization instruments. Several examples of the use of these algorithms to visualize microscopic images of raw pine wood samples with a simple desktop scanner are provided.

Wood species identification using stress-wave analysis in the audible range: Stress-wave analysis is a powerful and flexible technique to study the mechanical properties of many materials. We present a simple technique to obtain information about the species of wood samples using stress-wave sounds in the audible range generated by collision with a small pendulum. Stress-wave analysis has been used for flaw detection and quality control for decades, but its use for material identification and classification is less cited in the literature. Accurate wood species identification is a time-consuming task for highly trained human experts. For this reason, the development of cost-effective techniques for automatic wood classification is a desirable goal. Our proposed approach is fully non-invasive and non-destructive, reducing significantly the cost and complexity of the identification and classification process.
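The iterative autocorrelation idea of the first work line can be illustrated on a toy 1-D signal standing in for one scan line of a speckle image: repeatedly autocorrelating the pattern suppresses random speckle noise while reinforcing periodic surface features. The signal values and the number of iterations below are illustrative assumptions, not data from the thesis.

```python
def autocorrelate(signal):
    """Circular autocorrelation of a mean-centered signal, normalized to 1 at lag 0."""
    n = len(signal)
    mean = sum(signal) / n
    c = [s - mean for s in signal]
    corr = [sum(c[i] * c[(i + lag) % n] for i in range(n)) for lag in range(n)]
    peak = corr[0] if corr[0] else 1.0
    return [v / peak for v in corr]

def iterate_autocorrelation(signal, iterations=3):
    """Apply autocorrelation repeatedly; periodic features survive, noise fades."""
    out = signal
    for _ in range(iterations):
        out = autocorrelate(out)
    return out

# A noisy period-4 "scan line" (illustrative values):
line = [1.0, 0.1, -0.9, 0.2, 1.1, -0.2, -1.0, 0.0,
        0.9, 0.15, -1.1, 0.1, 1.0, 0.0, -0.95, 0.05]
result = iterate_autocorrelation(line, iterations=3)
# After a few iterations the peak at lag 4 (one full period) dominates
# every odd lag, making the period-4 structure easy to read off.
print(round(result[4], 3), max(abs(result[i]) for i in range(1, 16, 2)))
```

On a real 2-D speckle image the same operation would be applied to the full image (typically via FFT for speed), but the enhancement mechanism is the one shown here.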


Resumo:

This paper examines the implications of strategic rigidity for technology adoption behaviours among electric utilities. Such behaviours lead to heterogeneity in firm performance and consequently affect the electric utility industry. The paper's central aim is to identify and describe the implications of strategic rigidity for a utility firm's decision making in adopting newer renewable energy technologies. The findings indicate that not all utility firms are keen to adopt these new technologies, as these firms have traditionally been operating efficiently with a more conventional and mature technological arrangement that has become embedded in the organisational routine. Case studies of Iberdrola S.A. and Enel S.p.A. as major electric utilities are detailed to document mergers and acquisitions and technology adoption decisions. The results indicate that technology adoption behaviours vary widely across utility firms with different organisational learning processes and core capabilities.


Resumo:

The goal of the W3C's Media Annotation Working Group (MAWG) is to promote interoperability between multimedia metadata formats on the Web. As experienced by everybody, audiovisual data is omnipresent on today's Web. However, different interaction interfaces and especially diverse metadata formats prevent unified search, access, and navigation. MAWG has addressed this issue by developing an interlingua ontology and an associated API. This article discusses the rationale and core concepts of the ontology and API for media resources. The specifications developed by MAWG enable interoperable contextualized and semantic annotation and search, independent of the source metadata format, and connecting multimedia data to the Linked Data cloud. Some demonstrators of such applications are also presented in this article.


Resumo:

Some verification and validation techniques have been evaluated both theoretically and empirically. Most empirical studies have been conducted without subjects, passing over any effect testers have when they apply the techniques. We have run an experiment with students to evaluate the effectiveness of three verification and validation techniques (equivalence partitioning, branch testing and code reading by stepwise abstraction). We have studied how capable the techniques are of revealing defects in three programs. We have replicated the experiment eight times at different sites. Our results show that equivalence partitioning and branch testing are equally effective and better than code reading by stepwise abstraction. The effectiveness of code reading by stepwise abstraction varies significantly from program to program. Finally, we have identified project contextual variables that should be considered when applying any verification and validation technique or choosing one particular technique.
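As a toy illustration of two of the compared techniques (on an invented function, not the experiment's actual programs): equivalence partitioning derives one test case per input class, while branch testing chooses inputs so that every decision outcome is executed at least once.

```python
def classify_grade(score):
    """Invented function under test."""
    if score < 0 or score > 100:
        raise ValueError("score out of range")
    if score >= 50:
        return "pass"
    return "fail"

# Equivalence partitioning: one representative per input class
# (below range, failing class, passing class, above range).
ep_cases = {-5: ValueError, 30: "fail", 75: "pass", 200: ValueError}

# Branch testing: inputs chosen so both outcomes of each `if`
# are exercised (each guard true once, and both return paths).
bt_cases = {-5: ValueError, 101: ValueError, 50: "pass", 49: "fail"}

def run(cases):
    """Return, per input, whether the observed behaviour matches the expectation."""
    results = {}
    for value, expected in cases.items():
        try:
            results[value] = classify_grade(value) == expected
        except ValueError:
            results[value] = expected is ValueError
    return results

print(run(ep_cases))
print(run(bt_cases))
```

Code reading by stepwise abstraction, the third technique, is a static activity (no test inputs), which is why it has no counterpart in this sketch.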


Resumo:

The analysis of complex nonlinear systems is often carried out using simpler piecewise linear representations of them. A principled and practical technique is proposed to linearize and evaluate arbitrary continuous nonlinear functions using polygonal (continuous piecewise linear) models under the L1 norm. A thorough error analysis is developed to guide an optimal design of two kinds of polygonal approximations in the asymptotic case of a large budget of evaluation subintervals N. The method allows the user to obtain the level of linearization (N) for a target approximation error and vice versa. It is suitable for, but not limited to, an efficient implementation in modern Graphics Processing Units (GPUs), allowing real-time performance of computationally demanding applications. The quality and efficiency of the technique have been measured in detail on two nonlinear functions that are widely used in many areas of scientific computing and are expensive to evaluate.
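A minimal sketch of the idea, assuming uniform knots (the paper's optimal designs are more refined): interpolate the function at N+1 knots to get a polygonal model, then estimate the L1 approximation error numerically. In the asymptotic regime the error of linear interpolation falls as O(1/N^2), so doubling N should roughly quarter the error.

```python
import math

def polygonal(f, a, b, n_segments):
    """Continuous piecewise linear interpolant of f on [a, b] with uniform knots."""
    h = (b - a) / n_segments
    knots = [a + i * h for i in range(n_segments + 1)]
    values = [f(x) for x in knots]

    def approx(x):
        i = min(int((x - a) / h), n_segments - 1)  # clamp at the right endpoint
        t = (x - knots[i]) / h
        return (1 - t) * values[i] + t * values[i + 1]

    return approx

def l1_error(f, g, a, b, samples=10000):
    """Midpoint-rule estimate of the L1 norm of f - g on [a, b]."""
    h = (b - a) / samples
    return sum(abs(f(a + (k + 0.5) * h) - g(a + (k + 0.5) * h))
               for k in range(samples)) * h

# exp stands in for an "expensive" nonlinear function; a table of
# segments replaces it with two multiplies and an add per evaluation.
errors = {n: l1_error(math.exp, polygonal(math.exp, 0.0, 1.0, n), 0.0, 1.0)
          for n in (4, 8, 16)}
for n, err in errors.items():
    print(n, err)
```

On a GPU the segment table would live in constant or texture memory and `approx` would be a fused multiply-add, which is what makes the real-time claim plausible.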


Resumo:

The verification and validation activity plays a fundamental role in improving software quality. Determining which techniques are most effective for carrying out this activity has been an aspiration of experimental software engineering researchers for years. This paper reports a controlled experiment evaluating the effectiveness of two unit testing techniques: the functional testing technique known as equivalence partitioning (EP) and the control-flow structural testing technique known as branch testing (BT). This experiment is a literal replication of Juristo et al. (2013). Both experiments serve the purpose of determining whether the effectiveness of BT and EP varies depending on whether or not the faults are visible to the technique (InScope or OutScope, respectively). We used the materials, design and procedures of the original experiment, but in order to adapt the experiment to our context we: (1) reduced the number of studied techniques from 3 to 2; (2) assigned subjects to experimental groups by stratified randomization to balance the influence of programming experience; (3) localized the experimental materials; and (4) adapted the training duration. We ran the replication at the Escuela Politécnica del Ejército Sede Latacunga (ESPEL) as part of a software verification and validation course. The experimental subjects were 23 master's degree students. EP is more effective than BT at detecting InScope faults. The session/program and group variables are found to have significant effects. BT is more effective than EP at detecting OutScope faults. The session/program and group variables have no effect in this case. The results of the replication and the original experiment are similar with respect to testing techniques. There are some inconsistencies with respect to the group factor, which can be explained by small sample effects.
The results for the session/program factor are inconsistent for InScope faults. We believe that these differences are due to a combination of the fatigue effect and a technique × program interaction. Although we were able to reproduce the main effects, the changes to the design of the original experiment make it impossible to identify the causes of the discrepancies with certainty. We believe that further replications closely resembling the original experiment should be conducted to improve our understanding of the phenomena under study.
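To make the contrast between the two techniques concrete, here is a toy illustration of equivalence partitioning (the abstract's EP), using a hypothetical triangle-classification function that is not from the experiment's materials. EP derives one representative test per input class; branch testing (BT) would instead derive tests forcing each decision outcome in the code:

```python
def classify_triangle(a, b, c):
    """Hypothetical function under test: classify a triangle by side lengths."""
    if a <= 0 or b <= 0 or c <= 0 or a + b <= c or a + c <= b or b + c <= a:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# Equivalence partitioning: one representative input per equivalence class
# of the specification; a fault is "InScope" for EP only if some class
# representative can reveal it.
ep_cases = {
    (3, 4, 5): "scalene",
    (2, 2, 3): "isosceles",
    (5, 5, 5): "equilateral",
    (1, 2, 3): "invalid",   # degenerate: a + b == c
    (-1, 2, 2): "invalid",  # non-positive side length
}
for args, expected in ep_cases.items():
    assert classify_triangle(*args) == expected
```

A BT suite for the same function would be built from its control-flow graph, choosing inputs so that every branch of every `if` is taken at least once, which is exactly the sense in which a given fault may be visible to one technique and not the other.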

Relevância:

100.00% 100.00%

Publicador:

Resumo:

True stress-true strain curves of naturally spun viscid line fibers, retrieved directly from the spiral of orb webs built by Argiope trifasciata spiders, were measured using a novel methodology. This procedure combines a method for removing the aqueous coating of the fibers with a technique that allows accurate measurement of their cross-sectional area. Comparison of the tensile behaviour of different samples indicates that naturally spun viscid lines show a large variability, comparable to that of other silks such as major ampullate gland silk and silkworm silk. Nevertheless, a statistical analysis identified two independent parameters that underlie the variability and characterize the observed range of true stress-true strain curves. Combining this result with previous mechanical and microstructural data suggests that these two independent effects correspond to the degree of alignment of the protein chains and to the local relative humidity, which in turn depends on the composition of the viscous coating and on the external environmental conditions.
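For readers unfamiliar with the quantities being measured: true stress is the load divided by the instantaneous (not initial) cross-sectional area, and true strain is the logarithmic strain. The sketch below applies only these standard definitions; the numbers are hypothetical and the paper's actual coating-removal and cross-section measurement procedure is not reproduced here:

```python
import math

def true_stress_strain(force_N, area_m2, length_m, length0_m):
    """True stress from the instantaneous cross-sectional area,
    true (logarithmic) strain from the instantaneous length."""
    sigma_true = force_N / area_m2            # Pa
    eps_true = math.log(length_m / length0_m) # dimensionless
    return sigma_true, eps_true

# Hypothetical data point: a fiber of 2 um diameter stretched from
# 10 mm to 15 mm while carrying a 50 uN load.
area = math.pi * (1e-6) ** 2                 # instantaneous cross-section, m^2
sigma, eps = true_stress_strain(50e-6, area, 15e-3, 10e-3)
```

Repeating this conversion at each point of a load-elongation record yields the true stress-true strain curve on which the abstract's statistical analysis operates.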

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The aim of the present study was to identify the importance of floorball tactical variables in predicting ball possession effectiveness while controlling for quality of opposition and game period. The sample comprised 1,500 ball possessions from 14 games randomly selected from international championships played in 2008 and 2010 (World Championship, Four Nations Tournament and World Championship qualifying phases) by teams of different competition levels (HIGH, INTERMEDIATE and LOW). The effects of the predictor variables on successful ball possessions in the three game contexts (HIGH vs. HIGH, HIGH vs. LOW and LOW vs. LOW games) were analyzed using binomial logistic regressions. The results showed no interaction with the game period. In HIGH vs. HIGH games, ball possession effectiveness was associated with the ending zone, offensive system, possession duration, height of the shot and defensive pressure prior to the shot. In HIGH vs. LOW games, the important factors were the starting zone, possession duration, defensive pressure prior to the last pass and to the shot, shooting technique and the number of players involved in each ball possession. Finally, in LOW vs. LOW games, the results emphasized the importance of the starting and ending zones, the number of passes used and the shooting technique. In conclusion, elite floorball performance is mainly affected by quality of opposition, with different game patterns in each context that coaches should consider when preparing practices and competitions.
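The statistical model named in the abstract, a binomial logistic regression of possession success on tactical predictors, can be sketched in miniature. The data below are synthetic and the two predictors (possession duration and number of passes) are stand-ins chosen from the variables the abstract mentions, not the study's dataset:

```python
import math
import random

def sigmoid(z):
    """Numerically safe logistic function."""
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def fit_logistic(X, y, lr=0.005, epochs=3000):
    """Fit P(success) = sigmoid(w.x + b) by batch gradient descent."""
    w, b, n = [0.0] * len(X[0]), 0.0, len(X)
    for _ in range(epochs):
        gw, gb = [0.0] * len(w), 0.0
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            for j, xj in enumerate(xi):
                gw[j] += err * xj
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

# Synthetic possessions: [duration in s, number of passes]; in this toy
# model shorter possessions are generated as more often successful.
random.seed(0)
X, y = [], []
for _ in range(200):
    dur = random.uniform(2, 30)
    npass = random.randint(0, 8)
    p_true = sigmoid(2.0 - 0.15 * dur + 0.1 * npass)
    X.append([dur, npass])
    y.append(1 if random.random() < p_true else 0)

w, b = fit_logistic(X, y)
```

The fitted coefficients play the role of the abstract's effect estimates: the sign and size of each weight indicate how a tactical variable shifts the odds of a successful possession within a given game context.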

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The study of the many types of natural and man-made cavities in different parts of the world is important to geology, geophysics, engineering, architecture, agriculture, heritage and landscape studies. Ground-penetrating radar (GPR) is a noninvasive geodetection and geolocation technique suitable for accurately mapping buried structures. The technique requires knowledge of the propagation velocity of electromagnetic waves (EM velocity) in the medium. We propose a method for calibrating the EM velocity by integrating laser imaging detection and ranging (LIDAR) and GPR, with the Global Navigation Satellite System (GNSS) as support for geolocation. Once the EM velocity is known and the GPR profiles have been properly processed and migrated, they also reveal the hidden cavities and old buried structures of the cellar. In this article, we present a complete study of the joint use of the GPR, LIDAR and GNSS techniques in the characterization of cavities. We apply this methodology to underground cavities in a group of wine cellars located in Atauta (Soria, Spain). The results serve to identify the construction elements that form each cavity and group of cavities or cellars. The described methodology could be applied to other shallow underground structures with a surface connection, where LIDAR and GPR profiles can be joined, such as archaeological cavities, sewerage systems and drainpipes.
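The calibration idea rests on two standard GPR relations: in a low-loss medium the EM velocity is v = c / √εr, and a reflector at depth d returns an echo after a two-way travel time t = 2d / v. A minimal sketch of the inversion, with a hypothetical calibration point standing in for a depth known independently (e.g. from the LIDAR model of a cavity):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def em_velocity(eps_r):
    """EM wave velocity in a low-loss medium of relative permittivity eps_r."""
    return C / eps_r ** 0.5

def depth_from_twt(twt_s, v_ms):
    """Reflector depth from the two-way travel time of a GPR echo."""
    return v_ms * twt_s / 2.0

def calibrate_velocity(known_depth_m, twt_s):
    """Invert the two-way relation: a reflector whose depth is known
    independently (e.g. from a LIDAR survey) yields the EM velocity."""
    return 2.0 * known_depth_m / twt_s

# Hypothetical calibration point: a cavity roof 2.0 m deep, echo at 40 ns.
v = calibrate_velocity(2.0, 40e-9)  # 1.0e8 m/s, i.e. 0.1 m/ns
eps_r = (C / v) ** 2                # ~9, a plausible value for damp soil
```

With v calibrated this way, every two-way travel time in the migrated GPR profiles converts directly to depth, which is what allows the radargrams and the LIDAR point cloud to be joined in a common geometry.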
