Abstract:
The present study focuses on single-case data analysis, specifically on two procedures for quantifying differences between baseline and treatment measurements. The first technique tested is based on generalized least squares regression analysis and is compared to a proposed non-regression technique that yields similar information. The comparison is carried out on generated data representing a variety of patterns (i.e., independent measurements, different serial-dependence underlying processes, constant or phase-specific autocorrelation and data variability, different types of trend, and slope and level change). The results suggest that both techniques perform adequately across a wide range of conditions and researchers can use either with reasonable confidence. The regression-based procedure offers more efficient estimates, whereas the proposed non-regression procedure is more sensitive to intervention effects. Considering current and previous findings, some tentative recommendations are offered to help applied researchers choose among the plurality of single-case data analysis techniques.
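A minimal sketch of how such a regression-based phase comparison can be set up, assuming the statsmodels library and AR(1) errors; the simulated series, phase lengths, and effect size are illustrative, not the study's actual conditions:

```python
# Hedged sketch: GLS regression with a phase dummy for single-case data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
baseline = rng.normal(10, 1, 10)      # phase A: 10 baseline measurements
treatment = rng.normal(13, 1, 10)     # phase B: 10 treatment measurements
y = np.concatenate([baseline, treatment])

time = np.arange(len(y))
phase = (time >= 10).astype(float)    # 0 = baseline, 1 = treatment
X = sm.add_constant(np.column_stack([time, phase]))

# GLSAR estimates the regression with AR(1) errors, so serial
# dependence in the measurements is accounted for.
results = sm.GLSAR(y, X, rho=1).iterative_fit(maxiter=10)
print(results.params)                 # [intercept, trend, level change]
```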
Abstract:
The characterization and categorization of coarse aggregates for use in portland cement concrete (PCC) pavements is a highly refined process at the Iowa Department of Transportation. Over the past 10 to 15 years, much effort has been directed at pursuing direct testing schemes to supplement or replace existing physical testing schemes. Direct testing refers to the process of directly measuring the chemical and mineralogical properties of an aggregate and then attempting to correlate those measured properties with historical performance information (i.e., field service record). This is in contrast to indirect measurement techniques, which generally attempt to extrapolate the performance of laboratory test specimens to expected field performance. The purpose of this research project was to investigate and refine the use of direct testing methods, such as X-ray and thermal analysis techniques, to categorize carbonate aggregates for use in portland cement concrete. The results of this study indicated that the general testing methods currently used to obtain data for estimating service life tend to be very reliable and have good to excellent repeatability. Several changes to the current techniques were recommended to enhance the long-term reliability of the carbonate database. These changes can be summarized as follows: (a) More stringent limits need to be set on the maximum particle size in the samples subjected to testing; this should help to improve the reliability of all three of the test methods studied during this project. (b) X-ray diffraction testing needs to be refined to incorporate the use of an internal standard; this will help to minimize the influence of sample positioning errors and will also allow the concentrations of the various minerals present in the samples to be calculated. (c) Thermal analysis data need to be corrected for moisture content and clay content prior to calculating the carbonate content of the sample, as sketched below.
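As an illustration of recommendation (c), the following sketch converts TGA mass losses into a calcite content after excluding the moisture contribution; the temperature windows and sample figures are assumptions, not the project's calibration:

```python
# Hedged sketch: carbonate content from TGA, corrected for moisture.
M_CACO3, M_CO2 = 100.09, 44.01   # molar masses, g/mol

def calcite_fraction(moisture_loss, co2_loss, sample_mass):
    """Mass loss up to ~200 C is treated as moisture and excluded;
    the CO2 loss in the carbonate decomposition window (~600-850 C)
    is scaled by the CaCO3/CO2 molar-mass ratio."""
    dry_mass = sample_mass - moisture_loss
    return (co2_loss * M_CACO3 / M_CO2) / dry_mass

# Example: 20.0 mg sample, 0.4 mg moisture loss, 6.0 mg CO2 loss
print(f"{calcite_fraction(0.4, 6.0, 20.0):.1%} calcite")  # ~69.6 %
```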
Abstract:
The interrelation of curing time, curing temperature, strength, and reactions in lime-bentonite-water mixtures was examined. Samples were molded at constant density and moisture content and then cured for periods of 1 to 56 days at constant temperatures ranging from 5 °C to 60 °C. After the appropriate curing time, the samples were tested for unconfined compressive strength. The broken samples were then analyzed by X-ray diffractometer and spectrophotometer to determine the identity of the reaction products present after each curing period. It was found that the strength gain of lime-clay mixtures cured at different temperatures is due to different phases of the complex reaction lime + clay → CSH(gel) → CSH(II) → CSH(I) → tobermorite. The farther the reaction proceeds, the higher the strength. There was also evidence of lattice substitutions in the structure of the calcium silicate hydrates at curing temperatures of 50 °C and higher. No consistent relationship existed between time, temperature, strength, and the S/A ratio of the reaction products, but in order to achieve high strengths the apparent C/S ratio had to be less than two. The curing temperature had an effect on the strength developed by a given amount of reacted silica in the cured lime-clay mixture, but at a given curing temperature the cured sample with the largest amount of reacted silica gave the highest strength. Evidence was found to indicate that during the clay reaction some calcium is adsorbed onto the clay structure rather than entering into a pozzolanic reaction. Finally, it was determined that the amounts of silica and alumina in lime-clay reaction products can be measured by spectrophotometric analysis with sufficient accuracy for comparison purposes. The spectrophotometric analysis techniques used during the investigation were simple and not time consuming.
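For reference, the apparent C/S ratio mentioned above follows directly from the measured oxide contents (cement chemistry notation: C = CaO, S = SiO2); a short worked example with hypothetical mass fractions:

```python
# Hedged sketch: apparent C/S molar ratio from oxide mass fractions.
M_CAO, M_SIO2 = 56.08, 60.08   # molar masses, g/mol

def cs_ratio(w_cao, w_sio2):
    return (w_cao / M_CAO) / (w_sio2 / M_SIO2)

# e.g. 28 % CaO and 18 % SiO2 measured in the reaction products:
print(cs_ratio(0.28, 0.18))    # ~1.67, below the C/S < 2 threshold
```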
Abstract:
The European Space Agency's Gaia mission will create the largest and most precise three-dimensional chart of our galaxy (the Milky Way) by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalogue will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows the hypercube generation to be easily done within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower-level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e., row or column oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of deploying the framework on a public cloud provider, benchmark it against other popular available solutions (which are not always the best for such ad hoc applications), and describe some user experiences with the framework, which was employed for a number of dedicated workshops on astronomical data analysis techniques.
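A schematic of the map/reduce pattern behind hypercube (multidimensional histogram) generation, written in plain Python rather than against the Hadoop interface the paper abstracts away; the field names and bin widths are hypothetical:

```python
# Hedged sketch: a hypercube as a reduce over per-record histograms.
from collections import Counter
from functools import reduce

def mapper(star):
    """Emit a single count keyed by the binned dimensions of interest."""
    key = (round(star["parallax"], 1), round(star["mag"], 0))
    return Counter({key: 1})

def reducer(h1, h2):
    """Merge partial histograms; Counter addition sums per-key counts."""
    return h1 + h2

stars = [{"parallax": 0.73, "mag": 14.2},
         {"parallax": 0.71, "mag": 14.4},
         {"parallax": 2.10, "mag": 9.8}]
print(reduce(reducer, map(mapper, stars)))
# Counter({(0.7, 14.0): 2, (2.1, 10.0): 1})
```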
Abstract:
The major objective of this research project is to investigate the chemistry and morphology of Portland cement concrete pavements in Iowa. The integrity of the various pavements is being ascertained based on the presence or absence of microcracks, sulfate minerals, and alkali-silica gel(s). Work is also being done on quantifying the air content of the concrete using image analysis techniques, since this often appears to be directly related to the sulfate minerals commonly observed in the pavement cores.
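A hedged sketch of the image-analysis idea: air content estimated as the area fraction of void pixels in a thresholded cross-section image. The threshold and the synthetic image are assumptions, not the project's calibration:

```python
# Hedged sketch: air content as the void-pixel fraction of a section.
import numpy as np

def air_content(gray_image, threshold=200):
    """Fraction of pixels classified as air voids (0-255 grayscale,
    voids assumed contrast-enhanced to appear bright)."""
    return (gray_image >= threshold).mean()

# Synthetic 100x100 "section" with ~6 % bright void pixels:
rng = np.random.default_rng(1)
img = np.where(rng.random((100, 100)) < 0.06, 255, 80).astype(np.uint8)
print(f"estimated air content: {air_content(img):.1%}")
```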
Abstract:
This thesis investigated the applicability of vibration measurements to the predictive condition monitoring of crane gearboxes. The work concentrates mainly on the measurement and analysis aspects of vibration monitoring and, in that respect, serves as a preliminary study for building a vibration-based condition monitoring system. According to the literature, the most common and useful vibration analyses for gearbox condition monitoring are those based on spectrum, cepstrum, and demodulation techniques, and these were tested in damage experiments. The shock pulse method and time-domain indicators are also discussed. The damage experiments yielded information on detecting gear-tooth damage with vibration measurements, whereas more information is still needed on bearing damage. In examining tooth damage, PeakVue analysis proved effective alongside spectrum analysis when a suitable filtering band was used. In addition, the thesis compiles practical experience with vibration measurements on crane gearboxes and gives guidance on performing them. The results show that a vibration-based condition monitoring system can improve the predictive condition monitoring of hoisting gearboxes. Vibration measurements on hoisting gearboxes are therefore definitely worth continuing, and the condition monitoring system is worth developing further. Monitoring travel gearboxes with vibration measurements, by contrast, is difficult because of their unsteady loading conditions.
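A minimal sketch of the cepstrum analysis named above, applied to a synthetic amplitude-modulated gear-mesh signal; the sampling rate, mesh frequency, and fault frequency are invented for illustration:

```python
# Hedged sketch: the real cepstrum flags periodic sideband families.
import numpy as np

def real_cepstrum(x):
    """Inverse FFT of the log magnitude spectrum."""
    return np.fft.irfft(np.log(np.abs(np.fft.rfft(x)) + 1e-12))

fs = 10_000                                    # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
mesh = np.sin(2 * np.pi * 500 * t)             # gear-mesh tone, 500 Hz
fault = 1 + 0.5 * np.sin(2 * np.pi * 25 * t)   # 25 Hz modulation (damaged tooth)
cep = real_cepstrum(mesh * fault)

# The peak near quefrency 1/25 s = 0.04 s reveals the 25 Hz fault family.
print(t[np.argmax(cep[200:800]) + 200])        # -> 0.04
```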
Abstract:
This review presents the evolution of simultaneous multicomponent analysis by absorption spectrophotometry in the ultraviolet and visible regions, covering qualitative and quantitative analysis techniques, optimization methods, applications, and modern trends.
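At its core, simultaneous multicomponent analysis exploits the linearity of Beer's law in concentration, so a mixture spectrum can be resolved by least squares; a brief sketch with made-up molar absorptivities:

```python
# Hedged sketch: two-component resolution from four wavelengths.
import numpy as np

# epsilon[i, j]: absorptivity of component j at wavelength i (L/mol/cm)
epsilon = np.array([[1200.0,  300.0],
                    [ 800.0,  900.0],
                    [ 200.0, 1500.0],
                    [  50.0,  400.0]])
c_true = np.array([2e-4, 5e-4])     # mol/L
A = epsilon @ c_true                # mixture absorbances, 1 cm path

c_est, *_ = np.linalg.lstsq(epsilon, A, rcond=None)
print(c_est)                        # recovers [2e-4, 5e-4]
```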
Abstract:
This study characterizes rice husk ash, a by-product of burning rice husk during rice processing. Because it is rich in silica, this by-product can be an important raw material for the production of siliceous ceramics, such as thermal insulators and refractories. A combination of surface analysis, thermal analysis, and microscopy techniques was used for the characterization. The characterized by-product presented silica in amorphous form as its main component, with a maximum alkali content of around 1%, features that make it potentially interesting for the production of ceramic materials.
Abstract:
A brief discussion of the importance of hydrogen peroxide and of its determination is presented. Considerations of H2O2 as a reagent (separate or combined), its uses, and methods of analysis (techniques, detection limits, linear response intervals, sensor specifications) are emphasized. Several applications are also presented, such as in environmental, pharmaceutical, medical, and food samples.
Abstract:
The concentrations of Cu, Pb, Zn, Cr, Ni, Al, Mn, and Fe in 19 topsoil samples collected in the urban area of Teresina were measured by atomic absorption spectrometry in order to discriminate natural from anthropic contributions and identify possible sources of pollution. The average concentrations of Cu, Zn, Pb, and Cr in the urban soils were 6.11, 8.56, 32.12, and 7.17 mg kg-1, respectively. Statistical analysis techniques, such as principal component analysis (PCA) and hierarchical cluster analysis (HCA), were used to analyze the data. Mn, Ni, and Cr levels were interpreted as natural contributions, whereas Pb, Zn and, in part, Cu were attributed mainly to anthropic activities. High Pb levels were observed in the city's older avenues.
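A brief sketch of the chemometric treatment described above (autoscaling followed by PCA and Ward-linkage HCA) on a made-up concentration matrix; scikit-learn and SciPy are assumed:

```python
# Hedged sketch: PCA scores and hierarchical clusters for soil metals.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
X = rng.lognormal(mean=2.0, sigma=0.5, size=(19, 8))  # 19 samples x 8 metals

Xs = StandardScaler().fit_transform(X)  # autoscale: PCA is scale sensitive
scores = PCA(n_components=2).fit_transform(Xs)

Z = linkage(Xs, method="ward")          # HCA on the same autoscaled data
groups = fcluster(Z, t=3, criterion="maxclust")
print(scores[:3], groups)
```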
Abstract:
This paper addresses different aspects of the application of thermal analysis techniques to the study of energetic materials. The criteria used to choose the most suitable technique, and the proper approach to fitting the experimental data with an appropriate model, are discussed. The paper shows how the various thermal analysis results can be used to help develop new compounds, to study the stability of energetic materials and their compatibility, and to establish the conditions necessary for safe storage.
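As one concrete example of fitting thermal analysis data with a proper model (not necessarily the one used in the paper), the Kissinger relation ln(β/Tp²) = −Ea/(R·Tp) + const can be fitted across DSC runs at several heating rates; the peak temperatures below are invented:

```python
# Hedged sketch: Kissinger fit for an apparent activation energy.
import numpy as np

beta = np.array([2.0, 5.0, 10.0, 20.0])       # heating rates, K/min
Tp = np.array([485.0, 495.0, 503.0, 512.0])   # exotherm peak temperatures, K
R = 8.314                                     # gas constant, J/(mol K)

slope, _ = np.polyfit(1.0 / Tp, np.log(beta / Tp**2), 1)
print(f"Ea ~ {-slope * R / 1000:.0f} kJ/mol") # apparent activation energy
```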
Abstract:
The objective of this Master's thesis (TFM) is to explore the possibilities of the mathematical program MATLAB and its Graphical User Interface Design Environment (GUIDE) tool by developing a program for analyzing images of metallographic specimens, to be used in laboratory sessions of the Materials Technology course of the degree in Mechatronics Engineering taught at the University of Vic. The areas of interest of the work are virtual instrumentation, MATLAB programming, and metallographic image analysis techniques. The report places special emphasis on the design of the interface and of the measurement procedures. The final result is a program that satisfies all the requirements set out in the initial proposal. The program's interface is clear and clean, devoting ample space to the image being analyzed. The structure and layout of the menus and controls make the program easy and intuitive to use. The program is structured so that it can easily be extended with additional measurement routines or with the automation of the existing ones. Since the program works as a measuring instrument, a whole chapter of the report is devoted to the procedure for calculating the errors that arise during its use, in order to know their order of magnitude and to be able to recalculate them should the conditions of use change. Regarding the programming, although MATLAB is not a classical programming environment, it does incorporate tools for building moderately complex applications oriented mainly towards graphics or images. The GUIDE tool simplifies building the user interface, although it has trouble handling somewhat complex designs. Moreover, the code generated by GUIDE is not accessible, which makes it impossible to modify the interface manually in those cases where GUIDE has problems. Despite these small issues, MATLAB's computing power more than compensates for these shortcomings.
Abstract:
Forensic analysis techniques, usually applied in criminal investigation, can also be used in libraries to access digital information stored in obsolete formats or on obsolete storage devices. This article analyzes some examples of forensic analysis departments created by libraries and describes the minimal hardware and software elements required to set up a library unit specialized in forensic analysis. Two possible equipment configurations are introduced, and recommendations are given on how to organize a workflow to recover information stored on old hard drives and diskettes.
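One fixture of any such forensic workflow is verifying a recovered media image by cryptographic hash; a minimal sketch using only the Python standard library (the file name is hypothetical):

```python
# Hedged sketch: fixity check for a recovered disk image.
import hashlib

def sha256_of(path, chunk=1 << 20):
    """Stream the image in 1 MiB chunks so large disks fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# print(sha256_of("floppy_recovered.img"))  # record alongside the image
```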
Abstract:
Automotive gasoline consists of a complex mixture of flammable and volatile hydrocarbons derived from crude oil, with carbon numbers in the range of 4-12 and boiling points in the range of 30-225 °C. Its composition varies with the kind of crude oil and the type of refinery processes it has undergone. Aromatic hydrocarbons, in particular benzene, toluene, ethylbenzene, and the isomeric xylenes (BTEX), are the toxic constituents present. GC-FID was employed to quantify these hydrocarbons in 50 commercial gasoline samples from the state of Piauí. Statistical analysis techniques, such as PCA and HCA, were used to analyze the data, and several validation parameters were evaluated.
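A schematic example of the external-standard calibration underlying GC-FID quantification: peak areas of standards are regressed against concentration, and the fit is then inverted for a sample; all numbers are invented:

```python
# Hedged sketch: external-standard calibration curve for benzene.
import numpy as np

conc = np.array([0.1, 0.5, 1.0, 2.0, 4.0])    # standards, % v/v
area = np.array([1.2e4, 6.1e4, 1.19e5, 2.42e5, 4.81e5])  # peak areas

slope, intercept = np.polyfit(conc, area, 1)
sample_area = 1.55e5
print((sample_area - intercept) / slope)      # ~1.3 % v/v benzene
```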