974 results for Matlab®


Relevance:

10.00%

Publisher:

Abstract:

For this work, a MATLAB program was developed that allows experiments with some of the fundamental tools of technical analysis, focusing specifically on Wilder's Directional Movement Indicator. The program consists of six functions that download data, simulate the indicator, automatically tune some of its parameters, and present the simulation results. The experiments and simulations carried out show the importance of choosing appropriate periods for the ±DIs (positive and negative directional indicators) and the ADX (Average Directional Movement Index). They also show that the decision rules proposed by well-regarded authors such as Cava and Ortiz do not always behave as expected. To improve the performance and reliability of this indicator, we propose including a moving average of prices and of trading volume in the decision criteria; it could also be improved by implementing a system that self-tunes the decision criteria.
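A minimal MATLAB sketch of the ±DI/ADX computation follows, re-implemented from Wilder's published formulas rather than taken from the program described above; input conventions and variable names are illustrative.

```matlab
function [pDI, nDI, adx] = dmi(high, low, close, n)
% Wilder's Directional Movement Indicator (minimal sketch).
% high, low, close: column vectors of equal length; n: smoothing period.
upMove   = [0; diff(high)];
downMove = [0; -diff(low)];
pDM = upMove   .* (upMove   > downMove & upMove   > 0);  % +DM
nDM = downMove .* (downMove > upMove   & downMove > 0);  % -DM
prevClose = [close(1); close(1:end-1)];
tr = max([high - low, abs(high - prevClose), abs(low - prevClose)], [], 2);
alpha = 1/n;                                % Wilder smoothing = EMA with alpha = 1/n
wilder = @(x) filter(alpha, [1, alpha - 1], x, (1 - alpha) * x(1));
pDI = 100 * wilder(pDM) ./ wilder(tr);
nDI = 100 * wilder(nDM) ./ wilder(tr);
dx  = 100 * abs(pDI - nDI) ./ (pDI + nDI + eps);
adx = wilder(dx);
end
```

A decision rule of the kind discussed (e.g. entering long when +DI crosses above -DI while ADX is rising) can then be backtested against the downloaded price series.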

Relevance:

10.00%

Publisher:

Abstract:

The aim of this Master's thesis (TFM) is to explore the possibilities of MATLAB and its Graphical User Interface Development Environment (GUIDE) by developing an image-analysis program for metallographic specimens, to be used in laboratory sessions of the Materials Technology course of the Mechatronics Engineering degree taught at the University of Vic. The areas of interest are virtual instrumentation, MATLAB programming, and metallographic image-analysis techniques. The report places special emphasis on the design of the interface and of the measurement procedures. The final result is a program that satisfies all the requirements set out in the initial proposal. Its interface is clear and uncluttered, devoting most of the screen to the image under analysis, and the structure and layout of the menus and controls make the program easy and intuitive to use. The program is structured so that it can easily be extended with further measurement routines or with automation of the existing ones. Since the program works as a measuring instrument, a full chapter of the report is devoted to the procedure for calculating the errors that arise during its use, in order to know their order of magnitude and to be able to recalculate them if the conditions of use change. Regarding programming, although MATLAB is not a classical programming environment, it does provide tools for building moderately complex applications oriented mainly towards graphics and images. The GUIDE tool simplifies the construction of the user interface, although it has trouble handling somewhat complex designs; moreover, the code generated by GUIDE is not accessible, so the interface cannot be modified by hand in those cases where GUIDE falls short. Despite these minor problems, the computational power of MATLAB more than compensates for such shortcomings.
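As an illustration of the kind of measurement routine such a program might contain (a sketch only: the file name, the dark-grains-on-light-background assumption, and the calibration factor are all invented), a grain-size measurement with the Image Processing Toolbox could look like this:

```matlab
img = imread('specimen.png');               % micrograph, assumed grayscale
bw  = imbinarize(img, graythresh(img));     % global Otsu threshold
bw  = imcomplement(bw);                     % assumes dark grains on a light field
bw  = bwareaopen(bw, 50);                   % drop speckle noise below 50 px
stats = regionprops(bw, 'EquivDiameter');
pxPerMicron = 2.1;                          % hypothetical calibration factor
meanDiam = mean([stats.EquivDiameter]) / pxPerMicron;
fprintf('Mean grain diameter: %.1f um over %d grains\n', meanDiam, numel(stats));
```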

Relevance:

10.00%

Publisher:

Abstract:

Human-machine interaction by voice covers many research areas, notably speech recognition, speech synthesis and identification, speaker verification and identification, and voice-activated (command-driven) robotic systems. Recognizing speech is natural and simple for people, but a complex task for machines, for which various methodologies and techniques exist, among them neural networks. The objective of this work is to develop a MATLAB tool for the recognition and identification of words pronounced by a speaker, from a set of possible words, with good reliability within pre-established margins. The system is speaker-independent: the speaker pronouncing the word has not taken part in training the system. An interface has been designed for acquiring the voice signal and processing it with neural networks and other techniques. By adding a control stage, the system could be used to give commands to a robot such as the Alfa6Uvic or any other device.
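The abstract does not name the network type; a minimal sketch of the word-classification stage with the Neural Network Toolbox, assuming spectral features have already been extracted, might be:

```matlab
% features: D-by-N matrix of per-utterance feature vectors (assumed precomputed)
% labels:   1-by-N vector of word-class indices
net = patternnet(20);                      % one hidden layer, 20 neurons (assumed)
[net, tr] = train(net, features, full(ind2vec(labels)));
pred = vec2ind(net(features(:, tr.testInd)));
acc  = mean(pred == labels(tr.testInd));   % accuracy on the held-out test split
```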

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Since 2005, the FIFA fitness test has been used for football referees as a selection criterion for moving up the refereeing ranks, and every referee trains with this goal in mind. The test was developed on the basis of numerous scientific studies that used video analysis to quantify referees' match activities and analyze their performance during matches. Objectives: The aim of this work was to evaluate referee performance during a football match by means of an accelerometer, chosen for its ease of use, and in particular to assess whether performance capacity decreases over the course of the match as a result of fatigue. In light of the results, we then discuss whether the FIFA interval test is a well-founded means of estimating a referee's physical capacity. Methods: This is a prospective study based on a descriptive analysis. Data were collected in Swiss football stadiums (1st League and above) from 01.12.2011 to 01.12.2012. The study group comprised five male football referees: two officiating in the 1st League and belonging to the talent pool of the Association Cantonale Vaudoise de Football (ACVF), and three in the Super League and Challenge League. The five referees were equipped with an iPhone 3GS® running an application capable of recording their movements on the pitch (standing, walking, running). The data were processed with a Matlab® program developed, like the application itself, by the Laboratory of Movement Analysis and Measurement (LMAM) at EPFL. This work considered the phases and frequencies of standing, walking and running over the course of the match. Results: Over the 90 minutes of a match, time was distributed as follows: 13.74% of total time with no activity measured by the accelerometer, 33.70% running, and the remaining 52.48% walking. As the match progressed, standing phases increased and running time decreased. Effort intensity was highest in the first 15 minutes of the match (about 41.7% running), whereas towards the end of the match walking and running alternated in increasingly brief efforts. The median duration of a walking or running episode was 5-6 seconds, and such episodes rarely exceeded 20 seconds. Discussion: The results show that the accelerometer is an easy-to-use measurement system that saves time in data analysis when evaluating sports performance. The main finding of this study is a decrease in the intensity of the referee's physical activity as the match progresses, resulting either from his own fatigue or from that of the players dictating the pace of the game; it manifests itself as increasingly brief movements over time. The median duration of walking and running episodes (5-6 s) corresponds to aerobic activity for walking and alactic anaerobic activity for running. Consequently, the current FIFA interval test does not seem appropriate, since it targets the lactic anaerobic energy pathway.
Conclusion: This pilot study introduces a new, effective and simple type of instrumentation, never before used to analyze the match activities of football referees. It makes it possible to track movements precisely over the course of a match and sheds new light on the quantification of referee performance, unexplored until now. After analysis of all the parameters, the FIFA test does not appear well matched to the performance demanded by refereeing.
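The LMAM application's classification algorithm is not described in the abstract; a plausible minimal sketch for segmenting the accelerometer signal into stop/walk/run classes (sampling rate, window length and thresholds are all assumptions) is:

```matlab
% a: N-by-3 matrix of accelerations [g], fs: sampling rate [Hz] (both assumed)
fs  = 100;
mag = sqrt(sum(a.^2, 2));                 % acceleration magnitude
act = movstd(mag, round(0.5 * fs));       % 0.5 s sliding activity index
state = zeros(size(act));                 % 0 = stop
state(act > 0.05) = 1;                    % walk (hypothetical threshold)
state(act > 0.30) = 2;                    % run  (hypothetical threshold)
fracRun = mean(state == 2);               % fraction of match time spent running
```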

Relevance:

10.00%

Publisher:

Abstract:

The coverage and volume of geo-referenced datasets are extensive and incessantly growing. The systematic capture of geo-referenced information generates large volumes of spatio-temporal data to be analyzed. Clustering and visualization play a key role in the exploratory data analysis and the extraction of knowledge embedded in these data. However, new challenges in visualization and clustering are posed when dealing with the special characteristics of this data: for instance, its complex structures, large quantity of samples, variables involved in a temporal context, high dimensionality and large variability in cluster shapes. The central aim of my thesis is to propose new algorithms and methodologies for clustering and visualization, in order to assist the knowledge extraction from spatio-temporal geo-referenced data, thus improving decision-making processes. I present two original algorithms, one for clustering, the Fuzzy Growing Hierarchical Self-Organizing Networks (FGHSON), and the second for exploratory visual data analysis, the Tree-structured Self-organizing Maps Component Planes. In addition, I present methodologies that, combined with FGHSON and the Tree-structured SOM Component Planes, allow the integration of space and time seamlessly and simultaneously in order to extract knowledge embedded in a temporal context. The originality of the FGHSON lies in its capability to reflect the underlying structure of a dataset in a hierarchical fuzzy way. A hierarchical fuzzy representation of clusters is crucial when data include complex structures with large variability of cluster shapes, variances, densities and numbers of clusters. The most important characteristics of the FGHSON are: (1) it does not require an a priori setup of the number of clusters; (2) the algorithm executes several self-organizing processes in parallel, so when dealing with large datasets the processes can be distributed, reducing the computational cost; and (3) only three parameters are necessary to set up the algorithm. In the case of the Tree-structured SOM Component Planes, the novelty lies in the ability to create a structure that allows the visual exploratory data analysis of large high-dimensional datasets. This algorithm creates a hierarchical structure of Self-Organizing Map Component Planes, arranging similar variables' projections in the same branches of the tree; hence, similarities in variables' behavior can be easily detected (e.g. local correlations, maximal and minimal values, and outliers). Both FGHSON and the Tree-structured SOM Component Planes were applied to several agroecological problems, proving very efficient in the exploratory analysis and clustering of spatio-temporal datasets. In this thesis I also tested three soft competitive learning algorithms: two well-known unsupervised soft competitive algorithms, namely the Self-Organizing Maps (SOMs) and the Growing Hierarchical Self-Organizing Maps (GHSOMs), and our original contribution, the FGHSON. Although the algorithms presented here have been used in several areas, to my knowledge there is no work applying and comparing the performance of these techniques on spatio-temporal geospatial data, as presented in this thesis. I propose original methodologies to explore spatio-temporal geo-referenced datasets through time. Our approach uses time windows to capture temporal similarities and variations by using the FGHSON clustering algorithm.
The developed methodologies are used in two case studies: in the first, the objective was to find similar agroecozones through time; in the second, to find similar environmental patterns shifted in time. Several results presented in this thesis have led to new contributions to agroecological knowledge, for instance in sugar cane and blackberry production. Finally, in the framework of this thesis we developed several software tools: (1) a Matlab toolbox that implements the FGHSON algorithm, and (2) a program called BIS (Bio-inspired Identification of Similar agroecozones), an interactive graphical user interface tool that integrates the FGHSON algorithm with Google Earth in order to show zones with similar agroecological characteristics.
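The FGHSON algorithm itself is not reproduced in the abstract; the SOM component-plane analysis it builds on can be sketched with the standard toolbox functions (map size and the normalized data matrix X are assumptions):

```matlab
% X: D-by-N matrix, one row per variable, normalized to [0,1] (assumed)
net = selforgmap([10 10]);      % 10-by-10 map, hexagonal topology by default
net = train(net, X);
plotsomplanes(net);             % one component plane per variable; similar
                                % planes indicate locally correlated variables
```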

Relevance:

10.00%

Publisher:

Abstract:

Aim: When planning SIRT using 90Y microspheres, the partition model is used to refine the activity calculated by the body surface area (BSA) method, to potentially improve the safety and efficacy of treatment. For this partition-model dosimetry, accurate determination of the mean tumor-to-normal liver ratio (TNR) is critical, since it directly impacts absorbed dose estimates. This work aimed at developing and assessing a reliable methodology for the calculation of 99mTc-MAA SPECT/CT-derived TNR ratios based on phantom studies. Materials and methods: IQ NEMA (6 hot spheres) and Kyoto liver phantoms with different hot/background activity concentration ratios were imaged on a SPECT/CT (GE Infinia Hawkeye 4). For each reconstruction with the IQ phantom, TNR quantification was assessed in terms of relative recovery coefficients (RC), and image noise was evaluated in terms of the coefficient of variation (COV) in the filled background. RCs were compared using OSEM with Hann, Butterworth and Gaussian filters, as well as FBP reconstruction algorithms. Regarding OSEM, RCs were assessed by varying different parameters independently, such as the number of iterations (i) and subsets (s) and the cut-off frequency of the filter (fc). The influence of the attenuation and scatter corrections was also investigated. Furthermore, 2D-ROI and 3D-VOI contouring were compared; for this purpose, dedicated Matlab® routines were developed in-house for automatic 2D-ROI/3D-VOI determination, to reduce intra-user and intra-slice variability. The best reconstruction parameters and the RCs obtained with the IQ phantom were used to recover corrected TNR for the Kyoto phantom for arbitrary hot-lesion size. In addition, we computed TNR volume histograms to better assess uptake heterogeneity. Results: The highest RCs were obtained with OSEM (i=2, s=10) coupled with the Butterworth filter (fc=0.8). Indeed, we observed a global 20% RC improvement over other OSEM settings and a 50% increase compared to the best FBP reconstruction. In any case, both attenuation and scatter corrections must be applied, improving RC while preserving good image noise (COV<10%). Both 2D-ROI and 3D-VOI analyses lead to similar results; nevertheless, we recommend using 3D-VOIs, since tumor uptake regions are intrinsically three-dimensional. RC-corrected TNR values lie within 17% of the true value, substantially improving the evaluation of small-volume (<15 mL) regions. Conclusions: This study reports the multi-parameter optimization of 99mTc-MAA SPECT/CT image reconstruction for planning 90Y dosimetry in SIRT. In phantoms, accurate quantification of TNR was obtained using OSEM coupled with the Butterworth filter and RC correction.
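The RC correction amounts to dividing the measured tumor concentration by the recovery coefficient interpolated at the lesion volume before forming the ratio; a sketch with invented calibration numbers (none of the values below are the study's):

```matlab
volRC  = [0.5 1 2 5 10 15 25];                  % sphere volumes [mL], illustrative
rcVals = [0.30 0.45 0.60 0.75 0.85 0.90 0.95];  % measured RCs, illustrative
rc = interp1(volRC, rcVals, tumorVol, 'linear', 'extrap');
% Correct the SPECT-measured mean concentration before forming the ratio:
tnrCorr = (tumorConc / rc) / normalLiverConc;
```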

Relevance:

10.00%

Publisher:

Abstract:

Underbody plows can be very useful tools in winter maintenance, especially when compacted snow or hard ice must be removed from the roadway. Through the application of significant down-force and the use of an appropriate cutting edge angle, compacted snow and ice can be removed very effectively by such plows, with much greater efficiency than any other tool under those circumstances. However, the successful operation of an underbody plow requires considerable skill. If too little down pressure is applied to the plow, it will not cut the ice or compacted snow. If too much force is applied, either the cutting edge may gouge the road surface, often causing significant damage to both the road surface and the plow, or the plow may ride up on the cutting edge so that it is no longer controllable by the operator; in such situations the truck can easily spin. Further, excessive down force results in rapid wear of the cutting edge. Given this need for a high level of operator skill, the operation of an underbody plow is a candidate for automation. To successfully automate the operation of an underbody plow, a control system must be developed that follows a set of rules representing appropriate operation of such a plow. These rules have been developed, based upon earlier work in which operational underbody plows were instrumented to determine the loading upon them (both vertical and horizontal) and the angle at which the blade was operating. These rules have been successfully coded into two different computer programs, both using the MatLab® software. In the first program, various load and angle inputs are analyzed to determine when, whether, and how they violate the rules of operation; this program is essentially deterministic in nature. In the second program, the Simulink® package in the MatLab® software system was used to implement these rules using fuzzy logic. Fuzzy logic essentially replaces a fixed and constant rule with one that varies in such a way as to improve operational control. The fuzzy logic in this simulation was implemented simply by using appropriate routines in the computer software, rather than being developed directly. The results of the computer testing and simulation indicate that a fully automated, computer-controlled underbody plow is indeed possible. The question of whether the next steps toward full automation should be taken (and by whom) has also been considered, and the possibility of some sort of joint venture between a Department of Transportation and a vendor has been suggested.
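A minimal sketch of the deterministic rule-checking idea in the first program (the limits and the rule set below are hypothetical, not those derived from the instrumented plows):

```matlab
function flags = checkPlowState(Fv, Fh, angleDeg)
% Deterministic operating-rule check for an underbody plow (illustrative only).
FV_MIN = 5e3;  FV_MAX = 40e3;        % vertical-force window [N], hypothetical
ANG_MIN = 5;   ANG_MAX = 35;         % cutting-edge angle window [deg], hypothetical
flags.tooLittleForce = Fv < FV_MIN;            % blade will not cut
flags.tooMuchForce   = Fv > FV_MAX;            % gouging / ride-up risk
flags.badAngle       = angleDeg < ANG_MIN || angleDeg > ANG_MAX;
flags.highDrag       = Fh > 0.8 * Fv;          % drag ratio limit, hypothetical
end
```

The fuzzy-logic version replaces each hard threshold with overlapping membership functions so control action varies smoothly near the limits.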

Relevance:

10.00%

Publisher:

Abstract:

Volumes of data used in science and industry are growing rapidly. When researchers face the challenge of analyzing them, their format is often the first obstacle: the lack of standardized ways of exploring different data layouts requires an effort each time to solve the problem from scratch. The possibility to access data in a rich, uniform manner, e.g. using Structured Query Language (SQL), would offer expressiveness and user-friendliness. Comma-separated values (CSV) is one of the most common data storage formats; despite its simplicity, handling it becomes non-trivial as file sizes grow. Importing CSVs into existing databases is time-consuming and troublesome, or even impossible if the horizontal dimension reaches thousands of columns. Most databases are optimized for handling large numbers of rows rather than columns, so performance on datasets with non-typical layouts is often unacceptable. Other challenges include schema creation, updates and repeated data imports. To address the above-mentioned problems, I present a system for accessing very large CSV-based datasets by means of SQL. It is characterized by: a "no copy" approach, where data stay mostly in the CSV files; "zero configuration", with no need to specify a database schema; implementation in C++ with boost [1], SQLite [2] and Qt [3], requiring no installation and having a very small size; query rewriting, dynamic creation of indices for appropriate columns and static data retrieval directly from CSV files, ensuring efficient plan execution; effortless support for millions of columns; easy use of mixed text/number data thanks to per-value typing; and a very simple network protocol that provides an efficient interface for MATLAB and reduces implementation time for other languages. The software is available as freeware, along with educational videos, on its website [4]. It needs no prerequisites to run, as all of the libraries are included in the distribution package. I test it against existing database solutions using a battery of benchmarks and discuss the results.
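The wire protocol is described only as "very simple", so any client sketch is speculative; from MATLAB, a text-based session might look like the following (host, port, newline framing and reply format are all assumptions):

```matlab
t = tcpclient('localhost', 12345);       % assumed host and port
query = 'SELECT col1, col1000000 FROM data WHERE col2 > 3.5;';
write(t, uint8([query, newline]));       % assumed newline-terminated framing
pause(1);                                % crude wait for the reply to arrive
reply = char(read(t));                   % reply format assumed to be text
```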

Relevance:

10.00%

Publisher:

Abstract:

The physical disector is the method of choice for estimating unbiased neuron numbers; nevertheless, calibration is needed to evaluate each counting method. The validity of the method can be assessed by comparing the estimated cell number with the true number determined by a direct counting method in serial sections. We reconstructed one fifth of rat lumbar dorsal root ganglia taken from two experimental conditions. From each ganglion, images of 200 adjacent semi-thin sections were used to reconstruct a volumetric dataset (stack of voxels). On these stacks the number of sensory neurons was estimated and counted, respectively, by the physical disector and by direct counting. Also, using the coordinates of nuclei from the direct counting, we simulated with a Matlab program disector pairs separated by increasing distances in a ganglion model. The comparison between the results of these approaches clearly demonstrates that the physical disector method provides a valid and reliable estimate of the number of sensory neurons only when the distance between consecutive disector pairs is 60 µm or smaller; in these conditions the error between the results of the physical disector and direct counting does not exceed 6%. In contrast, when the distance between two pairs is larger than 60 µm (70-200 µm), the error increases rapidly to 27%. We conclude that the physical disector method provides a reliable estimate of the number of rat sensory neurons only when the separating distance between consecutive disector pairs is no larger than 60 µm.
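The spacing simulation can be modeled compactly; a sketch assuming the nuclei z-coordinates from the direct count are available (the disector height and the uniform systematic-sampling model are simplifications):

```matlab
% nucleusZ: z-coordinates [um] of all nuclei from the direct count (assumed)
trueN = numel(nucleusZ);
t = 2;                                   % disector height [um], hypothetical
spacings = 10:10:200;                    % distance between disector pairs [um]
err = zeros(size(spacings));
for k = 1:numel(spacings)
    d = spacings(k);
    counted = 0;
    for p = 0:d:max(nucleusZ)            % reference section of each pair
        counted = counted + sum(nucleusZ >= p & nucleusZ < p + t);
    end
    estN = counted * d / t;              % sampled fraction is t/d
    err(k) = 100 * abs(estN - trueN) / trueN;
end
plot(spacings, err), xlabel('Disector spacing [\mum]'), ylabel('Error [%]')
```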

Relevance:

10.00%

Publisher:

Abstract:

Background: Oscillatory activity, which can be separated into background and oscillatory burst pattern activities, is supposed to be representative of local synchronies of neural assemblies. Oscillatory burst events should consequently play a specific functional role, distinct from background EEG activity, especially in cognitive tasks (e.g. working memory tasks), binding mechanisms and perceptual dynamics (e.g. visual binding), or in clinical contexts (e.g. effects of brain disorders). However, extracting oscillatory events in single trials with a reliable and consistent method is not a simple task. Results: In this work we propose a user-friendly stand-alone toolbox which, in a reasonable time, fits a bump time-frequency model to the wavelet representations of a set of signals. The software is provided with a Matlab toolbox which can compute the wavelet representations before calling the stand-alone application automatically. Conclusion: The tool is publicly available as freeware at: http://www.bsp.brain.riken.jp/bumptoolbox/toolbox_home.html
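The wavelet step that the Matlab toolbox automates can be sketched in a few lines; the burst-candidate threshold below is an illustrative heuristic, not the toolbox's bump model:

```matlab
% x: one single-trial EEG epoch, fs: its sampling rate [Hz] (assumed)
fs = 256;
[wt, f] = cwt(x, fs);                    % wavelet representation (Wavelet Toolbox)
bg = median(abs(wt), 2);                 % per-frequency background level
burstMask = abs(wt) > 3 * bg;            % crude burst candidates, factor 3 assumed
```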

Relevance:

10.00%

Publisher:

Abstract:

The aim of this work was to select an appropriate digital filter for a servo application and to filter the noise from the measurement devices. A low-pass filter attenuates high-frequency noise beyond the specified cut-off frequency. Digital low-pass filters with both IIR and FIR responses were designed and experimentally compared to understand their characteristics from the corresponding step responses of the system. The Kaiser-window and equiripple methods were selected for the FIR response, whereas Butterworth, Chebyshev, inverse Chebyshev and elliptic designs were used for the IIR case. Limitations in digital filter design for a servo system were analyzed, and in particular the dynamic influence of each designed filter on the control stability of the electrical servo drive was observed. The criteria for selecting parameters when designing digital filters for servo systems were studied. Control system dynamics was given significant importance, and the use of FIR and IIR responses in different situations was compared to justify the selection of a suitable response in each case. The software used in the filter design was MatLab/Simulink® and dSPACE's DSP application. A speed-controlled permanent magnet linear synchronous motor was used in the experimental work.
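For example, one IIR and one FIR candidate of the kind compared here can be designed, and their step responses overlaid, in a few lines (orders, cut-off and sampling frequencies are illustrative):

```matlab
fs = 10e3;  fc = 500;                    % assumed sampling / cut-off [Hz]
Wn = fc / (fs/2);                        % normalized cut-off
[b, a] = butter(4, Wn);                  % 4th-order Butterworth IIR
bFir = fir1(40, Wn, kaiser(41, 5));      % 40-tap Kaiser-window FIR
stepz(b, a, 128), hold on, stepz(bFir, 1, 128)
legend('Butterworth IIR', 'Kaiser FIR')
```

The linear-phase FIR adds a pure delay, whereas the IIR's nonlinear phase can distort the loop dynamics, which is one reason the step responses matter for servo stability.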

Relevance:

10.00%

Publisher:

Abstract:

It is generally known that the surface geometry of a weld affects the fatigue strength of a structure. Developing a fast, inexpensive and reliable method for measuring the surface geometry is a step towards more accurate and more dependable fatigue assessment. In this work, welds were studied whose surface geometry had been measured with a structured-light method developed by SINTEF of Norway. As part of the work, a MatLab-based program was developed for post-processing the x-y-z measurement points obtained from the measurement. From the post-processed data, the weld toe radius, flank angle, throat thickness (a-dimension), undercut and leg-length ratio can be determined. Using the developed method, nearly 300 welds of non-load-carrying cruciform joints were measured, and the results were compared with measurements made on polished cross-sections of corresponding specimens. Manual measurement on the cross-sections was found to be more accurate and able to detect more local shapes. Reflections occurring in the structured-light measurements were reduced by coating the measured surface with matte white paint; the measurement accuracy of the structured-light method was about 0.2 mm. Based on the measured weld toe radius and flank angle, a simple formula can be used to calculate the stress concentration of the weld and thus obtain an initial estimate of fatigue strength. Other factors are also known to affect the fatigue strength of a weld, so estimates based on the toe radius and flank angle alone are not absolutely correct. This was observed in the fatigue tests, in one of which the fatigue failure did not occur even in the region of the largest stress concentration.
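The weld toe radius extraction from the x-y-z points can be sketched with an algebraic least-squares circle fit (a generic Kasa fit over points picked near the toe; not necessarily the method of the program described):

```matlab
function [r, xc, zc] = toeRadius(x, z)
% Least-squares (Kasa) circle fit to profile points near the weld toe.
% x, z: column vectors of coordinates in the section plane (assumed preselected).
A = [x, z, ones(size(x))];
b = -(x.^2 + z.^2);
c = A \ b;                     % fits x^2 + z^2 + c1*x + c2*z + c3 = 0
xc = -c(1)/2;
zc = -c(2)/2;
r  = sqrt(xc^2 + zc^2 - c(3)); % toe radius
end
```

With the fitted radius and the measured flank angle, published parametric formulas give the stress concentration factor mentioned above.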

Relevance:

10.00%

Publisher:

Abstract:

The aim of this work was to produce a multibody dynamics simulation program for the Matlab environment that accounts for structural flexibility. Flexibility was modeled with the floating frame of reference formulation, and the deformation modes were solved with the finite element method. With the program, spatial mechanisms consisting of flexible bodies can be assembled and their dynamic behavior studied. The simulation results were compared with those produced by commercial software. It was found that the floating frame of reference formulation is suitable for real-time simulation, and the results of the program implemented in this work matched those of the commercial simulation software.
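A full floating frame of reference formulation couples rigid-body and modal coordinates through inertia terms; as a deliberately reduced sketch of the simulation loop only (one rigid and one modal coordinate, coupling omitted, all parameters invented):

```matlab
m = 10; mq = 1; k = 1e4; zeta = 0.02;    % invented body and modal parameters
F = @(t) 100 * sin(10 * t);              % external force [N]
ode = @(t, y) [ y(2);
                F(t) / m;                                        % rigid acceleration
                y(4);
                (F(t) - k*y(3) - 2*zeta*sqrt(k*mq)*y(4)) / mq ]; % modal equation
[t, y] = ode45(ode, [0 2], zeros(4, 1));
plot(t, y(:,1) + y(:,3))                 % point response = rigid + flexible part
```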

Relevance:

10.00%

Publisher:

Abstract:

In pressure screening, impurities are removed from pulp, and in the design of pressure screens it is important to understand the phenomena occurring inside the screen. The aim of this work was to develop an imaging-based measurement system for measuring fiber motion; the quantities of interest are the velocities of the fibers and impurities in the pulp suspension. With the double exposure used in the imaging, the velocities of fibers and debris can be measured. A system was developed for measuring velocities from the images, and the possibility of automating the measurement was investigated. To detect individual fibers in the pulp, fibers brightened with an optical brightening agent were imaged under UV light; fibers were also dyed black and imaged in visible light. Two stroboscopes were used for the double exposure, and the process was imaged with an externally triggered camera. A borescope was used to bring the image to the camera and to illuminate the target. A computer program was written for processing the images and measuring the velocities. The light-gathering power of the borescope was not sufficient for the imaging, but otherwise the setup was found to work. The velocities of fibers and debris could be calculated with the program from images taken without the borescope. Automating the acquisition of measurement data appears feasible with modifications to the imaging equipment.
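Velocity extraction from the image pairs can be sketched with normalized cross-correlation; this assumes two separate frames f1 and f2 rather than a single doubly exposed image, and all calibration values are invented:

```matlab
dt = 1e-3;  pxToMm = 0.02;               % strobe interval [s] / scale, assumed
win = f1(201:264, 201:264);              % 64x64 interrogation window from frame 1
cc  = normxcorr2(win, f2);               % Image Processing Toolbox
[~, imax] = max(cc(:));
[pkR, pkC] = ind2sub(size(cc), imax);
dy = pkR - (200 + size(win, 1));         % row shift of the correlation peak
dx = pkC - (200 + size(win, 2));         % column shift
speed = hypot(dx, dy) * pxToMm / dt;     % fiber speed [mm/s]
```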