960 results for MATLAB® toolbox


Relevance:

10.00%

Publisher:

Abstract:

The aim of this project is to develop a Matlab-based application to calculate the radioelectrical coverage by surface wave of broadcast radio stations in the Medium Wave (MW) band anywhere in the world. Also, given the locations of a transmitting and a receiving station, the software should be able to calculate the electric field strength expected at the receiving site. In the case of several transmitters, the program should check for the existence of inter-symbol interference and calculate the field strength accordingly. The application asks for the configuration parameters of the transmitting station through a Graphical User Interface (GUI) and displays the resulting coverage over a map of the area under study. For the development of this project, several conductivity databases of different countries and a high-resolution elevation database (GLOBE) have been used. The field strength due to ground-wave propagation is calculated with the ITU GRWAVE program, which is integrated into a Matlab interface so that it can be used by the application.
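
As a hint of how the application might combine ground-wave results from several transmitters, the following MATLAB sketch interpolates a field-strength-versus-distance curve and sums the contributions on a power basis. The inline curve, the 1 kW normalisation and the distances are illustrative assumptions only, not GRWAVE output or code from the project.

% Minimal sketch: interpolate ground-wave field strength at a receiver and
% combine several transmitters on a power basis. The curve below is purely
% illustrative and stands in for a table exported from GRWAVE (1 kW).
curve = [10 80; 20 72; 50 60; 100 48; 200 34];   % [distance_km, E in dB(uV/m)]
d_km  = [12.5 48.0 95.3];                        % distance from each transmitter, km
P_kW  = [10 50 5];                               % transmitter powers, kW

% Field of each transmitter at the receiver, corrected for power (E ~ sqrt(P))
E_dB  = interp1(curve(:,1), curve(:,2), d_km, 'linear') + 10*log10(P_kW);

% Combine on a power (field-squared) basis and convert back to dB(uV/m)
E_tot = 20*log10(sqrt(sum((10.^(E_dB/20)).^2)));
fprintf('Combined field strength: %.1f dB(uV/m)\n', E_tot);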

Relevance:

10.00%

Publisher:

Abstract:

We present an environment for analysing signals of any kind with LDB (Local Discriminant Bases) and MLDB (Modified Local Discriminant Bases). This environment uses functions developed as part of a doctoral thesis that is still in progress. Understanding some of these functions requires an advanced knowledge of signal processing. They have been drawn from the work of Naoki Saito [3], which was taken as the starting point for the algorithm of the as-yet-unfinished doctoral thesis of Jose Antonio Soria. The interface developed accepts the incorporation of new packages and functions. A menu has been prepared to integrate the Sine IV packet transform and the Cosine IV packet transform, although others can also be added. The application consists of two interfaces, a Wizard and a main interface. We have also created a window for importing and exporting the desired variables to different environments. To build this application, all the window elements were programmed by hand instead of using MATLAB's GUIDE (Graphical User Interface Development Environment), so that the application remains compatible across the different versions of the program. In total we wrote 73 functions for the main interface (10 of which belong to the import/export window) and 23 for the Wizard. In this report we only describe 6 of these functions, plus the 3 that create the interfaces, so as not to make it excessively long. The functions described are the most important ones, either because they are used frequently, because they are the most complex according to McCabe complexity, or because they are needed for the signal processing. Every piece of data entered by the user is passed through functions that detect input errors, such as removing zeros or characters that are not numbers, and that check that values are integers and lie within the maximum and minimum limits that apply.
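
The input-checking step mentioned at the end of the abstract could look roughly like the MATLAB sketch below; the function name, the cleaning rule and the clamping to the allowed range are illustrative assumptions, not the actual thesis code.

% Minimal sketch of a user-input validation helper (e.g. saved as
% checkIntegerInput.m). Example use: n = checkIntegerInput(get(hEdit, 'String'), 1, 12);
function val = checkIntegerInput(str, lo, hi)
    cleaned = regexprep(str, '[^0-9-]', '');   % drop anything that is not a digit or sign
    val = str2double(cleaned);
    if isnan(val) || val ~= round(val)
        error('Input must be an integer.');
    end
    val = min(max(val, lo), hi);               % clamp to the allowed range
end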

Relevance:

10.00%

Publisher:

Abstract:

One of the main problems when analysing contours is the large amount of data involved in describing the figure. To address this, parameterisation is applied: representative data are obtained from a contour with as few coefficients as possible, from which the contour can be reconstructed again without very noticeable loss of information. For closed contours, the most studied parameterisation is the discrete Fourier transform (DFT), applied to the sequences of values describing the behaviour of the x and y coordinates along all the points of the trace. In contrast, the DFT cannot be applied directly to open contours, because it requires the x and y values to be the same at the first and the last point of the contour. The reason is that the DFT represents periodic signals without error; if the signals do not end at the same point, there is a discontinuity and oscillations appear in the reconstruction. The aim of this work is to parameterise open contours with the same efficiency obtained when parameterising closed contours. To do so, a program has been designed that allows the DFT to be applied to open contours by modifying the x and y sequences. In addition, other applications developed with Matlab illustrate different aspects of the parameterisation and of how the Elliptic Fourier Descriptors (EFD) behave. The results obtained show that the application designed allows open contours to be parameterised with optimal compression, which will facilitate the quantitative analysis of shapes in fields such as ecology, medicine and geography, among others.
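
A minimal MATLAB sketch of the underlying idea, assuming the open contour is made periodic by mirroring it before the DFT is applied; the mirroring strategy, the number of harmonics kept and the example half-ellipse are illustrative choices, not necessarily those of the program described above.

% Mirror an open contour so that x(t) and y(t) become periodic, keep only the
% first K harmonics of each coordinate and reconstruct the trace.
t = linspace(0, pi, 200).';                % example open contour: half an ellipse
x = 3*cos(t);  y = 2*sin(t);

K  = 10;                                   % harmonics kept per coordinate
xp = [x; flipud(x)];  yp = [y; flipud(y)]; % mirrored sequences start and end alike
N  = numel(xp);
keep = false(N, 1);
keep([1:K+1, N-K+1:N]) = true;             % DC term plus +/-K harmonics
xr = real(ifft(fft(xp) .* keep));
yr = real(ifft(fft(yp) .* keep));
xr = xr(1:numel(x));  yr = yr(1:numel(y)); % discard the mirrored half
plot(x, y, 'k.', xr, yr, 'r-');            % original versus reconstruction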

Relevance:

10.00%

Publisher:

Abstract:

For this project a Matlab program has been developed that allows experiments with some of the fundamental tools of technical analysis. Specifically, we have focused on Wilder's Directional Movement Indicator. The program is made up of six functions that download data, simulate the indicator, automatically tune some of its parameters and present the simulation results. The experiments and simulations carried out show the importance of choosing appropriate periods for the ±DI (positive and negative directional indicators) and the ADX (Average Directional Movement Index). We have also seen that the decision rules suggested by renowned authors such as Cava and Ortiz do not always behave as expected. To improve the performance and reliability of this indicator, we propose including a moving average of prices and of trading volume in the decision criteria. It could also be improved by implementing a system that allows the decision criteria to tune themselves automatically.
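
For reference, here is a minimal MATLAB sketch of the textbook +DI/-DI/ADX computation on synthetic prices; the Wilder smoothing via filter and the period n = 14 are standard choices, not necessarily the exact implementation used in the project.

% Wilder's Directional Movement Indicator on synthetic high/low/close series.
rng(0);
C = 100 + cumsum(randn(500, 1));           % closes
H = C + rand(500, 1);  L = C - rand(500, 1);
n = 14;

upMove   = [0; diff(H)];
downMove = [0; -diff(L)];
plusDM   = upMove   .* (upMove   > downMove & upMove   > 0);
minusDM  = downMove .* (downMove > upMove   & downMove > 0);
prevC    = [C(1); C(1:end-1)];
TR = max([H - L, abs(H - prevC), abs(L - prevC)], [], 2);

wilder  = @(v) filter(1/n, [1, -(1 - 1/n)], v, v(1)*(1 - 1/n));  % Wilder smoothing
plusDI  = 100 * wilder(plusDM)  ./ wilder(TR);
minusDI = 100 * wilder(minusDM) ./ wilder(TR);
ADX     = wilder(100 * abs(plusDI - minusDI) ./ max(plusDI + minusDI, eps));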

Relevance:

10.00%

Publisher:

Abstract:

The aim of this Master's thesis (TFM) is to explore the possibilities of the mathematical program MATLAB and its Graphical User Interface Development Environment (GUIDE) by developing an image-analysis program for metallographic specimens, to be used in laboratory sessions of the Materials Technology course of the Bachelor's degree in Mechatronics Engineering taught at the University of Vic. The areas of interest of the work are Virtual Instrumentation, MATLAB programming and metallographic image-analysis techniques. The report places special emphasis on the design of the interface and of the measurement procedures. The end result is a program that satisfies all the requirements set out in the initial proposal. The program interface is clear and clean, devoting most of the space to the image being analysed. The structure and layout of the menus and controls make the program easy and intuitive to use. The program has been structured so that it can easily be extended with other measurement routines, or with the automation of the existing ones. Since the program behaves as a measuring instrument, a whole chapter of the report is devoted to the procedure for calculating the errors that arise during its use, in order to know their order of magnitude and to be able to recompute them if the conditions of use change. As far as programming is concerned, although MATLAB is not a classical programming environment, it does provide tools for building applications of moderate complexity, oriented mainly towards graphics or images. The GUIDE tool simplifies the construction of the user interface, although it has problems handling somewhat complex designs. On the other hand, the code generated by GUIDE is not accessible, which prevents modifying the interface manually in those cases where GUIDE has problems. Despite these minor issues, MATLAB's computing power more than makes up for these shortcomings.
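
As an example of the kind of measurement routine such a program might contain, the sketch below estimates the area fraction of the bright phase in a grey-level image using the Image Processing Toolbox; the demo image coins.png stands in for a micrograph, and no spatial calibration or error analysis is included.

% Area fraction of the bright phase via Otsu thresholding.
img = imread('coins.png');                 % toolbox demo image as a stand-in
if size(img, 3) == 3
    img = rgb2gray(img);
end
bw   = imbinarize(img, graythresh(img));   % Otsu threshold
frac = nnz(bw) / numel(bw);                % fraction of bright-phase pixels
fprintf('Bright-phase area fraction: %.1f %%\n', 100*frac);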

Relevance:

10.00%

Publisher:

Abstract:

Human-machine interaction through voice covers many research areas, notably speech recognition, speech synthesis and identification, speaker verification and identification, and voice-activated (command) control of robotic systems. Recognising speech is natural and simple for people, but it is a complex task for machines, for which several methodologies and techniques exist, among them Neural Networks. The aim of this work is to develop a Matlab tool for the recognition and identification of words spoken by a speaker, out of a set of possible words, with good reliability within pre-established margins. The system is speaker-independent, i.e. the speaker pronouncing the word has not taken part in training the system. An interface has been designed that allows the voice signal to be acquired and processed using neural networks and other techniques. By adding a control stage, the system could be used to give commands to a robot such as the Alfa6Uvic or to any other device.
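
A minimal sketch of the recognition stage, assuming average log-spectrum features and a small patternnet classifier (Signal Processing and Deep Learning Toolboxes); the synthetic "words", the feature choice and the network size are placeholders, not the system described above.

% Train a small pattern-recognition network on crude spectral features.
fs = 8000;  frameLen = 256;
feat = @(s) mean(log(abs(spectrogram(s, hamming(frameLen), frameLen/2, frameLen)) + eps), 2);

nWords = 3;  nPerWord = 10;
X = [];  T = [];
for k = 1:nWords
    for r = 1:nPerWord
        s = filter(1, [1, -0.95*cos(0.3*k)], randn(fs, 1));  % word-like coloured noise
        X = [X, feat(s)];                   %#ok<AGROW>  features x examples
        T = [T, double((1:nWords)' == k)];  %#ok<AGROW>  one-hot labels
    end
end

net = patternnet(20);                       % one hidden layer of 20 neurons
net = train(net, X, T);
[~, recognised] = max(net(feat(filter(1, [1, -0.95*cos(0.3*2)], randn(fs, 1)))));
fprintf('Recognised word index: %d\n', recognised);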

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Since 2005, the "FIFA Test" has been used for football referees as a selection criterion for moving up the refereeing ranks, and every referee bases his training on this objective. The test was developed on the basis of numerous scientific studies that used video analysis to quantify referees' match activities and analyse their performance during matches. Objectives: The goal of this work was to evaluate referee performance during a football match using an accelerometer, chosen for its ease of use, and in particular to assess whether performance capacity declines during the match as a result of fatigue. Finally, in light of the results, we discuss whether the "interval test proposed by FIFA" is a well-founded way of estimating a referee's physical capacity. Method: This is a prospective study based on a descriptive analysis. Data were collected in Swiss football stadiums (1st League and above) from 01.12.2011 to 01.12.2012. The group studied consisted of five male football referees, two officiating in the 1st League and belonging to the talent pool of the Association Cantonale Vaudoise de Football (ACVF), and three in the Super League and Challenge League. The five referees were equipped with an iPhone 3GS® running an application capable of recording their movements on the pitch (standing still, walking and running). The data were processed with a Matlab® program developed by the Laboratoire des Mesures d'Analyse du Mouvement (LMAM) at EPFL, which also developed the application itself. For this work, the phases and frequencies of standing, walking and running were considered throughout the course of the match. Results: Over the 90 minutes of the match, time was distributed as follows: for 13.74% of the total time the accelerometer measured no activity, 33.70% corresponded to running and the remaining 52.48% to walking. As the match progressed, the standing phases increased and the running time decreased. A higher intensity of effort was observed during the first 15 minutes of the match (about 41.7% running), whereas towards the end of the match walking and running alternated with increasingly short efforts. The median duration of the different efforts showed that a walking or running episode lasted 5-6 seconds; moreover, walking or running episodes rarely exceeded 20 seconds. Discussion: The results show that the accelerometer is an easy-to-use measurement system that saves time in analysing the data used to assess athletic performance. The main results of this study show a decrease in the intensity of the referee's physical activity as the match progresses, resulting either from his own fatigue or from that of the players dictating the pace of the game. This decrease translates into increasingly brief movements over time. The measured median walking and running times (5-6 s) correspond to aerobic activity for walking and to alactic anaerobic activity for running. Consequently, the current FIFA "interval test" does not seem adequate to us, because it draws on the lactic anaerobic energy system.
Conclusion: This pilot study introduces a new, effective and simple type of instrumentation, never used before to analyse the match activities of football referees. It makes it possible to explore movements precisely over the course of the match and sheds new light on the quantification of referee performance, not explored until now. After analysing all the parameters, the FIFA test does not appear to be suited to the performance required by refereeing.
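
The accelerometer post-processing described above could be sketched in MATLAB as follows; the sampling rate, the stop/walk/run thresholds and the synthetic signal are placeholders, not the LMAM application or its parameters.

% Classify samples as stop/walk/run from the smoothed acceleration magnitude,
% then compute the time share and median episode duration of each phase.
fs   = 50;  durS = 90*60;                           % 50 Hz, 90-minute match
env  = 1 + 2*abs(sin(2*pi*(1:durS*fs)'/(fs*60)));   % slowly varying intensity
acc  = env .* randn(durS*fs, 3);                    % stand-in tri-axial recording
magS = movmean(sqrt(sum(acc.^2, 2)), fs);           % 1-s smoothed magnitude

phase = ones(size(magS));                           % 1 = stop
phase(magS > 2.5) = 2;                              % 2 = walk (threshold assumed)
phase(magS > 4.0) = 3;                              % 3 = run  (threshold assumed)
share = arrayfun(@(p) 100*mean(phase == p), 1:3);   % % of match time per phase

edges  = [1; find(diff(phase) ~= 0) + 1; numel(phase) + 1];
epiLen = diff(edges) / fs;                          % episode durations, s
epiLab = phase(edges(1:end-1));
medDur = arrayfun(@(p) median(epiLen(epiLab == p)), 1:3);
fprintf('time share (%%): %.1f %.1f %.1f | median episode (s): %.1f %.1f %.1f\n', ...
        share, medDur);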

Relevance:

10.00%

Publisher:

Abstract:

Aim: When planning SIRT using 90Y microspheres, the partition model is used to refine the activity calculated by the body surface area (BSA) method, to potentially improve the safety and efficacy of treatment. For this partition model dosimetry, accurate determination of the mean tumor-to-normal liver ratio (TNR) is critical, since it directly impacts absorbed dose estimates. This work aimed at developing and assessing a reliable methodology for the calculation of 99mTc-MAA SPECT/CT-derived TNR ratios based on phantom studies. Materials and methods: IQ NEMA (6 hot spheres) and Kyoto liver phantoms with different hot/background activity concentration ratios were imaged on a SPECT/CT (GE Infinia Hawkeye 4). For each reconstruction with the IQ phantom, TNR quantification was assessed in terms of relative recovery coefficients (RC) and image noise was evaluated in terms of the coefficient of variation (COV) in the filled background. RCs were compared using OSEM with Hann, Butterworth and Gaussian filters, as well as FBP reconstruction algorithms. Regarding OSEM, RCs were assessed by varying different parameters independently, such as the number of iterations (i) and subsets (s) and the cut-off frequency of the filter (fc). The influence of the attenuation and scatter corrections was also investigated. Furthermore, 2D-ROI and 3D-VOI contouring were compared. For this purpose, dedicated Matlab® routines were developed in-house for automatic 2D-ROI/3D-VOI determination, to reduce intra-user and intra-slice variability. The best reconstruction parameters and RCs obtained with the IQ phantom were used to recover corrected TNR for the Kyoto phantom for arbitrary hot-lesion sizes. In addition, we computed TNR volume histograms to better assess uptake heterogeneity. Results: The highest RCs were obtained with OSEM (i=2, s=10) coupled with the Butterworth filter (fc=0.8). Indeed, we observed a global 20% RC improvement over other OSEM settings and a 50% increase as compared to the best FBP reconstruction. In any case, both attenuation and scatter corrections must be applied, thus improving RC while preserving good image noise (COV<10%). Both 2D-ROI and 3D-VOI analyses led to similar results. Nevertheless, we recommend using 3D-VOIs, since tumor uptake regions are intrinsically 3D. RC-corrected TNR values lie within 17% of the true value, substantially improving the evaluation of small-volume (<15 mL) regions. Conclusions: This study reports the multi-parameter optimization of 99mTc-MAA SPECT/CT image reconstruction in planning 90Y dosimetry for SIRT. In phantoms, accurate quantification of TNR was obtained using OSEM coupled with the Butterworth filter and RC correction.
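
The two image-quality figures used in the phantom analysis can be written down compactly; the sketch below computes a relative recovery coefficient and the background COV on a synthetic blurred volume, with geometry, blur and concentrations chosen arbitrarily rather than taken from the NEMA IQ phantom (Image Processing Toolbox for imgaussfilt3).

% Relative recovery coefficient (RC) and background coefficient of variation.
[x, y, z] = ndgrid(1:64, 1:64, 1:64);
sphereVOI = (x-32).^2 + (y-32).^2 + (z-32).^2 <= 6^2;    % "hot" sphere
bkgVOI    = ~sphereVOI & z > 8 & z < 56;                 % crude background region
trueHot = 4;  trueBkg = 1;                               % known concentrations
img = imgaussfilt3(trueBkg + (trueHot - trueBkg)*sphereVOI, 3) + 0.05*randn(64, 64, 64);

measHot = mean(img(sphereVOI));
measBkg = mean(img(bkgVOI));
RC  = (measHot/measBkg) / (trueHot/trueBkg);   % relative recovery coefficient
COV = std(img(bkgVOI)) / measBkg;              % background noise level
fprintf('RC = %.2f, COV = %.1f %%\n', RC, 100*COV);
% An RC-corrected tumour-to-normal-liver ratio is then TNR_measured / RC.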

Relevance:

10.00%

Publisher:

Abstract:

Underbody plows can be very useful tools in winter maintenance, especially when compacted snow or hard ice must be removed from the roadway. With the application of significant down-force and the use of an appropriate cutting edge angle, compacted snow and ice can be removed very effectively by such plows, with much greater efficiency than any other tool under those circumstances. However, the successful operation of an underbody plow requires considerable skill. If too little down pressure is applied to the plow, it will not cut the ice or compacted snow. If too much force is applied, either the cutting edge may gouge the road surface, causing significant damage, often to both the road surface and the plow, or the plow may ride up on the cutting edge so that it is no longer controllable by the operator; in such situations the truck can easily spin. Further, excessive down force results in rapid wear of the cutting edge. Given this need for a high level of operator skill, the operation of an underbody plow is a candidate for automation. In order to successfully automate the operation of an underbody plow, a control system must be developed that follows a set of rules representing appropriate operation of such a plow. These rules have been developed based upon earlier work in which operational underbody plows were instrumented to determine the loading upon them (both vertical and horizontal) and the angle at which the blade was operating. These rules have been successfully coded into two different computer programs, both using the MATLAB® software. In the first program, various load and angle inputs are analyzed to determine when, whether, and how they violate the rules of operation. This program is essentially deterministic in nature. In the second program, the Simulink® package in the MATLAB® software system was used to implement these rules using fuzzy logic. Fuzzy logic essentially replaces a fixed and constant rule with one that varies in such a way as to improve operational control. The fuzzy logic in this simulation was implemented simply by using the appropriate routines in the software, rather than being developed directly. The results of the computer testing and simulation indicate that a fully automated, computer-controlled underbody plow is indeed possible. The issue of whether the next steps toward full automation should be taken (and by whom) has also been considered, and the possibility of some sort of joint venture between a Department of Transportation and a vendor has been suggested.
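
A minimal sketch of the first, deterministic program: checking sampled down-force and blade-angle values against a set of operating rules. The numeric limits and the messages are placeholders, not the rules derived from the instrumented plows.

% Deterministic rule check for underbody plow operation (illustrative limits).
minLoad  = 5;       % kN  - below this the edge will not cut (assumed)
maxLoad  = 40;      % kN  - above this gouging / loss of control is likely (assumed)
angleLim = [5 35];  % deg - acceptable cutting-edge angle window (assumed)

load_kN  = [3 12 25 44 18];      % example vertical load samples
angleDeg = [10 20 38 15 22];     % example blade angle samples

for k = 1:numel(load_kN)
    if load_kN(k) < minLoad
        fprintf('Sample %d: down-force too low, edge will not cut.\n', k);
    elseif load_kN(k) > maxLoad || angleDeg(k) < angleLim(1) || angleDeg(k) > angleLim(2)
        fprintf('Sample %d: rule violated - reduce down-force or correct blade angle.\n', k);
    else
        fprintf('Sample %d: within operating rules.\n', k);
    end
end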

Relevance:

10.00%

Publisher:

Abstract:

The main goal of the research described in this report was to evaluate countermeasures that agencies can use to reduce speeds as drivers enter rural communities located on high-speed roadways. The objectives of this study were as follows:
* Identify and summarize countermeasures used to manage speeds in transition zones
* Demonstrate the effectiveness of countermeasures that are practical for high- to low-speed transition zones
* Acquire additional information about countermeasures that may show promise but lack sufficient evidence of effectiveness
* Develop an application toolbox to assist small communities in selecting appropriate transition zones and effective countermeasures for entrances to small rural communities
The team solicited small communities that were interested in participating in the Phase II study, and several communities were also recommended. The treatments evaluated were selected by carefully considering traffic-calming treatments that have been used effectively in other countries for small rural communities, as well as the information gained from the first phase of the project. The treatments evaluated are as follows:
* Transverse speed bars
* Colored entrance treatment
* Temporary island
* Radar-activated speed limit sign
* Speed feedback sign
The toolbox publication and four focused tech briefs also cover the results of this work.

Relevance:

10.00%

Publisher:

Abstract:

The main objective of this synthesis was to identify and summarize how agencies collect, analyze, and report different work-zone traffic-performance measures, which include exposure, mobility, and safety measures. The researchers also examined communicating performance to the public. This toolbox provides knowledge to help state departments of transportation (DOTs), as well as counties and cities, to better address reporting of work-zone performance.

Relevance:

10.00%

Publisher:

Abstract:

Volumes of data used in science and industry are growing rapidly. When researchers face the challenge of analyzing them, their format is often the first obstacle: the lack of standardized ways of exploring different data layouts requires an effort to solve the problem from scratch each time. The possibility of accessing data in a rich, uniform manner, e.g. using the Structured Query Language (SQL), would offer expressiveness and user-friendliness. Comma-separated values (CSV) files are one of the most common data storage formats. Despite their simplicity, handling them becomes non-trivial as file size grows. Importing CSVs into existing databases is time-consuming and troublesome, or even impossible if their horizontal dimension reaches thousands of columns. Most databases are optimized for handling a large number of rows rather than columns, so performance for datasets with non-typical layouts is often unacceptable. Other challenges include schema creation, updates and repeated data imports. To address the above-mentioned problems, I present a system for accessing very large CSV-based datasets by means of SQL. It is characterized by: a "no copy" approach - data stay mostly in the CSV files; "zero configuration" - no need to specify a database schema; implementation in C++ with boost [1], SQLite [2] and Qt [3], requiring no installation and having a very small size; query rewriting, dynamic creation of indices for appropriate columns and static data retrieval directly from the CSV files, which ensure efficient plan execution; effortless support for millions of columns; per-value typing, which makes mixed text/number data easy to use; and a very simple network protocol that provides an efficient interface for MATLAB and reduces implementation time for other languages. The software is available as freeware along with educational videos on its website [4]. It does not need any prerequisites to run, as all of the libraries are included in the distribution package. I test it against existing database solutions using a battery of benchmarks and discuss the results.

Relevance:

10.00%

Publisher:

Abstract:

The physical disector is the method of choice for estimating unbiased neuron numbers; nevertheless, calibration is needed to evaluate each counting method. The validity of this method can be assessed by comparing the estimated cell number with the true number determined by direct counting in serial sections. We reconstructed one fifth of rat lumbar dorsal root ganglia taken from two experimental conditions. From each ganglion, images of 200 adjacent semi-thin sections were used to reconstruct a volumetric dataset (stack of voxels). On these stacks the number of sensory neurons was estimated by the physical disector method and counted directly. In addition, using the coordinates of the nuclei from the direct counting, we simulated, with a Matlab program, disector pairs separated by increasing distances in a model of the ganglion. The comparison between the results of these approaches clearly demonstrates that the physical disector method provides a valid and reliable estimate of the number of sensory neurons only when the distance between consecutive disector pairs is 60 µm or smaller. Under these conditions the error between the physical disector estimate and the direct count does not exceed 6%. In contrast, when the distance between two pairs is larger than 60 µm (70-200 µm), the error increases rapidly, up to 27%. We conclude that the physical disector method provides a reliable estimate of the number of rat sensory neurons only when the separating distance between consecutive disector pairs is no larger than 60 µm.
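
The disector-pair simulation can be illustrated with a short MATLAB sketch; the synthetic, uniformly distributed nucleus "tops", the section thickness and the spacings are placeholders, so unlike the real ganglion data the error here reflects sampling variance only, not the clustering of neurons.

% Place disector pairs at increasing spacings and compare the estimate with
% the true number of nuclei.
t     = 2;                                 % section thickness, um (assumed)
depth = 400;                               % reconstructed stack depth, um
zTop  = depth * rand(1500, 1);             % stand-in for measured nucleus tops
Ntrue = numel(zTop);

for spacing = [20 60 120 200]              % distance between disector pairs, um
    refStarts = 0:spacing:depth - t;       % z-position of each reference section
    Q = arrayfun(@(z0) sum(zTop >= z0 & zTop < z0 + t), refStarts);
    Nest = sum(Q) * spacing / t;           % scale counts by the sampled fraction
    fprintf('spacing %3d um: error %.1f %%\n', spacing, 100*abs(Nest - Ntrue)/Ntrue);
end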

Relevance:

10.00%

Publisher:

Abstract:

An improvement in the serological diagnostic toolbox for invasive aspergillosis (IA) is necessary. So far, most laboratories do not perform antibody detection assays at all to diagnose IA, as commercial test systems are based on crude and undefined antigen mixtures of A. fumigatus. Using the A. fumigatus protein mitogillin, we could demonstrate that the use of selected, characterized immunodominant antigens can improve the serodiagnosis of Aspergillus-related diseases. In an animal model we were able to identify an additional 36 immunodominant antigens from a cDNA library of A. fumigatus germlings. Five selected antigens were expressed recombinantly in E. coli, purified and used for Western blot and ELISA analyses to study the kinetics of the specific antibody response in rabbits infected systemically with A. fumigatus. Subsequently, the specific IgG and IgA antibody responses against these antigens were studied in patients suffering from proven IA and compared to healthy blood donors and patients with other forms of pneumonia. Furthermore, we examined how total IgG and IgA levels influence the diagnostic value of antibody detection in IA patients.