189 results for FFT


Relevance:

10.00%

Publisher:

Abstract:

Today, applications based on biometric systems, that is, systems that measure the electrical signals of our body, are growing at a fast pace. All of these systems incorporate biomedical sensors that help users better monitor different aspects of their daily routine, such as keeping a detailed record of an exercise regimen or of the quality of the food we eat. Among these biometric systems, those based on interpreting brain signals through electroencephalography (EEG) tests are steadily gaining momentum, although they are still at an early stage owing to the great complexity of the human brain, largely unknown to science until the 21st century. For these reasons, devices that use a brain-computer interface (BCI) are becoming increasingly popular. A BCI system captures a subject's brain waves and then processes them to try to obtain a representation of an action or a thought of the individual. These thoughts, correctly interpreted, are then used to carry out an action. Example BCI applications include driving the motor of an electric wheelchair when the subject performs, say, the action of closing a fist, or opening the lock of one's own house using a personal brain pattern. Data processing systems are evolving very quickly, and FPGAs (Field Programmable Gate Arrays) stand out for their high processing speed and low power consumption; moreover, their reconfigurable architecture makes them more versatile and powerful than other processing units such as CPUs or GPUs. The CEI (Centro de Electrónica Industrial), where this TFG (final-year project) was carried out, has experience in the design of reconfigurable systems on FPGAs. This TFG is the second in a line of projects that seeks to obtain a system capable of correctly processing brain signals, in order to arrive at a common pattern that allows us to act accordingly. More specifically, the goal is to detect when a person is falling asleep by capturing the brain waves known as alpha waves, whose frequency lies between 8 and 13 Hz. These waves, which appear when we close our eyes and clear our mind, indicate a state of mental relaxation. This project therefore marks the start of a complete BCI system, serving as a first contact with brain-wave processing before evolutionary algorithms are later implemented on reconfigurable hardware. It thus becomes necessary to develop a data processing system on an FPGA. The data are processed following digital signal processing methodology; in this case, a frequency analysis is performed using the fast Fourier transform (FFT). Once the data processing system is developed, it is integrated with another system in charge of capturing the data acquired by an ADC (Analog-to-Digital Converter) known as the ADS1299, which is specifically designed to capture potentials from the human brain.
In the final system, the ADS1299 captures the data and sends it to the FPGA, which processes it; interpretation is then carried out by users who analyze the processed data. For the development of the processing system, two study platforms are initially available, from which the data to be processed are captured: 1. The first is a commercial tool developed and distributed by OpenBCI, a project that sells hardware for EEG and other tests. This tool consists of a microprocessor, an SD memory module for data storage, a wireless communication module that transmits the data over Bluetooth, and the aforementioned ADS1299 ADC. The platform offers a graphical interface that was used for the research prior to designing the processing system, providing a first contact with the system. 2. The second platform is an evaluation kit for the ADS1299, which gives access to the different control ports through the ADC's communication pins. This platform is connected to the FPGA in the integrated system. To understand how the simplest brain waves behave, and to establish the minimum requirements for EEG wave analysis, several consultations were held with Dr. Ceferino Maestu, neurophysiologist at the Centro de Tecnología Biomédica (CTB) of the UPM. He introduced us to the different procedures used in the analysis of electroencephalogram waves, as well as to how the electrodes should be placed on the skull. To conclude the preliminary research, a first data processing model was built in MATLAB. A key characteristic of brain waves is their randomness, which makes time-domain analysis very complex. The most important processing step is therefore the transition from the time domain to the frequency domain through the fast Fourier transform (FFT), where the captured data can be analyzed with greater precision. The MATLAB model is used to obtain the first results of the processing system, which follows these steps (a minimal sketch of these steps in code is given below): 1. Data are captured from the electrodes and written to a data table. 2. The data are read from the table. 3. The temporal size of the block to be processed is chosen. 4. A window is applied to avoid discontinuities at the beginning and end of the analyzed block. 5. The block to be transformed is zero-padded in the time domain. 6. The FFT is applied to the windowed, zero-padded block. 7. The results are plotted for analysis. At this point, capturing alpha waves proves very feasible. Although some problems arise when interpreting the data because of the low temporal resolution of the OpenBCI platform, this issue is solved in the developed model, since the evaluation kit (the data acquisition system) allows the sampling rate to be adjusted, which directly affects this precision.
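The MATLAB model itself is not reproduced in this abstract, but the windowing, zero-padding, and FFT steps it describes can be sketched in a few lines. The following Python/NumPy version is illustrative only; the sampling rate, block length, and FFT size are assumptions rather than the project's actual parameters.

```python
import numpy as np

fs = 250                    # assumed sampling rate (Hz); the ADS1299 supports 250 SPS
block_len = 2 * fs          # 2-second analysis block (assumption)
nfft = 2048                 # FFT length after zero-padding

# Steps 1-2: the real system reads electrode samples from a data table;
# here we synthesize an alpha-band test signal (10 Hz) plus noise instead.
t = np.arange(block_len) / fs
x = np.sin(2 * np.pi * 10.0 * t) + 0.5 * np.random.randn(block_len)

# Step 4: window the block to avoid discontinuities at its edges.
xw = x * np.hanning(block_len)

# Steps 5-6: zero-pad in the time domain and apply the FFT.
X = np.fft.rfft(xw, n=nfft)
freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
amp = np.abs(X)

# Step 7: instead of plotting, report the peak inside the alpha band (8-13 Hz).
alpha = (freqs >= 8.0) & (freqs <= 13.0)
print("peak alpha-band frequency: %.2f Hz" % freqs[alpha][np.argmax(amp[alpha])])
```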
Once this first processing stage and the analysis of its results were complete, a hardware model was built that follows the same steps as the MATLAB model, insofar as this is useful and feasible. For this, XPS (Xilinx Platform Studio), part of the EDK (Embedded Development Kit) toolset, is used to design an embedded system. This system comprises: a soft-core microprocessor called MicroBlaze, which manages and controls the whole system; an FFT block that performs the fast Fourier transform; four BRAM memory blocks, which store the input and output data of the FFT block, plus a multiplier that applies the window to the FFT input data; and a PLB bus, a control bus that connects the MicroBlaze with the other elements of the system. After the hardware design, the software is designed using the SDK (Software Development Kit). At this stage the data acquisition system is also integrated, controlled mostly from the MicroBlaze. From this environment the MicroBlaze is programmed to manage the generated hardware. The software handles the communication between the two systems, acquisition and processing, and loads the window coefficients into the corresponding memory. In the first development stages, the FFT block is tested to verify its operation in hardware. For this first test, the FFT input data are loaded into one BRAM and the window data into another; the processed data are written to two BRAMs, one for the real part of the transform and one for the imaginary part. After verifying the correct operation of the FFT block, it is integrated with the data acquisition system, and a real EEG test is then performed to capture alpha waves. In addition, to validate FPGAs as ideal processing units, the time the FFT block takes to compute the transform is measured and compared with the time MATLAB takes to apply the same transform to the same data: the hardware system computes the fast Fourier transform 27 times faster than MATLAB, which shows the clear competitive advantage of hardware in terms of execution time. On the educational side, this TFG spans several fields. In electronics: knowledge of MATLAB was improved, including tools it offers such as FDATool (Filter Design & Analysis Tool); signal processing techniques were acquired, in particular spectral analysis; VHDL skills were improved, along with their use in the Xilinx ISE environment; C skills were reinforced by programming the MicroBlaze to control the system; and experience was gained in building embedded systems with the Xilinx EDK (Embedded Development Kit). In the field of neurology, we learned how to run EEG tests and how to analyze and interpret their results.
Regarding social impact, BCI systems affect many sectors. The most notable is the large group of people with physical disabilities, for whom such a system offers an opportunity to increase their day-to-day autonomy. Another important sector is medical research, where BCI systems have many applications, such as the detection and study of cognitive diseases.

Relevance:

10.00%

Publisher:

Abstract:

The ability to synthesize high molecular weight inulin was transferred to potato plants via constitutive expression of the 1-SST (sucrose:sucrose 1-fructosyltransferase) and 1-FFT (fructan:fructan 1-fructosyltransferase) genes of globe artichoke (Cynara scolymus). The fructan pattern of tubers from transgenic potato plants represents the full spectrum of inulin molecules present in artichoke roots, as shown by high-performance anion exchange chromatography as well as size exclusion chromatography. These results demonstrate in planta that the enzymes sucrose:sucrose 1-fructosyltransferase and fructan:fructan 1-fructosyltransferase are sufficient to synthesize inulin molecules of all chain lengths naturally occurring in a given plant species. Inulin made up 5% of the dry weight of transgenic tubers, and a low level of fructan production was also observed in fully expanded leaves. Although inulin accumulation did not influence the sucrose concentration in leaves or tubers, a reduction in starch content occurred in transgenic tubers, indicating that inulin synthesis did not increase the storage capacity of the tubers.

Relevance:

10.00%

Publisher:

Abstract:

Based on a single-degree-of-freedom system, this work presents the equation of motion and its solution by means of the Fourier transform and the Fast Fourier Transform (FFT). By analyzing how the integrations within the transforms are performed, Newton-Cotes weighting factors were studied and applied to the solution of the equation of motion, substantially increasing the precision of the results compared with the conventional form of the Fourier transform.
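The abstract does not spell out the scheme, but the idea can be sketched as follows: the Fourier transform of the response is approximated by a quadrature over the samples, and replacing the rectangle rule implicit in the conventional DFT with composite Newton-Cotes (here Simpson) weights reduces the integration error. The test signal and step sizes below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def fourier_integral(x, dt, freqs, weights=None):
    """Approximate X(f) = integral of x(t) * exp(-i 2 pi f t) dt from samples.
    weights=None gives the plain rectangle rule (the conventional DFT);
    Newton-Cotes weights sharpen the quadrature."""
    n = np.arange(len(x))
    w = np.ones(len(x)) if weights is None else weights
    kernel = np.exp(-2j * np.pi * np.outer(freqs, n * dt))
    return dt * (kernel @ (w * x))

def simpson_weights(n):
    """Composite Simpson weights (1,4,2,4,...,2,4,1)/3; n must be odd."""
    w = np.ones(n)
    w[1:-1:2] = 4.0
    w[2:-1:2] = 2.0
    return w / 3.0

# Damped single-degree-of-freedom-like response with a known exact transform.
dt, n = 0.01, 2001
t = np.arange(n) * dt
a, w0 = 0.5, 2 * np.pi * 2.0
x = np.exp(-a * t) * np.sin(w0 * t)

freqs = np.linspace(0.0, 5.0, 11)
exact = w0 / ((a + 2j * np.pi * freqs) ** 2 + w0 ** 2)

plain = fourier_integral(x, dt, freqs)
simpson = fourier_integral(x, dt, freqs, simpson_weights(n))
print("rectangle-rule error:  ", np.abs(plain - exact).max())
print("Simpson-weighted error:", np.abs(simpson - exact).max())
```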

Relevance:

10.00%

Publisher:

Abstract:

This paper presents the results of an experimental analysis of the bell tower of the “Chiesa della Maddalena” (Mola di Bari, Italy), carried out to better understand the structural behavior of slender masonry structures. The research aims to calibrate a numerical model by means of Operational Modal Analysis (OMA), so that realistic conclusions about the dynamic behavior of the structure can be drawn. The choice of OMA stems from the need to determine the modal parameters of a structure through non-destructive testing, which is especially important for structures of cultural and historical value. By means of a simple and accurate process, ambient vibrations can be acquired in situ. The collected data are essential for estimating the mode shapes, natural frequencies, and damping ratios of the structure. To analyze the monitoring data, the Peak Picking method has been applied to the Fast Fourier Transforms (FFT) of the signals in order to identify the effective natural frequencies and damping factors of the structure. The main frequencies and damping ratios have been determined from measurements at several relevant locations. The responses have then been extrapolated and extended to the entire tower through a 3-D finite element model. In this way, knowing the modes of vibration, it has been possible to understand the overall dynamic behavior of the structure.
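As a rough illustration of the peak-picking step (the sampling rate, record file, and thresholds below are assumptions, not the campaign's actual values), candidate natural frequencies can be read off an averaged FFT amplitude spectrum of an ambient-vibration record:

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                           # assumed accelerometer sampling rate (Hz)
accel = np.loadtxt("channel_1.txt")  # hypothetical ambient-vibration record

# Averaged amplitude spectrum: segment averaging reduces the noise floor.
seg = 4096
segments = accel[: len(accel) // seg * seg].reshape(-1, seg)
spectrum = np.abs(np.fft.rfft(segments * np.hanning(seg), axis=1)).mean(axis=0)
freqs = np.fft.rfftfreq(seg, d=1.0 / fs)

# Peak picking: local maxima well above the noise floor are candidate modes.
peaks, _ = find_peaks(spectrum, height=5 * np.median(spectrum), distance=10)
print("candidate natural frequencies (Hz):", freqs[peaks])
```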

Relevance:

10.00%

Publisher:

Abstract:

This correspondence presents an efficient method for reconstructing a band-limited signal in the discrete domain from its crossings with a sine wave. The method makes it possible to design A/D converters that deliver only the crossing timings, which are then used to interpolate the input signal at arbitrary instants. Potentially, this allows reductions in the power consumption and complexity of such converters. The reconstruction in the discrete domain is based on a recently proposed modification of the Lagrange interpolator, which is readily implementable with linear complexity and efficiency, given that it reuses known schemes for variable fractional-delay (VFD) filters. As a spin-off, the method allows one to perform spectral analysis from sine wave crossings with the complexity of the FFT. Finally, the results are validated in several numerical examples.
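A toy version of the idea, not the correspondence's modified Lagrange/VFD scheme: the crossings of the input with a reference sine yield nonuniform samples, since at each crossing the input equals the known sine value, and these samples can then be interpolated at arbitrary instants. All waveform parameters below are assumptions.

```python
import numpy as np

A, fr = 1.0, 8.0e3              # reference sine amplitude and frequency (assumed)
f0 = 1.0e3                      # band-limited test input: a single tone

x = lambda t: 0.8 * np.sin(2 * np.pi * f0 * t + 0.3)
ref = lambda t: A * np.sin(2 * np.pi * fr * t)

# Locate crossings of x with the reference sine via sign changes on a fine grid.
t = np.linspace(0.0, 2.0e-3, 200001)
d = x(t) - ref(t)
idx = np.where(np.sign(d[:-1]) != np.sign(d[1:]))[0]
tk = t[idx] - d[idx] * (t[idx + 1] - t[idx]) / (d[idx + 1] - d[idx])
xk = ref(tk)                    # at a crossing, x(tk) equals the reference sine

def lagrange(tq, tk, xk, order=6):
    """Interpolate at tq from the `order` crossings nearest to tq."""
    j = np.argsort(np.abs(tk - tq))[:order]
    tj, xj = tk[j], xk[j]
    val = 0.0
    for m in range(order):
        w = np.prod([(tq - tj[n]) / (tj[m] - tj[n]) for n in range(order) if n != m])
        val += w * xj[m]
    return val

tq = 1.0e-3
print("interpolated:", lagrange(tq, tk, xk), " true:", x(tq))
```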

Relevance:

10.00%

Publisher:

Abstract:

This thesis analyzes the Operational Excellence model known as World Class Manufacturing (WCM), and in particular the approach to step 6 of the Professional Maintenance pillar, which requires the implementation of a PREDICTIVE maintenance system, the so-called CBM (Condition Based Maintenance). The reference model was formulated by the Japanese professor H. Yamashina in the mid-2000s and arrived in Italy around 2005, when Fiat Group (now FCA) adopted it as its standard approach to production management. This analysis, oriented toward a practical rather than theoretical perspective, derives directly from field experience within a company that has adopted World Class Manufacturing (WCM). Chapter 1 offers an excursus of the methodologies underlying WCM and of the historical path that led to the formulation of the model. The second chapter presents a case of WCM application within a group, namely Ariston Thermo Group (ATG): after an overview of the group and of the history of its adhesion to the improvement program, the discussion focuses on ATG's approach to WCM. The third chapter introduces Professional Maintenance according to the main maintenance policies outlined by WCM; the policies are presented individually to highlight their objectives, along with the advantages and disadvantages of implementing each one. The fourth chapter details, from a strictly practical standpoint, the various activities carried out by Professional Maintenance, so as to highlight the development and continuous improvement it has achieved since the introduction of WCM; the presentation of these activities mainly concerns the transition to step 6 of Professional Maintenance, which is covered in depth by listing and analyzing all the activities performed to approach CBM.

Relevance:

10.00%

Publisher:

Abstract:

Short-term spectral analysis was carried out on geochemical logging data from ODP Site 704. The FFT was used to compute the amplitude spectra of short, overlapping segments to produce depth-period-amplitude spectrograms of the logging data. The spectrograms provide a means of evaluating the significance of the observed periodic components: components that were consistently present and prominent across a given record interval were considered significant. Changes in spectrogram characteristics seem to reflect changes in lithology, sedimentation rates, or hiatuses, and may therefore provide useful information to aid stratigraphic and paleoenvironmental studies. The dominant periodicity during the late Pleistocene and Brunhes Chron (0.97 to 0.47 Ma) was determined to be > 100,000 yr, whereas the upper Matuyama Chron was dominated by the 41,000-yr periodicity. These periodicities suggest that sedimentation patterns within the upper Matuyama Chron (0.98-1.78 Ma) were influenced by the Milankovitch obliquity cycle and those within the latest Matuyama-Brunhes Chron (<0.98 Ma) by the eccentricity cycle. The Brunhes/Matuyama boundary therefore represents a major discontinuity. Periodicities observed within the lower Matuyama and upper Gauss Chron did not correlate with any of the periodicities within the Milankovitch frequency bands.
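The spectrogram computation can be sketched as follows (segment length, overlap, and depth sampling are illustrative assumptions; the study's actual logging data are not reproduced here):

```python
import numpy as np
from scipy.signal import spectrogram

# Hypothetical geochemical log: one value every 0.15 m of depth (assumed).
dz = 0.15
log = np.loadtxt("site704_log.txt")    # hypothetical input file

# Short-term FFT over overlapping segments -> depth-period-amplitude map.
f, z, Sxx = spectrogram(log, fs=1.0 / dz, window="hann",
                        nperseg=256, noverlap=192, mode="magnitude")

# Convert spatial frequency (cycles/m) to period (m/cycle), skipping f = 0.
periods = 1.0 / f[1:]
amplitude = Sxx[1:, :]                 # rows: period, columns: depth position
print("segment centers (m):", z[:5])
print("longest resolved periods (m):", periods[:5])
```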

Relevance:

10.00%

Publisher:

Abstract:

PbS nanocrystals were synthesized directly in the conducting polymer poly(3-hexylthiophene-2,5-diyl). Transmission electron microscopy (TEM) shows that the PbS nanocrystals are faceted and relatively uniform in size, with a mean size of 10 nm. FFT analysis of the atomic lattice planes observed in TEM, together with selected-area electron diffraction, confirms that the nanocrystals have the PbS rock salt structure. The synthesis conditions are explored to show control over the aggregation of the PbS nanocrystals in the thiophene conducting polymer.
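The FFT analysis of lattice fringes can be illustrated with a sketch like the following, where the image file, pixel scale, and the crude central-peak suppression are assumptions for demonstration:

```python
import numpy as np

px = 0.02                          # assumed pixel size (nm/pixel)
img = np.load("tem_image.npy")     # hypothetical TEM image as a 2-D array

# 2-D FFT: periodic lattice planes appear as bright spots around the center.
F = np.fft.fftshift(np.abs(np.fft.fft2(img - img.mean())))
fy = np.fft.fftshift(np.fft.fftfreq(img.shape[0], d=px))
fx = np.fft.fftshift(np.fft.fftfreq(img.shape[1], d=px))

# Crudely suppress the low-frequency region, then take the strongest spot;
# its spatial frequency gives the dominant lattice-plane spacing.
F[np.abs(fy) < 0.2, :] = 0
F[:, np.abs(fx) < 0.2] = 0
iy, ix = np.unravel_index(np.argmax(F), F.shape)
d_spacing = 1.0 / np.hypot(fy[iy], fx[ix])
print("dominant lattice spacing: %.3f nm" % d_spacing)
```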

Relevance:

10.00%

Publisher:

Abstract:

The Finite-Difference Time-Domain (FDTD) method and software are applied to obtain the waves diffracted by right-angle wedges under modulated Gaussian plane wave illumination, and the Fast Fourier Transform (FFT) is used to extract wideband diffraction coefficients in the illuminated (lit) region. Theta and phi polarizations in the 3-dimensional case, and TM and TE polarizations in the 2-dimensional case, are considered for the soft and hard diffraction coefficients, respectively. Results of the FDTD method for a perfect electric conductor (PEC) wedge are compared with asymptotic expressions from the Uniform Theory of Diffraction (UTD). The PEC wedge analysis is then extended to homogeneous conducting and dielectric building materials, whose diffraction coefficients are not available analytically under practical conditions.
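The wideband extraction step can be sketched as follows: the FDTD run records a time-domain probe signal with and without the wedge, and the ratio of the FFTs of the diffracted and incident fields yields a frequency-dependent coefficient. The file names, time step, and energy threshold are assumptions:

```python
import numpy as np

dt = 1.0e-12                             # FDTD time step (s), assumed
incident = np.loadtxt("incident.txt")    # hypothetical probe, wedge absent
total = np.loadtxt("total.txt")          # hypothetical probe, wedge present
                                         # (assumed same length and probe point)
diffracted = total - incident            # isolate the diffracted field
n = 2 * len(total)                       # zero-pad for finer frequency sampling

inc_f = np.fft.rfft(incident, n)
D = np.fft.rfft(diffracted, n) / inc_f   # per-frequency diffraction ratio
freqs = np.fft.rfftfreq(n, d=dt)

# Keep only the band where the modulated Gaussian pulse carries usable energy.
band = np.abs(inc_f) > 1e-3 * np.abs(inc_f).max()
print("usable band: %.2f to %.2f GHz" % (freqs[band][0] / 1e9, freqs[band][-1] / 1e9))
```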

Relevance:

10.00%

Publisher:

Abstract:

Non-Destructive Testing (NDT) of deep foundations has become an integral part of the industry's standard manufacturing processes. It is not unusual for the evaluation of concrete integrity to include the measurement of ultrasonic wave speeds, and numerous methods have been proposed that use the propagation speed of ultrasonic waves to check the integrity of concrete in drilled shaft foundations. All such methods evaluate the integrity of the concrete inside the cage and between the access tubes; the integrity of the concrete outside the cage remains to be considered in order to locate the border between concrete and soil and thus obtain the diameter of the drilled shaft. It is also economical to devise a methodology that obtains this diameter with the Cross-Hole Sonic Logging (CSL) system already used to check the integrity of the inner concrete, allowing the determination of the drilled shaft diameter without setting up another NDT device. This proposed new method is based on installing galvanized tubes outside the shaft, across from each inside tube, and performing the CSL test between the inside and outside tubes. From the experimental work, a model is developed to evaluate the relationship between the concrete thickness and the ultrasonic wave properties using signal processing. The experimental results show a direct correlation between the concrete thickness outside the cage and the maximum amplitude of the received signal obtained from the frequency-domain data. This study demonstrates how this new method for measuring the diameter of drilled shafts during construction overcomes the limitations of currently used methods. In another part of the study, a new method is proposed to visualize and quantify the extent and location of defects. It is based on a color change in the frequency amplitude of the signal recorded by the receiver probe at the location of defects, and is called Frequency Tomography Analysis (FTA). Time-domain data are transformed into frequency-domain data for the signals propagated between tubes using the Fast Fourier Transform (FFT), and the distribution of the FTA is then evaluated. This method is employed after CSL has determined a high probability of an anomaly in a given area, and is applied to improve location accuracy and further characterize the feature. The technique has very good resolution and identifies the exact depth of any void or defect along the length of the drilled shaft for voids inside the cage. The last part of the study evaluates the effect of voids inside and outside the reinforcement cage, and of corrosion in the longitudinal bars, on the strength and axial load capacity of drilled shafts. The objective is to quantify the loss in axial strength and stiffness of drilled shafts due to the presence of different types of symmetric voids and corrosion throughout their lengths.
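The FTA mapping described above amounts to taking the FFT of each received trace and tracking band amplitudes along depth; a minimal sketch, with the file layout, sampling rate, frequency band, and anomaly threshold all assumed:

```python
import numpy as np

fs = 500.0e3                          # assumed ultrasonic sampling rate (Hz)
# Hypothetical CSL data: one received trace per depth step, stored as rows.
traces = np.load("csl_traces.npy")    # shape (n_depths, n_samples)
depths = np.load("csl_depths.npy")    # depth of each trace (m)

# Time domain -> frequency domain for every depth (FFT along each row).
spectra = np.abs(np.fft.rfft(traces, axis=1))
freqs = np.fft.rfftfreq(traces.shape[1], d=1.0 / fs)

# FTA map: amplitude vs (depth, frequency); defects show up as local
# amplitude drops in the band of interest relative to neighboring depths.
band = (freqs > 30e3) & (freqs < 60e3)   # assumed band of interest
band_amp = spectra[:, band].mean(axis=1)
suspect = depths[band_amp < 0.5 * np.median(band_amp)]
print("depths with anomalously low amplitude (m):", suspect)
```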

Relevance:

10.00%

Publisher:

Abstract:

In oil prospecting research, seismic data are usually irregularly and sparsely sampled along the spatial coordinates due to obstacles in the placement of geophones. Fourier methods provide a way to regularize seismic data and are efficient if the input data are sampled on a regular grid. However, when these methods are applied to a set of irregularly sampled data, the orthogonality among the Fourier components is broken and the energy of one Fourier component may "leak" into others, a phenomenon called "spectral leakage". The objective of this research is to study methods for the spectral representation of irregularly sampled data. In particular, the basic structure of the NDFT (nonuniform discrete Fourier transform) representation is presented, its properties are studied, and its potential in seismic signal processing is demonstrated. To this end, we study the FFT (fast Fourier transform) and the NFFT (nonuniform fast Fourier transform), which rapidly compute the DFT (discrete Fourier transform) and the NDFT, and we compare signal recovery using the FFT, DFT, and NFFT. We approach the interpolation of seismic traces using the ALFT (antileakage Fourier transform) to overcome the spectral leakage caused by uneven sampling. Applications to synthetic and real data show that the ALFT method works well on seismic data from complex geology, suffers little from irregular spatial sampling and edge effects, and is robust and stable with noisy data. However, it is not as efficient as the FFT, and its reconstruction is not as good in the case of irregular coverage with large holes in the acquisition.
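For reference, a direct O(N·M) evaluation of the NDFT can be written in a few lines (the fast NFFT algorithms approximate the same sums at FFT-like cost); the jittered-grid example below illustrates the spectral leakage that the ALFT is designed to suppress. Positions and signal are illustrative assumptions:

```python
import numpy as np

def ndft(x, pos, n_freq):
    """Direct nonuniform DFT: samples x taken at arbitrary positions pos
    (in units of the nominal sampling interval). Cost: O(len(x) * n_freq)."""
    k = np.arange(n_freq)
    return np.exp(-2j * np.pi * np.outer(k, pos) / n_freq) @ x

# On a regular grid the NDFT reduces to the ordinary DFT/FFT.
n = 64
t = np.arange(n)
x = np.sin(2 * np.pi * 5 * t / n)
assert np.allclose(ndft(x, t, n), np.fft.fft(x))

# Jittered positions: the energy of the single 5-cycle component "leaks"
# into neighboring coefficients, i.e. spectral leakage from uneven sampling.
pos = t + 0.4 * (np.random.rand(n) - 0.5)
xi = np.sin(2 * np.pi * 5 * pos / n)
leak = np.abs(ndft(xi, pos, n))
print("coefficient at k=5:", leak[5])
print("largest coefficient elsewhere:", np.delete(leak, [5, n - 5]).max())
```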

Relevance:

10.00%

Publisher:

Abstract:

With advances in device technology and in the ways energy is generated and used, power quality parameters increasingly affect all kinds of power consumers. Many types of devices already analyze power quality; however, there is a need for devices that perform measurements, calculate parameters, find faults, suggest changes, and support the management of the installation, and such devices must remain affordable. To maintain this balance, a magnitude measurement method is needed that does not demand large processing or memory resources. This work shows that applying the Goertzel algorithm, compared with the commonly used FFT, allows measurements to be made with far fewer hardware resources, leaving memory space available for management functions. The first part of the work surveys the problems most common for low-voltage consumers. A functional diagram is then proposed, indicating what will be measured and calculated, which problems will be detected, and which solutions can be found. Through simulation of the Goertzel algorithm in Scilab, the frequency components of a distorted signal can be calculated with satisfactory results. Finally, the prototype is assembled and tests are carried out, adjusting the parameters necessary to keep the device reliable without increasing its cost.
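For reference, the Goertzel algorithm evaluates a single DFT bin with one recurrence per sample, in O(N) time and O(1) memory, which is why it suits a low-cost power quality meter that only needs the fundamental and a few harmonics. Below is a minimal sketch in Python rather than the Scilab used in the work; the 60 Hz grid frequency and the sampling rate are assumptions:

```python
import math

def goertzel(samples, fs, f_target):
    """Return the DFT magnitude of `samples` at frequency f_target (Hz).
    Runs in O(N) time and O(1) memory: no N-point buffer or FFT needed."""
    n = len(samples)
    k = round(n * f_target / fs)          # nearest DFT bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:                     # one multiply-add per sample
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    power = s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2
    return math.sqrt(max(power, 0.0))

# Example: fundamental and 3rd-harmonic content of a distorted 60 Hz waveform.
fs, n = 7680.0, 1280                      # assumed: 128 samples/cycle, 10 cycles
wave = [math.sin(2 * math.pi * 60 * t / fs)
        + 0.2 * math.sin(2 * math.pi * 180 * t / fs) for t in range(n)]
for h in (60.0, 180.0, 300.0):
    print("%5.0f Hz -> amplitude %.3f" % (h, goertzel(wave, fs, h) / (n / 2)))
```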

Relevance:

10.00%

Publisher:

Abstract:

In this study, we developed and improved the numerical mode matching (NMM) method, which has previously been shown to be a fast and robust semi-analytical solver for investigating the propagation of electromagnetic (EM) waves in an isotropic layered medium. The applicable models, such as cylindrical waveguides, optical fibers, and boreholes in earth geological formations, are generally modeled as an axisymmetric structure: an orthogonal-plano-cylindrically layered (OPCL) medium consisting of materials stratified planarly and layered concentrically in the orthogonal directions.

In this report, several important improvements have been made to extend this efficient solver to anisotropic OPCL media. Formulas for anisotropic media with three different diagonal elements in the cylindrical coordinate system are deduced to extend its application to more general materials. A perfectly matched layer (PML) is incorporated along the radial direction as an absorbing boundary condition (ABC) to make the NMM method more accurate and efficient for wave diffusion problems in unbounded media, and applicable to scattering problems with lossless media. We manipulate the weak form of Maxwell's equations and impose the correct boundary conditions at the cylindrical axis to solve the singularity problem, which was ignored by all previous researchers. The spectral element method (SEM) is introduced to compute eigenmodes of higher accuracy with fewer unknowns, achieving a faster mode matching procedure between different horizontal layers. We also prove the relationship between the fields of opposite mode indices for different types of excitations, which reduces the computational time by half. Formulas are deduced for computing the EM fields excited by an electric or magnetic dipole located at any position with an arbitrary orientation, and the excitations are generalized to line and surface current sources, which extends the application of the NMM to simulations of controlled-source electromagnetic techniques. Numerical simulations have demonstrated the efficiency and accuracy of this method.

Finally, the improved numerical mode matching (NMM) method is used to efficiently compute the electromagnetic response of an induction tool to orthogonal transverse hydraulic fractures in open or cased boreholes in hydrocarbon exploration. The hydraulic fracture is modeled as a slim circular disk that is symmetric with respect to the borehole axis and filled with electrically conductive or magnetic proppant. The NMM solver is first validated by comparing the normalized secondary field with experimental measurements and a commercial software package. We then quantitatively analyze the sensitivity of the induction response to fracture parameters such as length, conductivity, and the permeability of the filled proppant, to evaluate the effectiveness of the induction logging tool for fracture detection and mapping. Casings with different thicknesses, conductivities, and permeabilities are modeled together with the fractures to investigate their effects on fracture detection. The results reveal that the normalized secondary field is not weakened at low frequencies, so the induction tool remains applicable for fracture detection even though the attenuation of the electromagnetic field through the casing is significant. A hybrid approach combining the NMM method with an integral-equation solver based on BCGS-FFT has been proposed to efficiently simulate open or cased boreholes with tilted fractures, which constitute a non-axisymmetric model.

Relevance:

10.00%

Publisher:

Abstract:

This dissertation consists of three distinct components: (1) “Double Rainbow,” a notated composition for an acoustic ensemble of 10 instruments, ca. 36 minutes; (2) “Appalachiana,” a fixed-media composition for electro-acoustic music and video, ca. 30 minutes; and (3) “‘The Invisible Mass’: Exploring Compositional Technique in Alfred Schnittke’s Second Symphony,” an analytical article.

(1) Double Rainbow is a ca. 36-minute composition in four movements scored for 10 instruments: flute, B-flat clarinet (doubling on bass clarinet), tenor saxophone (doubling on alto saxophone), French horn, percussion (glockenspiel, vibraphone, wood block, 3 toms, snare drum, bass drum, suspended cymbal), piano, violin, viola, cello, and double bass. Each of the four movements explores its own distinct character and set of compositional goals. The piece is presented as a musical score and as a recording, which was extensively treated in post-production.

(2) Appalachiana is a ca. 30-minute fixed-media composition for music and video. The musical component was created as a vehicle to showcase several approaches to electro-acoustic composition: FFT re-synthesis for time-manipulation effects, a custom-built software instrument that implements generative approaches to creating rhythm and pitch patterns, a recording of rain used to create rhythmic triggers for software instruments, and additional components recorded with acoustic instruments. The video component transforms footage of natural landscapes filmed at several locations in North Carolina, Virginia, and West Virginia into a surreal narrative using a variety of color, lighting, distortion, and time-manipulation video effects.
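As a hint of what FFT re-synthesis for time manipulation involves, here is a naive phase-vocoder sketch; it is not the tool used for the piece, and all parameters are assumptions:

```python
import numpy as np

def stretch(x, rate, n_fft=2048, hop=512):
    """Naive phase-vocoder time stretch: analyze with a windowed STFT,
    step through the analysis frames at `rate`, accumulate phase, and
    resynthesize by inverse FFT with overlap-add."""
    win = np.hanning(n_fft)
    frames = [x[i:i + n_fft] * win for i in range(0, len(x) - n_fft, hop)]
    S = np.array([np.fft.rfft(f) for f in frames])
    phase = np.angle(S[0])
    out_frames = []
    t = 0.0
    while t < len(S) - 2:
        i = int(t)
        mag = np.abs(S[i])
        phase = phase + (np.angle(S[i + 1]) - np.angle(S[i]))
        out_frames.append(np.fft.irfft(mag * np.exp(1j * phase)) * win)
        t += rate
    y = np.zeros(len(out_frames) * hop + n_fft)
    for k, f in enumerate(out_frames):   # overlap-add resynthesis
        y[k * hop:k * hop + n_fft] += f
    return y

fs = 44100
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 440 * t)
slow = stretch(tone, 0.5)                # roughly twice as long
print(len(tone), "->", len(slow), "samples")
```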

(3) “‘The Invisible Mass:’ Exploring Compositional Technique in Alfred Schnittke’s Second Symphony” is an analytical article that focuses on Alfred Schnittke’s compositional technique as evidenced in the construction of his Second Symphony and discussed by the composer in a number of previously untranslated articles and interviews. Though this symphony is pivotal in the composer’s oeuvre, there are currently no scholarly articles that offer in-depth analyses of the piece. The article combines analyses of the harmony, form, and orchestration in the Second Symphony with relevant quotations from the composer, some from published and translated sources and others newly translated by the author from research at the Russian State Library in St. Petersburg. These offer a perspective on how Schnittke’s compositional technique combines systematic geometric design with keen musical intuition.