948 results for Frequency Modulated Signals, Parameter Estimation, Signal-to-Noise Ratio, Simulations
Abstract:
The current standard for temperature-sensitive imaging using magnetic resonance (MR) is 2-D, spoiled, fast gradient-echo (fGRE) phase-difference imaging exploiting temperature-dependent changes in the proton resonance frequency (PRF). The echo time (TE) for optimal sensitivity is larger than the typical repetition time (TR) of an fGRE sequence. Since TE must be less than TR in the fGRE sequence, this limits the technique's achievable sensitivity, spatial resolution, and temporal resolution, which adversely affects both the accuracy and the volume coverage of the measurements. Accurate measurement of the rapid temperature changes associated with pulsed thermal therapies, such as high-intensity focused ultrasound (FUS), at optimal temperature sensitivity requires faster acquisition times than those currently available. Fast MR acquisition strategies, such as interleaved echo-planar and spiral imaging, can provide the necessary increase in temporal performance and sensitivity while maintaining adequate signal-to-noise ratio and in-plane spatial resolution. This research explored the adaptation and optimization of several fast MR acquisition methods for thermal monitoring of pulsed FUS thermal therapy. Temperature sensitivity, phase-difference noise, and phase-difference-to-noise ratio for the different pulse sequences were evaluated under varying imaging parameters in an agar gel phantom to establish optimal sequence parameters for temperature monitoring. The temperature sensitivity coefficient of the gel phantom was measured, allowing quantitative temperature extrapolations. Optimized fast sequences were compared based on the ability to accurately monitor temperature changes at the focus of a high-intensity focused ultrasound unit, volume coverage, and contrast-to-noise ratio in the temperature maps. Operating parameters that minimize complex phase-difference measurement errors introduced by the fast-imaging methods were established.
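The core of PRF-shift thermometry described above is the linear relation between the measured phase difference and the temperature change, ΔT = Δφ / (α·γ·B0·TE). A minimal Python sketch of that conversion follows; the constants are standard literature values, and the field strength and echo time are assumptions for illustration, not parameters taken from this work.

```python
import numpy as np

# Standard literature constants; B0 and TE are illustrative assumptions.
GAMMA = 2 * np.pi * 42.58e6   # proton gyromagnetic ratio (rad/s/T)
ALPHA = -0.01e-6              # PRF thermal coefficient, -0.01 ppm/degC as a fraction
B0 = 1.5                      # main field strength (T), assumed
TE = 0.015                    # echo time (s), assumed

def temperature_change(phase_diff):
    """Convert a phase-difference map (rad) to a temperature-change map (degC)."""
    return phase_diff / (ALPHA * GAMMA * B0 * TE)

# At these settings, a phase shift of about -0.06 rad corresponds to roughly +1 degC.
dT = temperature_change(np.array([-0.06]))
```

Note how the sensitivity scales with TE: doubling TE doubles the phase accrued per degree, which is exactly why the TE < TR constraint of the fGRE sequence limits achievable sensitivity.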
Abstract:
Systems based on OFDM (Orthogonal Frequency Division Multiplexing) are an evolution of traditional FDM (Frequency Division Multiplexing) systems that achieves more efficient use of the available bandwidth. OFDM systems and their variants now occupy a very important place in communications, being implemented in standards such as DVB-T (digital terrestrial television), ADSL, LTE, WiMAX, and DAB (digital radio). For this reason, this project implements an OFDM system in which various simulations can be run to better understand its operation. The system is simulated in Matlab. The main objectives of the simulation are to test the use of turbo codes (comparing them with traditional convolutional codes) and of an equalizer, with the aim of improving the quality of the system (receiving fewer erroneous bits) under increasingly adverse conditions: low signal-to-noise ratios and multipath. To that end, the necessary Matlab functions have been implemented, together with a graphical user interface (GUI) that makes the program easier to use and more didactic. The second and third chapters of this project study the foundations of OFDM systems: the second is a purely theoretical study, while the third focuses on the theory of the blocks implemented in the OFDM system developed in this project. The fourth chapter explains the options available through the implemented interface and provides a manual for its correct use. The fifth chapter is divided into two parts: the first shows the representations the program can produce, and the second runs simulations to check how the system responds to different channel configurations and to different system configurations (one coding scheme or another, use of the equalizer or the cyclic prefix, etc.). Finally, the last chapter presents the conclusions obtained in this project, as well as possible lines of work to pursue in future versions.
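The OFDM transmit/receive chain described above (IFFT across orthogonal subcarriers plus a cyclic prefix) can be sketched in a few lines. The project itself is implemented in Matlab with coding and equalization; this minimal Python version, with assumed parameters (64 subcarriers, a 16-sample prefix), shows only the modulation and demodulation blocks over an ideal channel.

```python
import numpy as np

N_SUB = 64      # number of subcarriers (assumed)
N_CP = 16       # cyclic-prefix length (assumed)

def ofdm_modulate(symbols):
    """Map one block of N_SUB complex symbols to a time-domain OFDM symbol."""
    time = np.fft.ifft(symbols, N_SUB)
    return np.concatenate([time[-N_CP:], time])   # prepend the cyclic prefix

def ofdm_demodulate(rx):
    """Strip the cyclic prefix and recover the subcarrier symbols via FFT."""
    return np.fft.fft(rx[N_CP:], N_SUB)

# QPSK symbols on each subcarrier
bits = np.random.randint(0, 2, 2 * N_SUB)
qpsk = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])
rx = ofdm_demodulate(ofdm_modulate(qpsk))
# Over an ideal channel, demodulation recovers the transmitted symbols exactly.
```

In a multipath channel, the cyclic prefix turns the linear channel convolution into a circular one, which is what lets the equalizer studied in the project correct each subcarrier with a single complex coefficient.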
Abstract:
The objective of this paper is the development of a building cost estimation model whose purpose is to quickly and precisely evaluate rebuilding costs for historic heritage buildings affected by catastrophic events. Specifically, this study is applied to the monumental buildings owned by the Catholic Church that were affected by the two earthquakes of May 11, 2011 in the town of Lorca. To estimate the initial total replacement cost, a new calculation model will be applied which uses, on the one hand, two-dimensional metric exterior parameters and, on the other, three-dimensional cubic interior parameters. Based on the total of the analyzed buildings, and considering the damage caused by the seismic event, the final reconstruction cost for the building units ruined by the earthquakes can be estimated. The proposed calculation model can also be applied to other emergency scenarios for the quick estimation of the construction costs necessary to rebuild historic heritage buildings whose structural or constructive configuration has been deteriorated or ruined by catastrophic events.
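The two-part structure of the estimate (exterior costs driven by 2-D metric parameters, interior costs by 3-D cubic parameters) can be sketched as a simple sum of an area term and a volume term. This is a hypothetical illustration only: the function name and the unit rates below are invented for the example and are not the paper's actual coefficients.

```python
# Hypothetical sketch of a two-part replacement-cost estimate:
# an exterior (2-D, area-based) term plus an interior (3-D, volume-based) term.
# Unit rates are illustrative placeholders, not values from the study.
def replacement_cost(facade_area_m2, enclosed_volume_m3,
                     rate_2d=950.0, rate_3d=320.0):
    """Rough total replacement cost = area term + volume term (currency units)."""
    return facade_area_m2 * rate_2d + enclosed_volume_m3 * rate_3d

# e.g. a building with 1200 m2 of facade and 8000 m3 of enclosed volume
cost = replacement_cost(1200.0, 8000.0)
```

The appeal of such a model in an emergency is that both inputs (facade area and enclosed volume) can be measured quickly from plans or simple surveys, without a full bill of quantities.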
Abstract:
Linkage and association analyses were performed to identify loci affecting disease susceptibility by scoring previously characterized sequence variations such as microsatellites and single nucleotide polymorphisms. Lack of markers in regions of interest, as well as difficulty in adapting various methods to high-throughput settings, often limits the effectiveness of the analyses. We have adapted the Escherichia coli mismatch detection system, employing the factors MutS, MutL and MutH, for use in PCR-based, automated, high-throughput genotyping and mutation detection of genomic DNA. Optimal sensitivity and signal-to-noise ratios were obtained in a straightforward fashion because the detection reaction proved to be principally dependent upon monovalent cation concentration and MutL concentration. Quantitative relationships of the optimal values of these parameters with length of the DNA test fragment were demonstrated, in support of the translocation model for the mechanism of action of these enzymes, rather than the molecular switch model. Thus, rapid, sequence-independent optimization was possible for each new genomic target region. Other factors potentially limiting the flexibility of mismatch scanning, such as positioning of dam recognition sites within the target fragment, have also been investigated. We developed several strategies, which can be easily adapted to automation, for limiting the analysis to intersample heteroduplexes. Thus, the principal barriers to the use of this methodology, which we have designated PCR candidate region mismatch scanning, in cost-effective, high-throughput settings have been removed.
Abstract:
Carbon dioxide (CO2) has been increasing in atmospheric concentration since the Industrial Revolution. A decreasing number of stomata on leaves of land plants still provides the only morphological evidence that this man-made increase has already affected the biosphere. The current rate of CO2 responsiveness in individual long-lived species cannot be accurately determined from field studies or by controlled-environment experiments. However, the required long-term data sets can be obtained from continuous records of buried leaves from living trees in wetland ecosystems. Fine-resolution analysis of the lifetime leaf record of an individual birch (Betula pendula) indicates a gradual reduction of stomatal frequency as a phenotypic acclimation to CO2 increase. During the past four decades, CO2 increments of 1 part per million by volume resulted in a stomatal density decline of approximately 0.6%. It may be hypothesized that this plastic stomatal frequency response of deciduous tree species has evolved in conjunction with the overall Cenozoic reduction of atmospheric CO2 concentrations.
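The quantitative claim above (a stomatal density decline of roughly 0.6% per 1 ppmv of CO2 increase) lends itself to a short worked example. The sketch below applies the decline multiplicatively per ppmv, which is one plausible reading; the study's relation could equally be treated as linear over this range, so the function is an illustration of the arithmetic, not the paper's model.

```python
# Applies the reported ~0.6% stomatal-density decline per 1 ppmv CO2 increase,
# compounded per ppmv (an assumed functional form for illustration).
def stomatal_density(initial_density, co2_increase_ppmv, decline_per_ppmv=0.006):
    """Stomatal density (per mm^2) after a given CO2 increase in ppmv."""
    return initial_density * (1 - decline_per_ppmv) ** co2_increase_ppmv

# A 50 ppmv rise (roughly the increase over the four decades studied)
# reduces a density of 100 per mm^2 to about 74 per mm^2.
d = stomatal_density(100.0, 50)
```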
Abstract:
The subject of this thesis is the real-time implementation of algebraic derivative estimators as observers in the nonlinear control of magnetic levitation systems. These estimators are based on operational calculus and implemented as FIR filters, resulting in a feasible real-time implementation. The algebraic method provides fast, non-asymptotic state estimation. For magnetic levitation systems, the algebraic estimators may replace the standard asymptotic observers while assuring very good performance and robustness. To validate the estimators as observers in closed-loop control, several nonlinear controllers are proposed and implemented on an experimental magnetic levitation prototype. The results show excellent performance of the proposed control laws together with the algebraic estimators.
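The key implementation idea above is that the derivative estimate reduces to an FIR filter: a fixed set of weights dotted with a sliding window of samples, giving a non-asymptotic estimate with no observer dynamics to converge. As a stand-in for the operational-calculus kernel of the thesis, the sketch below uses least-squares linear-fit weights over the window, which shares the same FIR structure.

```python
import numpy as np

# FIR derivative estimation over a sliding window. The least-squares slope
# kernel below stands in for the algebraic (operational-calculus) kernel;
# both yield a fixed FIR filter applied to the sampled signal.
def fir_derivative_kernel(window, dt):
    """FIR weights whose dot product with a signal window estimates its slope."""
    t = np.arange(window) * dt
    t = t - t.mean()                  # center the time axis
    return t / np.dot(t, t)          # least-squares slope weights

def estimate_derivative(signal, dt, window=21):
    """Sliding-window FIR derivative estimate (valid region only)."""
    h = fir_derivative_kernel(window, dt)
    # Reverse the kernel so np.convolve computes a correlation with h.
    return np.convolve(signal, h[::-1], mode="valid")

dt = 0.001
t = np.arange(0, 1, dt)
x = 3.0 * t + 0.5                    # a ramp with slope 3
dx = estimate_derivative(x, dt)      # recovers the slope at every valid sample
```

Because the estimate is a finite window dot product, each output needs only `window` multiply-adds, which is what makes the real-time implementation on control hardware feasible.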
Abstract:
Plant vacuoles are multi-functional, developmentally varied and can occupy up to 90% of plant cells. The N-terminal propeptide (NTPP) of sweet potato sporamin and the C-terminal propeptide (CTPP) of tobacco chitinase have been developed as models to target some heterologous proteins to vacuoles but so far tested on only a few plant species, vacuole types and payload proteins. Most studies have focused on lytic and protein-storage vacuoles, which may differ substantially from the sugar-storage vacuoles in crops like sugarcane. Our results extend the evidence that NTPP of sporamin can direct heterologous proteins to vacuoles in diverse plant species and indicate that sugarcane sucrose-storage vacuoles (like the lytic vacuoles in other plant species) are hostile to heterologous proteins. A low level of cytosolic NTPP-GFP (green fluorescent protein) was detectable in most cell types in sugarcane and Arabidopsis, but only Arabidopsis mature leaf mesophyll cells accumulated NTPP-GFP to detectable levels in vacuoles. Unexpectedly, efficient developmental mis-trafficking of NTPP-GFP to chloroplasts was found in young leaf mesophyll cells of both species. Vacuolar targeting by tobacco chitinase CTPP was inefficient in sugarcane, leaving substantial cytoplasmic activity of rat lysosomal beta-glucuronidase (GUS) [ER (endoplasmic reticulum)-RGUS-CTPP]. Sporamin NTPP is a promising targeting signal for studies of vacuolar function and for metabolic engineering. Such applications must take account of the efficient developmental mis-targeting by the signal and the instability of most introduced proteins, even in storage vacuoles.
Abstract:
Advances in three-dimensional (3-D) electron microscopy (EM) and image processing are providing considerable improvements in the resolution of subcellular volumes, macromolecular assemblies and individual proteins. However, the recovery of high-frequency information from biological samples is hindered by specimen sensitivity to beam damage. Low-dose electron cryo-microscopy conditions afford reduced beam damage but typically yield images with reduced contrast and low signal-to-noise ratios (SNRs). Here, we describe the properties of a new discriminative bilateral (DBL) filter that is based upon the bilateral filter implementation of Jiang et al. (Jiang, W., Baker, M.L., Wu, Q., Bajaj, C., Chiu, W., 2003. Applications of a bilateral denoising filter in biological electron microscopy. J. Struct. Biol. 128, 82-97.). In contrast to the latter, the DBL filter can distinguish between object edges and high-frequency noise pixels through the use of an additional photometric exclusion function. As a result, high-frequency noise pixels are smoothed, yet object edge detail is preserved. In the present study, we show that the DBL filter effectively reduces noise in low-SNR single particle data as well as in cellular tomograms of stained plastic sections. The properties of the DBL filter are discussed in terms of its usefulness for single particle analysis and for pre-processing cellular tomograms ahead of image segmentation.
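The edge-preserving behaviour described above comes from the bilateral filter's two weight terms: a spatial Gaussian and a photometric Gaussian on intensity differences, so pixels across a sharp edge receive almost no weight. The 1-D sketch below shows that base mechanism; the DBL filter's additional photometric exclusion function, which separates edges from isolated noise pixels, is not reproduced here.

```python
import numpy as np

# Minimal 1-D bilateral filter: spatial Gaussian weights multiplied by
# photometric (intensity-difference) weights, so steps are preserved
# while small-amplitude noise is smoothed.
def bilateral_1d(x, radius=3, sigma_s=2.0, sigma_r=0.2):
    pad = np.pad(x, radius, mode="edge")
    offs = np.arange(-radius, radius + 1)
    w_s = np.exp(-(offs**2) / (2 * sigma_s**2))       # spatial weights
    out = np.empty(len(x), dtype=float)
    for i in range(len(x)):
        window = pad[i:i + 2 * radius + 1]
        w_r = np.exp(-((window - x[i])**2) / (2 * sigma_r**2))  # photometric
        w = w_s * w_r
        out[i] = np.dot(w, window) / w.sum()
    return out

step = np.concatenate([np.zeros(20), np.ones(20)])
smoothed = bilateral_1d(step)
# The step edge survives: outputs stay near 0 on the left, near 1 on the right.
```

Shrinking `sigma_r` makes the photometric term stricter (sharper edges, less smoothing); growing it recovers an ordinary Gaussian blur, which is the trade-off the DBL filter's exclusion function is designed to improve.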
Abstract:
This paper investigates the performance analysis of the separation of mutually independent sources in nonlinear models. Nonlinear mappings in which an unknown linear mixture is followed by an unknown, invertible nonlinear distortion are found in many signal processing settings. In general, blind separation of sources from their nonlinear mixtures is rather difficult. We propose using a kernel density estimator, combined with equivariant gradient analysis, to separate sources subject to nonlinear distortion. The parameters of the kernel density estimator are iteratively updated to minimize the output dependence, expressed as a mutual information criterion. The equivariant gradient algorithm takes the form of a nonlinear decorrelation, which is used for the convergence analysis. Experiments illustrate these results.
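The building block the method iterates on is a kernel density estimate of each output's distribution, from which the mutual-information criterion is evaluated. A minimal Gaussian-kernel sketch follows; the bandwidth rule (Silverman's) is an assumption for illustration, not necessarily the paper's choice, and the full equivariant update loop is omitted.

```python
import numpy as np

# Gaussian kernel density estimate of a 1-D distribution from samples,
# with Silverman's rule-of-thumb bandwidth (an assumed choice).
def kde_pdf(samples, points):
    """Estimate the pdf at `points` from the given 1-D `samples`."""
    n = samples.size
    h = 1.06 * samples.std() * n ** (-1 / 5)          # Silverman bandwidth
    diffs = (points[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * diffs**2).sum(axis=1) / (n * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
s = rng.standard_normal(5000)
p = kde_pdf(s, np.array([0.0]))
# For standard-normal samples, the estimate at 0 is near 1/sqrt(2*pi) ~ 0.399.
```

In the separation context, such density estimates of the separator outputs feed the mutual-information cost whose equivariant gradient drives the parameter updates.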