894 results for Nmr Phased-array
Abstract:
Contraction of cardiac muscle is regulated through the Ca2+-dependent protein-protein interactions of the troponin complex (Tn). The critical role cardiac troponin C (cTnC) plays as the Ca2+ receptor in this complex makes it an attractive target for positive inotropic compounds. In this study, the ten Met methyl groups in cTnC, [98% 13C ϵ]-Met cTnC, are used as structural markers to monitor conformational changes in cTnC and to identify the sites of interaction between cTnC and cardiac troponin I (cTnI) responsible for the Ca2+-dependent interactions. In addition, the structural consequences that a number of Ca2+-sensitizing compounds have on free cTnC and on the cTnC·cTnI complex were characterized. Using heteronuclear NMR experiments and monitoring chemical shift changes in the ten Met methyl 1H-13C correlations in 3Ca2+ cTnC when bound to cTnI revealed an anti-parallel arrangement of the two proteins, such that the N-domain of cTnI interacts with the C-domain of cTnC. The large chemical shifts in Mets-81, -120, and -157 identified points of contact between the proteins that include the C-domain hydrophobic surface of cTnC and the A, B, and D helical interface located in the regulatory N-domain of cTnC. TnI association [cTnI(33–80), cTnI(86–211), or cTnI(33–211)] was also found to dramatically reduce flexibility in the D/E central linker of cTnC, as monitored by line broadening in the Met 1H-13C correlations of cTnC induced by a nitroxide spin label, MTSSL, covalently attached to cTnC at Cys 84. TnI association resulted in an extended cTnC, unlike the compact structure observed for free cTnC. The Met 1H-13C correlations also allowed the binding characteristics of bepridil, TFP, levosimendan, and EMD 57033 to the apo, 2Ca2+, and Ca2+-saturated forms of cTnC to be determined. In addition, the location of drug binding on the 3Ca2+ cTnC·cTnI complex was identified for bepridil and TFP. Use of a novel spin-labeled phenothiazine, and detection of isotope-filtered NOEs, allowed identification of drug binding sites in the shallow hydrophobic cup of the C-terminal domain and on two hydrophobic surfaces of the N-regulatory domain in free 3Ca2+ cTnC. In contrast, only one N-domain drug binding site exists in the 3Ca2+ cTnC·cTnI complex. The methyl groups of Met 45, 60, and 80, which are grouped in a hydrophobic patch near site II in cTnC, showed the greatest change upon titration with bepridil or TFP, suggesting that this is a critical site of drug binding both in free cTnC and when associated with cTnI. The strongest NOEs were seen for Met-60 and -80, which are located on helices C and D, respectively, of Ca2+ binding site II. These results support the conclusion that the small hydrophobic patch that includes Met-45, -60, and -80 constitutes a drug binding site, and that binding drugs to this site will lead to an increase in the Ca2+ binding affinity of site II while preserving maximal cTnC activity. Thus, this subregion of cTnC is a likely target against which to design new and selective Ca2+-sensitizing compounds.
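The chemical-shift-based mapping described above is usually quantified as a combined 1H/13C perturbation per Met methyl. Below is a minimal sketch of that calculation; the peak positions and the 13C scaling factor of 0.25 are illustrative assumptions, not values from this study.

```python
# Hedged sketch: combined 1H/13C chemical shift perturbation (CSP) for Met methyl
# correlations, comparing free 3Ca2+ cTnC with the cTnC·cTnI complex.
# Peak positions and the 0.25 scaling factor are illustrative assumptions.
import math

# (1H ppm, 13C ppm) for each Met methyl, free protein vs. complex (hypothetical numbers)
free_shifts    = {"Met81": (1.95, 17.2), "Met120": (2.10, 16.8), "Met157": (1.88, 17.5)}
complex_shifts = {"Met81": (2.12, 18.0), "Met120": (1.92, 16.1), "Met157": (2.05, 18.3)}

def csp(dh, dc, alpha=0.25):
    """Combined CSP: sqrt(d1H^2 + (alpha * d13C)^2)."""
    return math.sqrt(dh**2 + (alpha * dc)**2)

for met, (h0, c0) in free_shifts.items():
    h1, c1 = complex_shifts[met]
    print(f"{met}: CSP = {csp(h1 - h0, c1 - c0):.3f} ppm")
```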
Abstract:
An unusual polyketide with a new carbon skeleton, lindgomycin (1), and the recently described ascosetin (2) were extracted from mycelia and culture broth of different Lindgomycetaceae strains, which were isolated from a sponge of the Kiel Fjord in the Baltic Sea (Germany) and from the Antarctic. Their structures were established by spectroscopic means. In the new polyketide, two distinct domains, a bicyclic hydrocarbon and a tetramic acid, are connected by a bridging carbonyl. The tetramic acid substructure of compound 1 was shown to possess a unique 5-benzylpyrrolidine-2,4-dione unit. The combination of the 5-benzylpyrrolidine-2,4-dione in its tetramic acid half and the 3-methylbut-3-enoic acid pendant in its decalin half allows the assignment of a new carbon skeleton. The new compound 1 and ascosetin showed antibiotic activities, with IC50 values of 5.1 (±0.2) µM and 3.2 (±0.4) µM, respectively, against methicillin-resistant Staphylococcus aureus.
Abstract:
Human activities are fundamentally altering the chemistry of the world's oceans. Ocean acidification (OA) is occurring against a background of warming and an increasing occurrence of disease outbreaks, posing a significant threat to marine organisms, communities, and ecosystems. In the current study, 1H NMR spectroscopy was used to investigate the response of the blue mussel, Mytilus edulis, to a 90-day exposure to reduced seawater pH and increased temperature, followed by a subsequent pathogenic challenge. Analysis of the metabolome revealed significant differences between male and female organisms. Furthermore, males and females are shown to respond differently to environmental stress. While males were significantly affected by reduced seawater pH, increased temperature, and a bacterial challenge, it was only a reduction in seawater pH that impacted females. Despite impacting males and females differently, stressors seem to act via a generalized stress response affecting both energy metabolism and osmotic balance in both sexes. This study therefore has important implications for the interpretation of metabolomic data in mussels, as well as for the impact of environmental stress in marine invertebrates in general.
Abstract:
A compact planar array with parasitic elements is studied for use in MIMO systems. Classical compact arrays suffer from high coupling, which degrades both correlation and matching efficiency. A proper matching network mitigates these drawbacks, although its bandwidth is low and it may increase the antenna size. The proposed antenna makes use of parasitic elements to improve both correlation and efficiency. Specific software based on the Method of Moments (MoM) has been developed to analyze radiating structures with several feed points. The array is optimized with a Genetic Algorithm to determine the positions of the parasitic elements so that different figures of merit are fulfilled. The proposed design provides the correlation and matching efficiency required for good performance over a significant bandwidth.
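The figures of merit mentioned, correlation and matching efficiency, are commonly evaluated directly from the array's S-parameters. The sketch below uses the standard two-port S-parameter envelope correlation formula (Blanch et al.) together with a simple matching-efficiency expression; the S-matrix values are placeholders, not results from this work.

```python
# Hedged sketch: envelope correlation coefficient (ECC) and matching efficiency
# of a two-port compact array from its S-parameters. Values are placeholders.
import numpy as np

# Example 2x2 S-matrix at one frequency (complex, linear scale) -- hypothetical
S = np.array([[0.20 + 0.05j, 0.35 - 0.10j],
              [0.35 - 0.10j, 0.22 + 0.03j]])

def envelope_correlation(S):
    """S-parameter ECC formula for lossless two-port antennas (Blanch et al.)."""
    num = abs(np.conj(S[0, 0]) * S[0, 1] + np.conj(S[1, 0]) * S[1, 1]) ** 2
    den = ((1 - abs(S[0, 0])**2 - abs(S[1, 0])**2) *
           (1 - abs(S[1, 1])**2 - abs(S[0, 1])**2))
    return num / den

def matching_efficiency(S, port=0):
    """Fraction of incident power accepted at one port: 1 - |Sii|^2 - |Sji|^2."""
    other = 1 - port
    return 1 - abs(S[port, port])**2 - abs(S[other, port])**2

print("ECC:", envelope_correlation(S))
print("Matching efficiency, port 1:", matching_efficiency(S, 0))
```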
Abstract:
A planar antenna is introduced that works as a portable system for X-band satellite communications. This antenna is low-profile and modular, with dimensions of 40 × 40 × 2.5 cm. It is composed of a square array of 144 printed circuit elements that covers a wide bandwidth (14.7%) for transmission and reception, along with dual and interchangeable circular polarization. A radiation efficiency above 50% is achieved by a low-loss stripline feeding network. This printed antenna has a 3 dB beamwidth of 5°, a maximum gain of 26 dBi, and an axial ratio under 1.9 dB over the entire frequency band. The complete design of the antenna is shown, and the measurements are compared with simulations, revealing very good agreement.
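As a rough cross-check of the quoted figures, standard aperture formulas relate the 40 × 40 cm aperture to the expected beamwidth and gain. The mid X-band frequency (7.5 GHz) and the uniform-illumination assumption in the sketch below are mine, not stated in the abstract.

```python
# Hedged sketch: aperture-based estimates for a 40 cm x 40 cm planar array.
# The 7.5 GHz mid-band frequency and uniform-illumination assumption are mine.
import math

c = 3e8
f = 7.5e9                 # assumed mid X-band frequency (Hz)
lam = c / f               # wavelength (m)
L = 0.40                  # aperture side (m)
eff = 0.50                # radiation efficiency quoted in the abstract

directivity = 4 * math.pi * L * L / lam**2          # uniform square aperture
gain_dBi = 10 * math.log10(eff * directivity)
beamwidth_deg = math.degrees(0.886 * lam / L)       # ~0.886*lambda/L for a uniform aperture

print(f"lambda = {lam*100:.1f} cm")
print(f"estimated gain ~ {gain_dBi:.1f} dBi (abstract: 26 dBi)")
print(f"estimated 3 dB beamwidth ~ {beamwidth_deg:.1f} deg (abstract: 5 deg)")
```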
Abstract:
This paper presents general solutions that can be adopted to control the phase between elements in an antenna array. Because digital phase shifter devices have become a strategic component, and because the U.S. Government has taken steps to control their export, their price has increased owing to the low supply in the market. Therefore, it is necessary to adopt solutions that still allow the design and construction of antenna arrays. A system based on a group of staggered phase shifts with external switching is shown, which can be extrapolated to a complete array.
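One way to read the proposed scheme is that a conventional digital phase shifter is replaced by a small bank of fixed, staggered phase-shift lines selected by an external switch. The sketch below only illustrates the phase quantization this implies for an assumed four-state bank; it is not the circuit described in the paper.

```python
# Hedged sketch: phase quantization when a digital phase shifter is replaced by
# an externally switched bank of fixed, staggered phase-shift lines.
# The 4-state bank (0/90/180/270 deg) is an assumption for illustration.
import numpy as np

bank = np.array([0.0, 90.0, 180.0, 270.0])   # assumed fixed phase states (deg)

def nearest_state(desired_deg):
    """Pick the switched state closest to the desired element phase."""
    err = (bank - desired_deg + 180.0) % 360.0 - 180.0   # wrapped phase error
    k = int(np.argmin(np.abs(err)))
    return bank[k], err[k]

# Desired progressive phase across a 4-element linear array steered off broadside
for n, desired in enumerate([0.0, 55.0, 110.0, 165.0]):
    state, error = nearest_state(desired)
    print(f"element {n}: want {desired:6.1f} deg -> switch to {state:5.1f} deg "
          f"(error {error:+5.1f} deg)")
```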
Abstract:
Nowadays, more and more base stations are equipped with active conformal antennas. These antenna designs combine phase-shift systems with multibeam networks, providing multi-beam capability and interference rejection, which optimizes multiple-channel systems. GEODA is a conformal adaptive antenna system designed for satellite communications. Operating at 1.7 GHz with circular polarization, it can track and communicate with several satellites at once thanks to its adaptive beam. The antenna is based on a set of similar triangular arrays that are divided into subarrays of three elements called 'cells'. Transmit/Receive (T/R) modules manage beam steering by shifting the phases. More accurate steering of the GEODA antenna could be achieved by using a multibeam network. Several multibeam network designs based on the Butler network are presented.
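The beam steering performed by the T/R modules amounts to giving each element of a cell the phase that compensates its path-length difference toward the desired direction. The sketch below assumes an equilateral three-element cell with 0.5 λ sides at 1.7 GHz; the geometry and spacing are illustrative, not the actual GEODA layout.

```python
# Hedged sketch: per-element T/R-module phases that steer a 3-element cell
# toward a desired direction (theta, phi). Equilateral geometry with 0.5-lambda
# sides is an assumption for illustration, not the actual GEODA cell.
import numpy as np

c = 3e8
f = 1.7e9
lam = c / f
k = 2 * np.pi / lam

d = 0.5 * lam                                   # assumed element spacing (triangle side)
# Equilateral triangle vertices in the array plane (x, y), centroid at the origin
r = d / np.sqrt(3.0)
elements = np.array([[r * np.cos(a), r * np.sin(a)]
                     for a in np.deg2rad([90.0, 210.0, 330.0])])

def steering_phases(theta_deg, phi_deg):
    """Phases (deg) applied by each T/R module to point the cell at (theta, phi)."""
    th, ph = np.deg2rad([theta_deg, phi_deg])
    u = np.array([np.sin(th) * np.cos(ph), np.sin(th) * np.sin(ph)])  # direction cosines
    return np.rad2deg(-k * elements @ u) % 360.0

print(steering_phases(30.0, 0.0))   # e.g. 30 deg off boresight in the x-z plane
```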
Abstract:
A multibeam antenna study based on the Butler network is undertaken in this document. These antenna designs combine phase-shift systems with multibeam networks to optimize multiple-channel systems. The system works at 1.7 GHz with circular polarization. Specifically, simulation results and measurements of a 3-element triangular subarray are shown. A 45-element triangular array is formed from these subarrays. By using triangular subarrays, side lobes and crossing points are reduced.
Abstract:
A structure with six inputs and three outputs, which can be used to obtain six simultaneous beams with a triangular array of 3 elements, is presented. The beam-forming network is obtained by combining balanced and unbalanced hybrid couplers and yields six main beams with sixty degrees of separation in the azimuth direction. Simulations and measurements showing the performance of the array, along with other detailed results, are presented.
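A quick way to see how a three-element triangular array can yield six beams separated by sixty degrees in azimuth is to evaluate its array factor for excitation phase sets rotated in 60° steps. The element spacing (0.5 λ), the elevation cut, and the phase-only excitations below are illustrative assumptions, not the measured network values.

```python
# Hedged sketch: azimuth array factor of a 3-element equilateral triangular array
# for six excitation phase sets rotated in 60-degree steps. Spacing (0.5 lambda),
# the elevation cut (theta = 40 deg), and the phases are illustrative assumptions.
import numpy as np

lam = 1.0
k = 2 * np.pi / lam
d = 0.5 * lam
r = d / np.sqrt(3.0)
elem = np.array([[r * np.cos(a), r * np.sin(a)]
                 for a in np.deg2rad([90.0, 210.0, 330.0])])   # (3, 2) positions

theta0 = np.deg2rad(40.0)          # fixed elevation of the beam peaks (assumed)
phi = np.deg2rad(np.arange(0.0, 360.0, 1.0))
u = np.stack([np.sin(theta0) * np.cos(phi), np.sin(theta0) * np.sin(phi)])  # (2, Nphi)

for beam in range(6):
    phi0 = np.deg2rad(60.0 * beam)                     # intended beam azimuth
    u0 = np.array([np.sin(theta0) * np.cos(phi0), np.sin(theta0) * np.sin(phi0)])
    w = np.exp(-1j * k * elem @ u0)                    # phase-only excitation
    af = np.abs(w @ np.exp(1j * k * elem @ u))         # array factor on the cut
    print(f"beam {beam}: peak at phi = {np.degrees(phi[np.argmax(af)]):5.1f} deg")
```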
Abstract:
A compact array of monopoles with a slotted ground plane is analyzed for use in MIMO systems. Compact arrays usually suffer from high coupling, which significantly degrades MIMO benefits. The main drawbacks can be solved with a matching network, although it tends to provide low bandwidth. The studied design is an array of monopoles with a slot in the ground plane. The slot shape is optimized with a Genetic Algorithm and in-house electromagnetic software based on the Method of Moments (MoM) in order to fulfill the main figures of merit over a significant bandwidth.
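The optimization loop described, a slot geometry encoded for a Genetic Algorithm and evaluated with in-house MoM software against the figures of merit, can be sketched generically as follows. The chromosome encoding, fitness weights, and the mom_solver stub are hypothetical placeholders, since the actual software is not described in detail.

```python
# Hedged sketch: generic GA loop of the kind described, where a binary chromosome
# encodes which cells of the ground-plane slot are etched and a MoM solver scores
# each candidate. The solver stub, encoding, and fitness weights are hypothetical.
import random

N_CELLS, POP, GENS = 32, 20, 10   # slot discretization and GA sizes (assumed)

def mom_solver(chromosome):
    """Stub standing in for the in-house MoM analysis; returns (|S11|, ECC)."""
    active = sum(chromosome)
    return 0.5 - 0.01 * active, 0.4 - 0.008 * active   # toy monotone model

def fitness(chromosome):
    s11, ecc = mom_solver(chromosome)
    return -(s11 + ecc)            # lower reflection and correlation are better

def evolve():
    pop = [[random.randint(0, 1) for _ in range(N_CELLS)] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP // 2]
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_CELLS)
            child = a[:cut] + b[cut:]                  # one-point crossover
            i = random.randrange(N_CELLS)
            child[i] ^= 1                              # single-bit mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("best slot pattern:", best, "fitness:", fitness(best))
```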
Abstract:
Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing programs as functions of their input data sizes, without actually having to execute the programs. While a powerful resource analysis framework for object-oriented programs existed before this thesis, advanced aspects concerning the efficiency, the accuracy, and the reliability of the results of the analysis still need to be investigated further. This thesis tackles this need from the following four different perspectives. (1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses that keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. In this thesis we present two extensions to this approach: the first extension is to consider array accesses (in addition to object fields), while the second extension focuses on handling cases for which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible. (2) The aim of incremental analysis is, given a program, its analysis results, and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without having to (re-)analyze fragments of code that are not affected by the changes. During software development, programs are permanently modified, but most analyzers still read and analyze the entire program at once in a non-incremental way. This thesis presents an incremental resource usage analysis which, after a change in the program is made, is able to reconstruct the upper bounds of all affected methods in an incremental way. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm that can be used by all the global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of the cost functions affected by the change. (3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. In this thesis we focus on the development of a formal framework for the verification of the resource guarantees obtained by the analyzers, instead of verifying the tools themselves. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code. COSTA derives upper bounds for Java programs, while KeY proves the validity of these bounds and provides a certificate. The main contribution of our work is to show that the proposed cooperation between the tools can be used to automatically produce verified resource guarantees.
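The multi-domain incremental fixed point of point (2) can be pictured as a worklist algorithm over the call graph that seeds only the changed methods and propagates to their callers until the summaries stabilize. The sketch below is a schematic reconstruction under that reading, with a made-up call graph and cost model; it is not the COSTA algorithm itself.

```python
# Hedged sketch: worklist-style incremental re-analysis over a call graph.
# Only methods whose summaries actually change trigger re-analysis of their
# callers; untouched parts of the program are never revisited. The call graph,
# cost model, and 'analyze' stub are illustrative, not the COSTA algorithm.
from collections import deque

call_graph = {                       # caller -> callees (hypothetical program)
    "main": ["parse", "run"],
    "run": ["step"],
    "parse": [],
    "step": [],
}
callers = {m: [c for c, cs in call_graph.items() if m in cs] for m in call_graph}

intrinsic = {"main": 10, "parse": 10, "run": 10, "step": 50}   # local cost of each method
summaries = {"main": 80, "parse": 10, "run": 60, "step": 50}   # previous analysis results

def analyze(method):
    """Stub cost analysis: intrinsic cost plus the summarized cost of the callees."""
    return intrinsic[method] + sum(summaries[c] for c in call_graph[method])

def reanalyze_incrementally(changed_methods):
    worklist = deque(changed_methods)
    while worklist:
        m = worklist.popleft()
        new = analyze(m)
        if new != summaries[m]:      # propagate only when a summary really changes
            summaries[m] = new
            worklist.extend(callers[m])
    return summaries

intrinsic["step"] = 90               # a source edit makes 'step' more expensive
print(reanalyze_incrementally(["step"]))   # note: 'parse' is never re-analyzed
```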
(4) Distribution and concurrency are mainstream today. Concurrent objects form a well-established model for distributed concurrent systems. In this model, objects are the concurrency units, and they communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the diverse distributed components separately. In this thesis we propose a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, can keep the cost of the diverse distributed components separate.
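The object-sensitive idea of point (4), attributing cost to the distributed component that actually executes it, can be illustrated by grouping the cost of asynchronous calls by the points-to set of their receivers. The points-to results and costs below are made up for illustration.

```python
# Hedged sketch: splitting the cost of asynchronous calls per abstract object,
# using (made-up) points-to information that maps each receiver variable to the
# abstract concurrent object(s) it may denote.
points_to = {"server": {"o1"}, "worker": {"o2"}, "alias": {"o2"}}     # receiver -> objects
async_calls = [("server", "handle", 40), ("worker", "compute", 25),
               ("alias", "compute", 25), ("server", "log", 5)]        # (receiver, method, cost)

cost_per_object = {}
for receiver, method, cost in async_calls:
    for obj in points_to[receiver]:
        # Conservatively charge the call to every object the receiver may point to
        cost_per_object[obj] = cost_per_object.get(obj, 0) + cost

print(cost_per_object)   # e.g. {'o1': 45, 'o2': 50}
```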
Abstract:
An array of cylindrical liquid-crystal (LC) microlenses has been designed and built, and a study of its electro-optical behavior has been carried out. The lenticular array is novel in terms of the materials used in its fabrication. Nickel has been used as the key material for implementing a high-resistivity electrode. The combination of the high-resistivity electrode with the LC (whose parallel impedance is high) gives rise to a reactive voltage divider that produces a hyperbolic voltage gradient from the center to the edge of each lens. This effect, together with the homogeneous alignment of the LC molecules, allows the generation of a refractive-index gradient, so that the device behaves as a GRIN (GRadient Refraction INdex) lens. To characterize its performance, its phase profile has been analyzed using interferometric methods and image processing. In addition, different angular contrast measurements have also been carried out.
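The hyperbolic voltage profile created by the high-resistivity electrode over the mostly capacitive LC layer is commonly modeled as a distributed-RC transmission line driven at the edges of each lenticular lens. In the sketch below, the sheet resistance, capacitance per unit area, drive frequency, and lens half-width are placeholder values, not the parameters of the fabricated device.

```python
# Hedged sketch: 1-D distributed-RC model of a high-resistivity electrode over
# the liquid-crystal layer, driven at both edges of one lenticular lens. All
# numerical values are placeholders, not the parameters of the real device.
import numpy as np

R_sq   = 3e8        # electrode sheet resistance (ohm/square), assumed
c_area = 1.4e-5     # LC capacitance per unit area (F/m^2), assumed
f      = 1e3        # drive frequency (Hz), assumed
a      = 500e-6     # lens half-width (m), assumed
V0     = 10.0       # voltage applied at the lens edges (V rms), assumed

omega = 2 * np.pi * f
# gamma^2 = j*omega*(R_sq/w)*(c_area*w) = j*omega*R_sq*c_area: strip width cancels
gamma = np.sqrt(1j * omega * R_sq * c_area)     # propagation constant (1/m)

x = np.linspace(-a, a, 11)                      # positions across one lens
V = V0 * np.cosh(gamma * x) / np.cosh(gamma * a)   # hyperbolic profile, max at edges

for xi, vi in zip(x, np.abs(V)):
    print(f"x = {xi*1e6:+7.1f} um   |V| = {vi:5.2f} V")
```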