Abstract:
The synthesis of zirconia-based ordered mesoporous structures for catalytic applications is a research area under development. These systems are also potential candidates as anodes in intermediate-temperature solid oxide fuel cells (IT-SOFC) owing to their enhanced surface area [1-4]. The structural features of mesoporous zirconia-ceria materials, in combination with their oxygen storage/release capacity (OSC), are crucial for various catalytic reactions. The direct use of hydrocarbons as fuel for the SOFC (instead of pure H2), without the need for reforming and purification reactors, can improve the global efficiency of these systems [4]. X-ray diffraction data showed that ZrO2-x%CeO2 samples with x>50 are formed by a larger fraction of the cubic phase (space group Fm3m), while for x<50 the major crystalline structure is the tetragonal phase (space group P42/nmc). The crystallite size of the cubic phase increases with increasing ceria content, while the tetragonal crystallite size decreases. After impregnation, Rietveld analysis showed a NiO content of around 60 wt.% for all samples. The lattice parameters of the tetragonal ZrO2 phase are lower for higher ZrO2 contents, while for all samples the cubic NiO and CeO2 parameters do not change. The calculated densities are higher for higher ceria content, as expected. The NiO crystallite size is similar (~20 nm) for all samples, compared with 55 nm for the NiO standard. Nitrogen adsorption experiments revealed a broader particle size distribution for higher CeO2 content. The specific surface area was around 35 m2/g for all samples, and the average pore diameter and pore volume increased with increasing ceria content. After NiO impregnation the pore size distribution was the same for all samples, with two pore sizes: one around 3 nm and a broader peak around 10 nm.
The specific surface area increased to approximately 45 m2/g for all samples, and the pore volume was also higher after impregnation, increasing with ceria content. These results point out that the impregnation of NiO improves the textural characteristics of the pristine material. Complementary TEM/EDS images show a homogeneous coating of NiO particles over the ZrO2-x%CeO2 support, indicating that these samples are excellent candidates for catalytic applications.
[1] D. Y. Zhao, J. Feng, Q. Huo, N. Melosh, G. H. Fredrickson, B. F. Chmelka, G. D. Stucky, Science 279 (1998) 548-552.
[2] C. Yu, Y. Yu, D. Zhao, Chem. Commun. (2000) 575-576.
[3] A. Trovarelli, M. Boaro, E. Rocchini, C. de Leitenburg, G. Dolcetti, J. Alloys Compd. 323-324 (2001) 584-591.
[4] S. Larrondo, M. A. Vidal, B. Irigoyen, A. F. Craievich, D. G. Lamas, I. O. Fábregas, et al., Catal. Today 107-108 (2005) 53-59.
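The abstract reports crystallite sizes (~20 nm for NiO) obtained from X-ray diffraction via Rietveld analysis. As a hedged back-of-envelope illustration only (not the refinement procedure actually used), the classical Scherrer relation, L = Kλ/(β cos θ), links a crystallite size of that order to the measured peak broadening; the peak position and width below are invented example values:

```python
import math

def scherrer_size(two_theta_deg, fwhm_deg, wavelength_nm=0.15406, k=0.9):
    """Estimate crystallite size (nm) from XRD peak broadening.

    two_theta_deg : peak position 2-theta in degrees (example value)
    fwhm_deg      : full width at half maximum in degrees (example value)
    wavelength_nm : X-ray wavelength (Cu K-alpha by default)
    k             : shape factor (~0.9 for roughly spherical crystallites)
    """
    theta = math.radians(two_theta_deg / 2.0)
    beta = math.radians(fwhm_deg)  # breadth approximated by the FWHM
    return k * wavelength_nm / (beta * math.cos(theta))

# A ~0.45 deg FWHM peak near 2-theta = 43 deg corresponds to a crystallite
# size of roughly 19 nm, the order of magnitude reported for NiO.
print(round(scherrer_size(43.0, 0.45), 1))
```

Instrumental broadening and strain contributions are neglected in this sketch; a full Rietveld refinement separates those effects.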
Abstract:
In the present work, the bioinformatics methods of homology modelling and molecular modelling were used to predict and analyse the three-dimensional structures of a wide variety of proteins. Experimental findings from laboratory studies were used to increase the accuracy of the homology models, and the modelling results were in turn used to propose new experiments. Based on the models generated and on known crystal structures from the Protein Data Bank (PDB), the structure-function relationship of several tyrosinases was investigated, including the tyrosinase of the bacterium Streptomyces as well as the tyrosinase of the house mouse. The comparative structural analyses of the tyrosinases yielded mechanisms for the monophenol hydroxylase activity of tyrosinases and for the import of copper ions into the active site. It could be demonstrated that blockage of the CuA site is indeed the reason for the different activities of tyrosinases and catechol oxidases. For the first time, with the mouse tyrosinase, a complete structural model of a mammalian tyrosinase was built that is able to explain the mechanisms of known albino mutations at the molecular level. The insights gained from the 3D model about the importance of particular amino acids for function were tested and confirmed by site-directed mutagenesis of recombinantly produced mouse tyrosinase. Furthermore, the structure of the tyrosinase of the spiny lobster Palinurus elephas was elucidated by a low-resolution 3D reconstruction from electron microscopy images. The second major topic comprises the structural analysis of the light-harvesting complexes LHCI-730 and LHCII. In the case of LHCII, the oligomerization state of the LHC molecules could be correlated with discrete conformations of the N-terminus.
Here too, a combination of homology modelling and an experimental method, electron spin resonance measurement, was employed. The change in the oligomerization state of LHCII controls the flow of energy to the photosystems PS I and PS II. Furthermore, a complete model of LHCI-730 was built in order to investigate the effects of site-directed mutagenesis on its dimerization behaviour. Based on this model, the interactions between the monomers Lhca1 and Lhca4 were evaluated and potential binding partners were identified.
Abstract:
The thesis can be divided into four parts and summarized as follows: (i) A continuous-flow synthesis procedure was investigated and developed that affords end-functional polymers by anionic polymerization and subsequent termination in a single reaction step and on a multigram scale. Furthermore, the introduction of not only a single hydroxyl group but multiple orthogonal functionalities at the chain terminus was achieved by using individually designed, functional epoxide-based end-capping reagents. (ii) In a further step, the respective polymers were used as macroinitiators to prepare in-chain-functionalized block copolymers and star polymers with intriguing novel structural and material properties. Thus, the second part of this thesis presents the use of end-functional polymers as precursors for the synthesis of amphiphilic, complex and in some cases unprecedented macromolecular architectures, such as miktoarm star polymers based on poly(vinyl pyridine), poly(vinyl ferrocene) and PEO. (iii) Building on these structures, the third part of this thesis presents a detailed investigation of the preparation of stimuli-responsive ultrathin polymer films using amphiphilic junction-point-reactive block copolymers. The single functionality at the block interface can be employed as an anchor group for covalent attachment to surfaces. Furthermore, the change of surface properties on applying different external stimuli was studied. (iv) An additional topic related to the oxyanionic polymerizations carried out in the context of this thesis was the investigation of the viscoelastic properties of different hyperbranched polyethers, inspired by the recent intense research activity in the field of biomedical applications of multifunctional hyperbranched materials.
Abstract:
The flipping of membrane-embedded lipids containing large, polar head groups is slow and energetically unfavourable, and is therefore catalysed by flippases, the mechanisms of which are unknown. A prominent example of a flipping reaction is the translocation of lipid-linked oligosaccharides that serve as donors in N-linked protein glycosylation. In Campylobacter jejuni, this process is catalysed by the ABC transporter PglK. Here we present a mechanism of PglK-catalysed lipid-linked oligosaccharide flipping based on crystal structures in distinct states, a newly devised in vitro flipping assay, and in vivo studies. PglK can adopt inward- and outward-facing conformations in vitro, but only outward-facing states are required for flipping. While the pyrophosphate-oligosaccharide head group of lipid-linked oligosaccharides enters the translocation cavity and interacts with positively charged side chains, the lipidic polyprenyl tail binds and activates the transporter but remains exposed to the lipid bilayer during the reaction. The proposed mechanism is distinct from the classical alternating-access model applied to other transporters.
Abstract:
Discriminating patients with a low risk of progression from those with lethal prostate cancer is one of the main challenges in prostate cancer management. Indeed, such discrimination is essential if we aim to avoid overtreatment in men with indolent disease and to improve survival in men with lethal disease. We review the current literature on the prognostic tools now available, their clinical role and their limitations in individualizing care. There is an urgent need to incorporate such genomic tools into new platform-based clinical trial structures to further develop and validate prognostic and predictive biomarkers and to provide prostate cancer patients with effective and cost-efficient access to new drugs in the setting of personalized treatment.
Abstract:
Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are thus formed to enable a biological function and are disassembled once the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, and the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the (better) design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of limited flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structures of the different constituent components within the assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques, such as cryo-electron microscopy (cryo-EM), are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic level of detail.
In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography, or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail needed for a direct atomic interpretation, i.e. one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. One therefore needs to consider additional information, for example structural data from other sources such as X-ray crystallography, in order to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds, such as tubular features which in general correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system; manuscript III describes this alternative. Three manuscripts are presented as part of this PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate an atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly. Finally, the third manuscript further develops the latter technique and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions.
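Integrating component structures with a cryo-EM map is typically driven by a density cross-correlation score. The following minimal sketch is illustrative only (it is not the implementation used in the dissertation or in Sculptor): point atoms are blurred into a map with Gaussians and compared with the experimental map by normalized cross-correlation.

```python
import numpy as np

def simulate_density(coords, shape, voxel=1.0, sigma=2.0):
    """Blur point atoms into a 3D grid with isotropic Gaussians."""
    grid = np.zeros(shape)
    zz, yy, xx = np.indices(shape)
    for x, y, z in coords:
        grid += np.exp(-(((xx - x / voxel) ** 2 +
                          (yy - y / voxel) ** 2 +
                          (zz - z / voxel) ** 2) / (2 * sigma ** 2)))
    return grid

def cross_correlation(map_a, map_b):
    """Normalized cross-correlation between two density maps."""
    a = map_a - map_a.mean()
    b = map_b - map_b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

# A model scored against a map simulated from itself correlates perfectly;
# any displacement of the atoms lowers the score, which is what a fitting
# routine exploits when searching over placements.
coords = [(8.0, 8.0, 8.0), (12.0, 10.0, 9.0)]
target = simulate_density(coords, (20, 20, 20))
print(round(cross_correlation(simulate_density(coords, (20, 20, 20)), target), 3))
```

A real fitting pipeline would also low-pass filter the simulated map to the experimental resolution before scoring.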
The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus, nor for the two component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting with the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The resulting atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal the different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces the evolutionary tabu search strategies applied to enable a multi-body registration. This technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote the proper exploration of the high-dimensional search space.
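As a rough illustration of the hybrid idea (a genetic algorithm combined with a tabu list), the following toy sketch optimizes the simultaneous placement of several components against a generic score. It is not the published algorithm: the candidate encoding, the parameters and the objective function are all invented for the example.

```python
import random

def evolutionary_tabu_search(score, n_components, positions, generations=200,
                             pop_size=30, tabu_size=50, seed=0):
    """Toy hybrid of a genetic algorithm with a tabu list.

    Each candidate assigns one position (from `positions`) to each of the
    n_components bodies; `score` evaluates the whole assignment at once, so
    the components are registered simultaneously rather than one by one.
    Recently visited candidates are kept in a tabu list and re-generated,
    pushing the search toward unexplored regions of the space.
    """
    rng = random.Random(seed)

    def random_candidate():
        return tuple(rng.choice(positions) for _ in range(n_components))

    population = [random_candidate() for _ in range(pop_size)]
    tabu = set()
    best = max(population, key=score)
    for _ in range(generations):
        parents = sorted(population, key=score, reverse=True)[:pop_size // 2]
        children = []
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_components) if n_components > 1 else 0
            child = list(a[:cut] + b[cut:])       # one-point crossover
            if rng.random() < 0.3:                # mutation
                child[rng.randrange(n_components)] = rng.choice(positions)
            child = tuple(child)
            if child in tabu:                     # tabu: reject revisits
                child = random_candidate()
            tabu.add(child)
            if len(tabu) > tabu_size:
                tabu.pop()
            children.append(child)
        population = children
        best = max([best] + population, key=score)
    return best

# Toy objective: two components must occupy the distinct target slots 3 and 7.
def toy_score(candidate):
    targets = (3, 7)
    return -sum(min(abs(p - t) for t in targets) for p in candidate) \
           - (5 if candidate[0] == candidate[1] else 0)

print(evolutionary_tabu_search(toy_score, 2, list(range(10))))
```

In the real method the candidates encode rigid-body rotations and translations of the component structures, and the score is the correlation with the cryo-EM map.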
Similar to the case of the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures have been solved for the individual components but not for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such a registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling proper docking into the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions up to 40 Å. The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions. In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data, being visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail at which alpha helices are visible. Up to a resolution of 12 Å, the method achieves sensitivities between 70-100% as estimated in experimental test cases, i.e.
70-100% of the alpha-helices were correctly and automatically predicted in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with the annotation of consistent patterns at high resolution. Such methods are essential for the modeling of cryo-EM data, and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
Abstract:
The study of materials, especially biological ones, by non-destructive means is acquiring growing importance in both scientific and industrial applications. The economic advantages of non-destructive methods are manifold. Numerous physical procedures exist that can extract detailed information from the surface of wood with little or no prior treatment and minimal intrusion into the material. Among these, optical and acoustic techniques stand out for their great versatility, relative simplicity and low cost. This thesis aims to establish, from the application of simple physical principles of direct surface measurement, and through the development of the most suitable statistics-based decision algorithms, simple and essentially minimum-cost technological solutions for possible application in determining the species and the surface defects of each wood sample, while attempting, as far as possible, not to alter its working geometry. The three analyses developed are the following: The first optical method uses the properties of the light scattered by the wood surface when it is illuminated by a diffuse laser. This scattering produces a luminous speckle pattern whose statistical properties allow very precise properties of both the microscopic and macroscopic structure of the wood to be extracted. The analysis of the spectral properties of the scattered laser light generates more or less regular patterns related to the anatomical structure, composition, processing and surface texture of the wood under study, which reveal characteristics of the material or of the quality of the processes to which it has been subjected. The use of this type of laser also makes it possible to monitor industrial processes in real time and remotely, without interfering with other sensors.
The second optical technique employs the statistical and mathematical study of the properties of digital images of the wood surface obtained with a high-resolution scanner. After isolating the most relevant details of the images, various automatic classification algorithms generate databases with the wood species to which the images belong, together with the error margins of those classifications. A fundamental part of the classification tools is based on the precise study of the colour bands of the various woods. Finally, a number of acoustic techniques, such as the analysis of pulses generated by acoustic impact, complement and refine the results obtained with the optical methods described, identifying surface and deep structures in the wood as well as pathologies or deformations, aspects of particular utility in structural uses of wood. The usefulness of these techniques is well proven in industry, even though their application is not yet sufficiently widespread owing to their high costs and the lack of standardization of the processes, which means that each analysis cannot be compared with its theoretical market equivalent. At present, a large part of the research effort tends to take for granted that discrimination between species is a recognition mechanism inherent to human beings, and concentrates the technology on the definition of physical parameters (moduli of elasticity, electrical or acoustic conductivity, etc.), using very costly devices that are in many cases complex to apply in the field. Abstract The study of materials, especially the biological ones, by non-destructive techniques is becoming increasingly important in both scientific and industrial applications.
The economic advantages of non-destructive methods are multiple and clear, given the costs and resources involved. There are many physical processes capable of extracting detailed information from the wood surface with little or no previous treatment and minimal intrusion into the material. Among the various methods, acoustic and optical techniques stand out for their great versatility, relative simplicity and low cost. This thesis aims to establish, from the application of simple principles of physics and direct surface measurement, and through the development of the most appropriate statistics-based decision algorithms, simple, minimum-cost technological solutions for possible application in determining the species and the surface defects of each wood sample. Achieving reasonable accuracy without altering the samples' working geometry or properties is the main objective. There are three different lines of work: Empirical characterization of wood surfaces by means of iterative autocorrelation of laser speckle patterns: A simple and inexpensive method for the qualitative characterization of wood surfaces is presented. It is based on the iterative autocorrelation of laser speckle patterns produced by diffuse laser illumination of the wood surfaces. The method exploits the high spatial frequency content of speckle images; a similar approach with raw conventional photographs taken under ordinary light would be very difficult. A few iterations of the algorithm are necessary, typically three or four, in order to visualize the most important periodic features of the surface. The processed patterns help in the study of surface parameters, in the design of new scattering models and in the classification of wood species. Fractal-based image enhancement techniques inspired by differential interference contrast microscopy: Differential interference contrast microscopy is a very powerful optical technique for microscopic imaging.
Inspired by the physics of this type of microscope, we have developed a series of image processing algorithms aimed at the magnification, noise reduction, contrast enhancement and tissue analysis of biological samples. These algorithms use fractal convolution schemes which provide fast and accurate results, with a performance comparable to the best current image enhancement algorithms. These techniques can be used as post-processing tools for advanced microscopy or as a means to improve the performance of less expensive visualization instruments. Several examples are provided of the use of these algorithms to visualize microscopic images of raw pine wood samples acquired with a simple desktop scanner. Wood species identification using stress-wave analysis in the audible range: Stress-wave analysis is a powerful and flexible technique for studying the mechanical properties of many materials. We present a simple technique to obtain information about the species of wood samples using stress-wave sounds in the audible range generated by collision with a small pendulum. Stress-wave analysis has been used for flaw detection and quality control for decades, but its use for material identification and classification is less cited in the literature. Accurate wood species identification is a time-consuming task for highly trained human experts. For this reason, the development of cost-effective techniques for automatic wood classification is a desirable goal. Our proposed approach is fully non-invasive and non-destructive, significantly reducing the cost and complexity of the identification and classification process.
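The iterative speckle autocorrelation described above can be sketched as follows. This is an illustrative reconstruction of the idea (FFT-based autocorrelation applied three times), not the authors' exact algorithm, and the synthetic stripe pattern merely stands in for periodic wood grain under speckle noise:

```python
import numpy as np

def autocorrelate(img):
    """2-D autocorrelation via the Wiener-Khinchin theorem (inverse FFT of
    the power spectrum), shifted to the centre and normalized to [0, 1]."""
    f = np.fft.fft2(img - img.mean())
    ac = np.real(np.fft.ifft2(f * np.conj(f)))
    ac = np.fft.fftshift(ac)
    ac -= ac.min()
    return ac / ac.max()

def iterative_autocorrelation(img, iterations=3):
    """Repeatedly autocorrelate; periodic structure survives the iterations
    while uncorrelated speckle noise is progressively damped."""
    out = img.astype(float)
    for _ in range(iterations):
        out = autocorrelate(out)
    return out

# Synthetic 'speckle' over a periodic (growth-ring-like) stripe pattern.
rng = np.random.default_rng(0)
y = np.arange(128)
stripes = np.sin(2 * np.pi * y / 16.0)[:, None] * np.ones((1, 128))
speckle = stripes + rng.normal(0, 1.0, (128, 128))
result = iterative_autocorrelation(speckle, iterations=3)
# The dominant spatial frequency of the stripes is preserved in the result,
# while the additive noise floor is suppressed.
```

Each autocorrelation squares the power spectrum, so after three iterations the periodic component dominates overwhelmingly, which is why only three or four iterations are needed.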
Abstract:
This Thesis presents two related lines of research contributing to the areas of Human-Technology (or Machine) Interaction (HTI, or HMI), computational linguistics, and user experience evaluation. The two lines in question are the design and the user-centred evaluation of advanced Human-Machine Interaction systems. The first part of the Thesis (Chapters 2 to 4) addresses fundamental questions in the design of advanced HMI systems. Chapter 2 presents an overview of the state of the art of research in the field of multimodal conversational systems, which frames the research work presented in the rest of the Thesis. Chapters 3 and 4 focus on two major aspects of HMI design: a generalized dialogue manager for context-aware multimodal Human-Machine Interaction, and the use of embodied conversational agents (ECAs) to improve dialogue robustness, respectively. Chapter 3, on dialogue management, deals with the handling of heterogeneous information coming from the communication modalities and from external sensors. In this chapter an architecture is proposed, at a high level of abstraction, for dialogue management with heterogeneous flows of information, relying on the use of State Chart XML. Chapter 4 presents a contribution to the internal representation of communicative intentions and their translation into sequences of gestures to be executed by an ECA, designed specifically to improve robustness in critical dialogue situations that may arise, for example, when understanding errors occur in the communication between the human user and the machine. An extension of the Functional Mark-up Language defined in the SAIBA conceptual framework is proposed here.
This extension makes it possible to represent communicative acts that realize intentions of the sender (the machine) which are not meant to be consciously perceived by the receiver (the human user), but which are intended to influence the receiver and the course of the dialogue. This is achieved by means of an object called the Communication Intention Base (CIB). The representation in the CIB of "non-declared" intentions, in addition to explicit ones, allows the construction of communicative acts that realize several communicative intentions simultaneously. Chapter 4 also describes an experimental system for the (simulated) remote control of a home automation assistant, with speaker authentication to grant access, and with an ECA in the interface of each of these tasks. A description is included of the verbal and non-verbal behaviour sequences of the ECAs, which were designed specifically for certain situations with the aim of improving dialogue robustness. Chapters 5 to 7 make up the part of the Thesis devoted to evaluation. Chapter 5 reviews relevant precedents in the literature on information technologies in general, and on spoken interaction systems in particular. The main precedents in the field of interaction evaluation on which the work presented in this Thesis has been built are the Technology Acceptance Model (TAM), the Subjective Assessment of Speech System Interfaces (SASSI) tool, and ITU-T Recommendation P.851. Chapter 6 describes an evaluation framework and methodology applied to the user experience with multimodal HMI systems. For this purpose a novel framework for the subjective evaluation of the quality of the user experience and its relation to user acceptance of the HMI technology was developed (the Subjective Quality Evaluation Framework).
This framework articulates a class structure of subjective factors related to user satisfaction with, and acceptance of, the proposed HMI technology. This structure, as proposed in the present Thesis, has two orthogonal dimensions. First, three broad classes of parameters related to user acceptance are identified: likeability (those having to do with the experience of use, without entering into judgements of usefulness), rejection (those which can only have a negative valence) and perceived usefulness. Second, this set of classes is replicated across different levels, or foci, of user perception. These include, at a minimum, a level of overall assessment of the system, levels corresponding to the tasks to be performed and the goals to be achieved, and an interface level (in the cases proposed in this Thesis, the interface is a dialogue system with or without an ECA). Chapter 7 presents an empirical evaluation of the system described in Chapter 4. The study builds on the aforementioned precedents in the literature, extended with parameters for the specific study of the animated agents (the ECAs), users' self-assessment of their emotions, and certain rejection factors (specifically, concern for privacy and security). The subjective quality evaluation framework proposed in the previous chapter is also evaluated. The factor analyses carried out reveal a parameter structure conceptually very close to the usefulness-likeability-rejection class division proposed in that framework, a result that lends the framework a degree of empirical validity. Analyses based on linear regressions reveal structures of dependence and interrelation between the subjective and objective parameters considered.
The central mediation effect described in the Technology Acceptance Model, of perceived usefulness on the dependence relation between intention to use and perceived ease of use, is confirmed in the study presented in this Thesis. Moreover, it was found that this structure of relations is strengthened, in the specific study presented here, if the variables considered are generalized to cover more broadly the likeability and usefulness categories contemplated in the subjective quality evaluation framework. It was also observed that the rejection factors appear as a component of their own in the factor analyses, and are furthermore distinguished by their behaviour: they moderate the relation between intention to use (the main indicator of user acceptance) and its strongest predictor, perceived usefulness. Results of lesser importance are also presented concerning the effects of the ECAs on the dialogue system interfaces and on the perception parameters and user judgements that play a role in shaping their acceptance of the technology. Although slightly better dialogue interaction performance is observed with ECAs, the subjective opinions are very similar between the two experimental groups (one interacting with a dialogue system with an ECA, and the other without). Among the small differences found between the two groups, the following stand out: in the experimental group without an ECA (that is, with a voice-only interface) a more direct effect of dialogue problems (for example, recognition errors) on the perception of robustness was observed, whereas the group with an ECA had a more positive emotional response when problems occurred. The ECAs seem initially to generate higher expectations regarding the capabilities of the system, and the users in this group declare themselves more self-confident in their interaction.
Finally, some evidence of social effects of the ECAs is observed: the perceived friendliness of the ECAs was correlated with increased security concerns. Likewise, users of the system with ECAs tended more to blame themselves, rather than the system, for any dialogue problems that arose, whereas a slight opposite tendency was observed among users of the voice-only system. ABSTRACT This Thesis presents two related lines of research work contributing to the general fields of Human-Technology (or Machine) Interaction (HTI, or HMI), computational linguistics, and user experience evaluation. These two lines are the design and user-focused evaluation of advanced Human-Machine (or Technology) Interaction systems. The first part of the Thesis (Chapters 2 to 4) is centred on advanced HMI system design. Chapter 2 provides a background overview of the state of research in multimodal conversational systems. This sets the stage for the research work presented in the rest of the Thesis. Chapters 3 and 4 focus on two major aspects of HMI design in detail: a generalised dialogue manager for context-aware multimodal HMI, and embodied conversational agents (ECAs, or animated agents) to improve dialogue robustness, respectively. Chapter 3, on dialogue management, deals with how to handle information heterogeneity, whether arising from the communication modalities or from external sensors. A highly abstracted architectural contribution based on State Chart XML is proposed. Chapter 4 presents a contribution for the internal representation of communication intentions and their translation into gestural sequences for an ECA, especially designed to improve robustness in critical dialogue situations such as when miscommunication occurs. We propose an extension of the functionality of Functional Mark-up Language, as envisaged in much of the work in the SAIBA framework.
Our extension allows the representation of communication acts that carry intentions that are not for the interlocutor to know of, but which are made to influence him or her as well as the flow of the dialogue itself. This is achieved through a design element we have called the Communication Intention Base. Such representation of "non-declared" intentions allows the construction of communication acts that carry several communication intentions simultaneously. Also in Chapter 4, an experimental system is described which allows (simulated) remote control of a home automation assistant, with biometric (speaker) authentication to grant access, featuring embodied conversational agents for each of the tasks. The discussion includes a description of the behavioural sequences for the ECAs, which were designed for specific dialogue situations with particular attention given to the objective of improving dialogue robustness. Chapters 5 to 7 form the evaluation part of the Thesis. Chapter 5 reviews evaluation approaches in the literature for information technologies, as well as in particular for speech-based interaction systems, that are useful precedents to the contributions of the present Thesis. The main evaluation precedents on which the work in this Thesis has built are the Technology Acceptance Model (TAM), the Subjective Assessment of Speech System Interfaces (SASSI) tool, and ITU-T Recommendation P.851. Chapter 6 presents the author's work in establishing an evaluation framework and methodology applied to the users' experience with multimodal HMI systems. A novel user-acceptance Subjective Quality Evaluation Framework was developed by the author specifically for this purpose. A class structure arises from two orthogonal sets of dimensions.
First we identify three broad classes of parameters related with user acceptance: likeability factors (those that have to do with the experience of using the system), rejection factors (which can only have a negative valence) and perception of usefulness. Secondly, the class structure is further broken down into several "user perception levels"; at the very least: an overall system-assessment level, task and goal-related levels, and an interface level (e.g., a dialogue system with or without an ECA). An empirical evaluation of the system described in Chapter 4 is presented in Chapter 7. The study was based on the abovementioned precedents in the literature, expanded with categories covering the inclusion of an ECA, the users' self-assessed emotions, and particular rejection factors (privacy and security concerns). The Subjective Quality Evaluation Framework proposed in the previous chapter was also scrutinised. Factor analyses revealed an item structure very much related conceptually to the usefulness-likeability-rejection class division introduced above, thus giving it some empirical weight. Regression-based analysis revealed structures of dependencies, paths of interrelations, between the subjective and objective parameters considered. The central mediation effect, in the Technology Acceptance Model, of perceived usefulness on the dependency relationship of intention-to-use with perceived ease of use was confirmed in this study. Furthermore, the pattern of relationships was stronger for variables covering more broadly the likeability and usefulness categories in the Subjective Quality Evaluation Framework. Rejection factors were found to have a distinct presence as components in factor analyses, as well as distinct behaviour: they were found to moderate the relationship between intention-to-use (the main measure of user acceptance) and its strongest predictor, perceived usefulness.
Insights of secondary importance are also given regarding the effect of ECAs on the interface of spoken dialogue systems and the dimensions of user perception and judgement attitude that may have a role in determining user acceptance of the technology. Despite observing slightly better performance values in the case of the system with the ECA, subjective opinions regarding both systems were, overall, very similar. Minor differences between the two experimental groups (one interacting with an ECA, the other only through speech) include a more direct effect of dialogue problems (e.g., non-understandings) on perceived dialogue robustness for the voice-only interface test group, and a more positive emotional response for the ECA test group. Our findings further suggest that the ECA generates higher initial expectations, and users seem slightly more confident in their interaction with the ECA than do those without it. Finally, mild evidence of social effects of ECAs was also found: the perceived friendliness of the ECA increased security concerns, and ECA users may tend to blame themselves rather than the system when dialogue problems are encountered, while the opposite may be true for voice-only users.
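The mediation and moderation analyses summarised in this abstract can be illustrated with a minimal regression sketch on synthetic data. Everything here is an illustrative assumption, not the thesis' data: the variable names (ease_of_use, usefulness, rejection, intention) stand in for questionnaire scores, and the coefficients are invented for the demonstration.

```python
import numpy as np

# Synthetic data mimicking the TAM-style relationships described above.
rng = np.random.default_rng(0)
n = 500

ease_of_use = rng.normal(size=n)
# Perceived usefulness partly driven by ease of use (the mediator path).
usefulness = 0.6 * ease_of_use + rng.normal(scale=0.5, size=n)
rejection = rng.normal(size=n)
# Intention depends on usefulness; rejection dampens that link (moderation).
intention = (0.8 * usefulness
             - 0.3 * rejection * usefulness
             + rng.normal(scale=0.3, size=n))

def ols(y, *xs):
    """Ordinary least squares with an intercept; returns coefficient vector."""
    X = np.column_stack([np.ones_like(y), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Mediation (Baron & Kenny style): the effect of ease of use on intention
# shrinks once the mediator (usefulness) enters the model.
total = ols(intention, ease_of_use)[1]
direct = ols(intention, ease_of_use, usefulness)[1]
print(f"total effect {total:.2f} vs direct effect {direct:.2f}")

# Moderation: a negative interaction coefficient means rejection weakens
# the usefulness -> intention relationship.
interaction = ols(intention, usefulness, rejection, rejection * usefulness)[3]
print(f"usefulness x rejection interaction: {interaction:.2f}")
```

With this setup the total effect is clearly larger than the direct effect (full mediation by usefulness) and the interaction term recovers the negative moderation, mirroring the qualitative pattern reported in the study.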
Resumo:
Apert syndrome (AS) is characterized by craniosynostosis (premature fusion of cranial sutures) and severe syndactyly of the hands and feet. Two activating mutations, Ser-252 → Trp and Pro-253 → Arg, in fibroblast growth factor receptor 2 (FGFR2) account for nearly all known cases of AS. To elucidate the mechanism by which these substitutions cause AS, we determined the crystal structures of these two FGFR2 mutants in complex with fibroblast growth factor 2 (FGF2). These structures demonstrate that both mutations introduce additional interactions between FGFR2 and FGF2, thereby augmenting FGFR2–FGF2 affinity. Moreover, based on these structures and sequence alignment of the FGF family, we propose that the Pro-253 → Arg mutation will indiscriminately increase the affinity of FGFR2 toward any FGF. In contrast, the Ser-252 → Trp mutation will selectively enhance the affinity of FGFR2 toward a limited subset of FGFs. These predictions are consistent with previous biochemical data describing the effects of AS mutations on FGF binding. Alterations in FGFR2 ligand affinity and specificity may allow inappropriate autocrine or paracrine activation of FGFR2. Furthermore, the distinct gain-of-function interactions observed in each crystal structure provide a model to explain the phenotypic variability among AS patients.
Resumo:
While the elegance and efficiency of enzymatic catalysis have long tempted chemists and biochemists with reductionist leanings to try to mimic the functions of natural enzymes in much smaller peptides, such efforts have only rarely produced catalysts with biologically interesting properties. However, the advent of genetic engineering and hybridoma technology and the discovery of catalytic RNA have led to new and very promising alternative means of biocatalyst development. Synthetic chemists have also had some success in creating nonpeptide catalysts with certain enzyme-like characteristics, although their rates and specificities are generally much poorer than those exhibited by the best novel biocatalysts based on natural structures. A comparison of the various approaches from theoretical and practical viewpoints is presented. It is suggested that, given our current level of understanding, the most fruitful methods may incorporate both iterative selection strategies and rationally chosen small perturbations, superimposed on frameworks designed by nature.
Resumo:
Very large combinatorial libraries of small molecules on solid supports can now be synthesized and each library element can be identified after synthesis by using chemical tags. These tag-encoded libraries are potentially useful in drug discovery, and, to test this utility directly, we have targeted carbonic anhydrase (carbonate dehydratase; carbonate hydro-lyase, EC 4.2.1.1) as a model. Two libraries consisting of a total of 7870 members were synthesized, and structure-activity relationships based on the structures predicted by the tags were derived. Subsequently, an active representative of each library was resynthesized (2-[N-(4-sulfamoylbenzoyl)-4'-aminocyclohexanespiro]-4-oxo-7 -hydroxy- 2,3-dihydrobenzopyran and [N-(4-sulfamoylbenzoyl)-L-leucyl]piperidine-3-carboxylic acid) and these compounds were shown to have nanomolar dissociation constants (15 and 4 nM, respectively). In addition, a focused sublibrary of 217 sulfamoylbenzamides was synthesized and revealed a clear, testable structure-activity relationship describing isozyme-selective carbonic anhydrase inhibitors.
Resumo:
On several occasions since 2001 Vladimir Putin has raised the concept of ‘Greater Europe’, a partly-integrated common space comprising mainly Russia and the European Union. This concept has never been recast into a detailed political programme. While it has been championed as ‘a Europe without dividing lines’, the concept would in practice permanently split Europe into two geopolitical blocs – the Western bloc of the European Union, with Germany in the dominant role, and the Eastern bloc, consisting of the emerging Eurasian Union, with Russia in a hegemonic position. In recent years Russia has undertaken a number of initiatives aimed at implementing some elements of the concept. However, most of these have failed to become reality. In this context, we should expect Russia’s policy to focus on implementing its priority project of Eurasian integration, based on the structures of the Customs Union/the Eurasian Union. The Greater Europe project, on the other hand, will be postponed until the time when, as Moscow believes, a weakened EU will be ready to accept Russian proposals.
Resumo:
The aim of this thesis was to investigate antibacterial agents for use in disinfectant formulation in conjunction with benzalkonium chloride (BKC), and, where possible, to synthesise novel agents based upon successful structures. Development of resistance to antibacterial agents following long-term exposure of P. aeruginosa to BKC was also investigated, examining cross-resistance to clinically relevant antibiotics and determining mechanisms of resistance. In this study over 50 compounds were examined for antibacterial action against P. aeruginosa, both alone and in conjunction with BKC. Successful compounds were used to design novel agents, based upon the acridine ring structure, some of which showed synergy with BKC. In 15 of the 16 strains exposed to increasing concentrations of BKC, resistance to the disinfectant arose. Strains PAO1 and OO14 were examined further, each showing stable BKC resistance and a slightly varying profile of cross-resistance. In strain PAO1, alterations in the fatty acids of the cytoplasmic membrane, increased expression of OprG, decreased susceptibility to EDTA as an outer membrane permeabilising agent, and an increase in the negativity of the cell surface charge were observed as cells became more resistant to BKC. In strain OO14, a decrease in whole-cell phosphatidylcholine content, a decrease in binding/uptake of BKC and an increase in cell surface hydrophobicity were observed as cells became more resistant to BKC. Resistance to tobramycin in strain OO14 was initially high, but fell as cells were adapted to BKC; this coincided with a quantitative reduction of plasmid DNA in the cells.
Resumo:
We present recent results on experimental micro-fabrication and numerical modeling of advanced photonic devices by means of direct writing by femtosecond laser. Transverse inscription geometry was routinely used to inscribe and modify photonic devices based on waveguiding structures. Typically, standard commercially available fibers were used as a template with a pre-fabricated waveguide. Using direct, point-by-point inscription by an infrared femtosecond laser, a range of fiber-based photonic devices was fabricated, including Fiber Bragg Gratings (FBG) and Long Period Gratings (LPG). Waveguides with a core of a couple of microns, periodic structures, and couplers have also been fabricated in planar geometry using the same method.
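For a point-by-point inscribed FBG like those described above, the writing-point spacing sets the grating period Λ, and the reflected wavelength follows the textbook Bragg condition λ_B = 2·n_eff·Λ. A minimal sketch of this relationship, assuming a typical effective index for standard telecom fiber (the numbers below are illustrative, not values from the abstract):

```python
# Bragg condition for a fiber Bragg grating: lambda_B = 2 * n_eff * Lambda.
# n_eff is an assumed typical value for the fundamental mode of standard
# single-mode telecom fiber; the period is chosen to target the C-band.
n_eff = 1.447        # effective refractive index of the guided mode (assumed)
period_um = 0.5356   # grating period Lambda in micrometres (assumed)

bragg_wavelength_um = 2 * n_eff * period_um
print(f"Bragg wavelength: {bragg_wavelength_um * 1000:.0f} nm")
```

A sub-micrometre period such as this reflects in first order near 1550 nm; LPGs, by contrast, use periods of hundreds of micrometres to couple the core mode to co-propagating cladding modes rather than to a back-reflected mode.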