803 results for common agent architecture design
Abstract:
Building facilities have become important infrastructures for modern productive plants dedicated to services. In this context, the control systems of intelligent buildings have evolved while their reliability has evidently improved. However, the occurrence of faults is inevitable in systems conceived, constructed and operated by humans. Thus, a practical approach to reducing the consequences of faults is found to be very useful. Yet, only a few publications address intelligent building modeling processes that take into consideration the occurrence of faults and how to manage their consequences. In light of the foregoing, a procedure is proposed for the modeling of intelligent building control systems, considering their functional specifications in normal operation and in the event of faults. The proposed procedure adopts the concepts of discrete event systems and holons, and explores Petri nets and their extensions so as to represent the structure and operation of control systems for intelligent buildings under normal and abnormal situations. (C) 2012 Elsevier B.V. All rights reserved.
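The procedure above represents control-system states and fault events with Petri nets. As a minimal sketch of the token-firing semantics such models rely on (the net, place names and transitions below are illustrative, not taken from the paper):

```python
# Minimal Petri-net simulator: places hold tokens, and a transition is
# enabled when every one of its input places holds at least one token.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)      # place -> token count
        self.transitions = {}             # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1          # consume one token per input place
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Illustrative net: a controller leaves 'normal' operation when a
# 'fault_detected' token appears, and returns to it after 'repair'.
net = PetriNet({"normal": 1, "fault_detected": 0, "degraded": 0})
net.add_transition("enter_degraded", ["normal", "fault_detected"], ["degraded"])
net.add_transition("repair", ["degraded"], ["normal"])
net.marking["fault_detected"] = 1   # a fault event occurs
net.fire("enter_degraded")
print(net.marking["degraded"])      # → 1
```

The same firing rule scales to the holonic extensions the paper mentions; only the net topology changes.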
Abstract:
Background Staphylococcus aureus is the most common agent of septic arthritis, a severe, rapidly progressive and destructive joint disease. Superantigens produced by S. aureus are considered the major arthritogenic factors. In this study, we compared the arthritogenic potential of five superantigen-producing staphylococcal strains. Methods Male C57BL/6 mice were intravenously infected with the ATCC 19095 SEC+, N315 ST5 TSST-1+, S-70 TSST-1+, ATCC 51650 TSST-1+ and ATCC 13565 SEA+ strains. Clinical parameters such as body weight, arthritis incidence and clinical score were evaluated daily. Joint histopathological analysis and spleen cytokine production were evaluated on the 14th day after infection. Results Weight loss was observed in all infected mice. ATCC 19095 SEC+, N315 ST5 TSST-1+ and S-70 TSST-1+ were arthritogenic, with the highest scores observed in ATCC 19095 SEC+ infected mice. Intermediate and lower clinical scores were observed in N315 ST5 TSST-1+ and S-70 TSST-1+ infected mice, respectively. The ATCC 13565 SEA+ strain caused the death of 85% of the animals after 48 h. Arthritis triggered by the ATCC 19095 SEC+ strain was characterized by accentuated synovial hyperplasia, inflammation, pannus formation, cartilage destruction and bone erosion. Similar joint alterations were found in N315 ST5 TSST-1+ infected mice, although they were strikingly milder. Only minor synovial proliferation and inflammation were triggered by the S-70 TSST-1+ strain. The lowest levels of TNF-α, IL-6 and IL-17 production in response to S. aureus stimulation were found in cultures from mice infected with the less arthritogenic strains (S-70 TSST-1+ and ATCC 51650 TSST-1+). The highest production of IL-17 was detected in mice infected with the most arthritogenic strains (ATCC 19095 SEC+ and N315 ST5 TSST-1+). Conclusions Together, these results demonstrate that S. aureus strains isolated from biological samples were able to induce a typical septic arthritis in mice.
These results also suggest that the variable arthritogenicity of these strains was, at least in part, related to their differential ability to induce IL-17 production.
Abstract:
Supramolecular architectures can be built up from a single molecular component (building block) through a network of organic or inorganic interactions, creating a new emergent condensed phase of matter, such as gels, liquid crystals and solid crystals. Furthermore, the generation of multicomponent supramolecular hybrid architectures, mixing organic and inorganic components, increases the complexity of the condensed aggregate, with functional properties useful for important areas of research such as materials science, medicine and nanotechnology. One may design a molecule that stores a recognition pattern and programs an informed self-organization process, enabling it to grow into a hierarchical architecture. From the molecular level to the supramolecular level, in a bottom-up fashion, it is possible to create a new emergent structure-function, where the system, as a whole, is open to its own environment to exchange energy, matter and information. “The emergent property of the whole assembly is superior to the sum of its single parts.” In this thesis I present new architectures and functional materials built through the self-assembly of guanosine, in the absence or in the presence of a cation, in solution and on surfaces. By appropriate manipulation of intermolecular non-covalent interactions, the spatial (structural) and temporal (dynamic) features of these supramolecular architectures are controlled. Guanosine G7 (5',3'-di-decanoyl-deoxyguanosine) is able to interconvert reversibly between a supramolecular polymer and a discrete octameric species by dynamic cation binding and release. Guanosine G16 (2',3'-O-isopropylidene-5'-O-decylguanosine) shows selective binding from a mixture of cations of different nature. Remarkably, reversibility, selectivity, adaptability and serendipity are shared features through which to appreciate the creativity of a molecular self-organization complex system in its multilevel-scale hierarchical growth.
The creativity - in a general sense, the creation of a new thing, a new way of thinking, a new functionality or a new structure - emerges from a cross-contamination of different disciplines such as biology, chemistry, physics, architecture, design, philosophy and the science of complexity.
Abstract:
Content Addressable Memory (CAM) is a special type of Complementary Metal-Oxide-Semiconductor (CMOS) storage element that allows for a parallel search operation on a memory stack, in addition to the read and write operations offered by a conventional SRAM storage array. In practice, it is often desirable to be able to store a “don’t care” state for faster search operation. However, commercially available CAM chips are forced to accomplish this functionality by including two binary memory storage elements per CAM cell, which wastes precious area and power resources. This research presents a novel CAM circuit that achieves the “don’t care” functionality with a single ternary memory storage element. Using the recent development of multiple-voltage-threshold (MVT) CMOS transistors, the functionality of the proposed circuit is validated and characteristics for performance, power consumption, noise immunity, and silicon area are presented. This work presents the following contributions to the field of CAM and ternary-valued logic:
• We present a novel Simple Ternary Inverter (STI) transistor geometry scheme for achieving ternary-valued functionality in existing SOI-CMOS 0.18 µm processes.
• We present a novel Ternary Content Addressable Memory based on Three-Valued Logic (3CAM) as a single-storage-element CAM cell with “don’t care” functionality.
• We explore the application of macro partitioning schemes to our proposed 3CAM array to observe the benefits and tradeoffs of architecture design in the context of power, delay, and area.
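The “don’t care” search semantics described above can be modeled in software: each ternary cell stores 0, 1, or X, and a stored word matches a search key when every non-X cell agrees with the corresponding key bit. A behavioral sketch of that matching logic (not the proposed circuit, just the function it computes; the table contents are illustrative):

```python
# Behavioral model of a ternary CAM: each stored word is a string over
# {'0', '1', 'X'} where 'X' is "don't care"; a search returns the
# addresses of all stored words matching the key in parallel.
def tcam_search(stored_words, key):
    """Return addresses of all stored words that match the search key."""
    def matches(word, k):
        return all(w == 'X' or w == b for w, b in zip(word, k))
    return [addr for addr, word in enumerate(stored_words) if matches(word, key)]

table = ["10X1", "0XX0", "1001"]
print(tcam_search(table, "1011"))  # → [0]   (bit 2 is don't-care in word 0)
print(tcam_search(table, "1001"))  # → [0, 2]
```

In hardware every row is compared simultaneously; the loop here only emulates that parallel match.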
Abstract:
To gain insights into the mechanisms of abrupt climate change within interglacials, we have examined the characteristics and spatial extent of a prominent, climatically induced vegetation setback during the Holsteinian interglacial (Marine Isotope Stage 11c). Based on analyses of pollen and varves of lake sediments from Dethlingen (northern Germany), this climatic oscillation, here termed the "Older Holsteinian Oscillation" (OHO), lasted 220 years. It can be subdivided into a 90-year-long decline of temperate tree taxa associated with an expansion of Pinus and herbs, and a 130-year-long recovery phase marked by the expansion of Betula and Alnus, and the subsequent recovery of temperate trees. The climate-induced nature of the OHO is corroborated by changes in diatom assemblages and δ18O measured on biogenic silica indicating an impact on the aquatic ecosystem of the Dethlingen paleolake. The OHO is widely documented in pollen records from Europe north of 50° latitude and is characterized by boreal climate conditions with cold winters from the British Isles to Poland, with a gradient of decreasing temperature and moisture availability, and increased continentality towards eastern Europe. This pattern points to a weakened influence of the westerlies and/or a stronger influence of the Siberian High. A comparison of the OHO with the 8.2 ka event of the Holocene reveals close similarities regarding the imprint on terrestrial ecosystems and the interglacial boundary conditions. Hence, in analogy to the 8.2 ka event, a transient, meltwater-induced slowdown of the North Atlantic Deep Water formation appears as a plausible trigger mechanism for the OHO. If correct, meltwater release into the North Atlantic may be a more common agent of abrupt climate change during interglacials than previously thought.
We conclude that meltwater-induced climate setbacks during interglacials preferentially occurred when low rates of summer insolation increase during the preceding terminations facilitated the persistence of large-scale continental ice-sheets well into the interglacials.
Abstract:
This paper introduces a theoretical model for developing integrated degree programmes through e-learning systems, as stipulated by a collaboration agreement signed by two universities. We have analysed several collaboration agreements between universities at the national, European, and transatlantic levels, as well as various e-learning frameworks. A conceptual model, a business model, and the architecture design are presented as part of the theoretical model. The paper presents a way of implementing e-learning systems as a tool to support inter-institutional degree collaborations, from the signing of the collaborative agreement to the implementation of the necessary services. In order to show how the theory can be tested, one sample scenario is presented.
Abstract:
Nowadays, technological development in the field of intelligent transportation systems (ITS) has made it possible to equip vehicles with various advanced driver assistance systems (ADAS), improving the experience and safety of the passengers, especially the driver. Most of these systems are intended to warn the driver about certain risk situations, such as involuntary lane departure or the proximity of obstacles on the road. However, there are also systems that go a step further and are able to cooperate with the driver in controlling the vehicle, or even relieve the driver of some tedious tasks. This last group includes electronic stability control (ESP - Electronic Stability Program), anti-lock braking systems (ABS - Anti-lock Braking System), cruise control (CC - Cruise Control) and the more recent assisted parking systems. Continuing along this line of development, the next step is the removal of the human driver, developing systems capable of driving a vehicle autonomously and with performance superior to that of a human driver. This work first presents a control architecture for vehicle automation. It is composed of different hardware and software components, grouped according to their main function. The architecture design builds on previous work by the AUTOPIA Program, although it introduces notable contributions in terms of the efficiency, robustness and scalability of the system. Going into more detail, we should highlight the development of a localization algorithm based on particle swarms.
It is conceived as a method for filtering and fusing the information obtained from the various sensors on board the vehicle, among them a GPS (Global Positioning System) receiver, inertial measurement units (IMU - Inertial Measurement Unit) and information taken directly from the sensors installed by the manufacturer, such as wheel speed and steering-wheel position. This method solves the localization problem, which is indispensable for the development of autonomous driving systems. Continuing the research, the feasibility of applying learning and adaptation techniques to the design of vehicle controllers has been studied. As a starting point, the Q-learning method is used to generate a lateral fuzzy controller without any prior knowledge. An on-line tuning method is then presented for adapting the longitudinal control to unpredictable disturbances of the environment, such as changes in road slope, wheel friction or occupant weight. Finally, the results obtained during an autonomous driving experiment on real roads are presented; it was carried out in June 2012 from the town of San Lorenzo de El Escorial to the facilities of the Center for Automation and Robotics (CAR) in Arganda del Rey. The main objective of this demonstration was to validate the performance, robustness and capability of the proposed architecture to tackle the problem of autonomous driving under conditions much closer to reality than those that can be achieved on test tracks. ABSTRACT Nowadays, the technological advances in the Intelligent Transportation Systems (ITS) field have led to the development of several driving assistance systems (ADAS).
These solutions are designed to improve the experience and safety of all the passengers, especially the driver. For most of these systems, the main goal is to warn drivers about unexpected circumstances leading to risk situations, such as involuntary lane departure or proximity to other vehicles. However, other ADAS go a step further, being able to cooperate with the driver in the control of the vehicle, or even override it on some tasks. Examples of this kind of system are the anti-lock braking system (ABS), cruise control (CC) and the recently commercialised assisted parking systems. Within this research line, the next step is the development of systems able to replace human drivers, improving the control and, therefore, the safety and reliability of the vehicles. First of all, this dissertation presents a control architecture design for autonomous driving. It is made up of several hardware and software components, grouped according to their main function. The design of this architecture is based on previous work carried out by the AUTOPIA Program, although notable improvements have been made regarding the efficiency, robustness and scalability of the system. Also noteworthy is the work on the development of a localization algorithm for vehicles. The proposal is based on the emulation of the behaviour of biological swarms, and its performance is similar to that of the well-known particle filters. The developed method combines information obtained from different sensors, including GPS, an inertial measurement unit (IMU), and data from the vehicle's original on-board sensors. Through this filtering algorithm the localization problem is properly managed, which is critical for the development of autonomous driving systems. The work also deals with fuzzy controller tuning, a very time-consuming task when done manually. An analysis of learning and adaptation techniques for the development of different controllers has been made.
First, Q-learning (a reinforcement learning method) has been applied to the generation of a lateral fuzzy controller from scratch. Subsequently, the development of an adaptation method for longitudinal control is presented. With this proposal, the final cruise controller is able to deal with unpredictable environmental disturbances, such as road slope, wheel friction or even the occupants' weight. As a testbed for the system, an autonomous driving experiment on real roads is presented. This experiment was carried out in June 2012, driving from San Lorenzo de El Escorial to the Center for Automation and Robotics (CAR) facilities in Arganda del Rey. The main goal of the demonstration was to validate the performance, robustness and viability of the proposed architecture in dealing with the problem of autonomous driving under more demanding conditions than those achieved on closed test tracks.
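The Q-learning approach mentioned for deriving a lateral controller from scratch rests on the standard temporal-difference update, Q(s,a) ← Q(s,a) + α[r + γ·max_a′ Q(s′,a′) − Q(s,a)]. A minimal tabular sketch (the states, actions and reward below are illustrative, not the thesis's actual controller):

```python
# Tabular Q-learning: state-action values are learned from rewards
# alone, with no prior model -- the property exploited above to build
# a lateral fuzzy controller without prior knowledge.
def q_update(Q, s, a, r, s_next, actions, alpha=0.1, gamma=0.9):
    """One temporal-difference update of Q(s, a)."""
    best_next = max(Q.get((s_next, a2), 0.0) for a2 in actions)
    old = Q.get((s, a), 0.0)
    Q[(s, a)] = old + alpha * (r + gamma * best_next - old)

# Toy lateral-control example: states are coarse lateral errors,
# actions are steering corrections (names are hypothetical).
actions = ["steer_left", "steer_right", "keep"]
Q = {}
q_update(Q, "right_of_lane", "steer_left", 1.0, "centered", actions)
print(Q[("right_of_lane", "steer_left")])  # → 0.1
```

Repeating such updates over many driving episodes drives Q toward values from which a greedy steering policy can be read off.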
Abstract:
The ability to generate entangled photon pairs over a broad wavelength range opens the door to the simultaneous distribution of entanglement to multiple users in a network by using centralized sources and flexible wavelength-division multiplexing schemes. Here, we show the design of a metropolitan optical network consisting of tree-type access networks, whereby entangled photon pairs are distributed to any pair of users, independently of their location. The network is constructed employing commercial off-the-shelf components and uses the existing infrastructure, which allows for moderate deployment costs. We further develop a channel plan and a network-architecture design to provide a direct optical path between any pair of users, thus allowing classical and one-way quantum communication, as well as entanglement distribution. This allows the simultaneous operation of multiple quantum information technologies. Finally, we present a more flexible backbone architecture that pushes away the load limitations of the original network design by extending its reach, number of users and capabilities.
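In broadband-source schemes of this kind, energy conservation correlates the two photons of a pair in wavelength channels placed symmetrically about the source's center wavelength, so a demultiplexer can route matched channels to user pairs. A toy sketch of that channel-pairing bookkeeping (the grid size and user names are invented for illustration; this is not the paper's channel plan):

```python
# Pair WDM channels symmetrically about the center of an n-channel
# grid: channel i is energy-matched with channel (n - 1 - i), so the
# users on those two ports share entangled photon pairs.
def pair_channels(user_pairs):
    """Assign each user pair a symmetric channel pair from a 2*len grid."""
    n = 2 * len(user_pairs)
    return {pair: (i, n - 1 - i) for i, pair in enumerate(user_pairs)}

links = pair_channels([("Alice", "Bob"), ("Carol", "Dave")])
print(links)  # → {('Alice', 'Bob'): (0, 3), ('Carol', 'Dave'): (1, 2)}
```

Adding user pairs simply widens the grid; every assigned pair stays symmetric about its center.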
Abstract:
Interview
Abstract:
Ponzano Children childcare centre. Ponzano Veneto (Treviso)
Abstract:
Aeroelastic phenomena are relatively frequent in modern civil constructions such as office buildings, airport terminals or factories. In this type of architecture, flexible structures subjected to wind action frequently appear, for example louvers formed by slats with different cross-sections. One of these is the Z-shaped profile, formed by a central web and two side wings. Galloping-type instabilities are determined in practice using the Glauert-Den Hartog criterion. This criterion requires an accurate prediction of the dependence of the aerodynamic coefficients on the angle of attack. This thesis presents a systematic study, by both experimental and numerical methods, of a complete family of Z-shaped profiles, which makes it possible to determine their regions of galloping instability. The numerical analyses have been validated with static wind-tunnel tests. For the numerical part, the DLR TAU code was used, a code widely employed in the European aeronautical industry. The thesis focuses above all on the prediction of galloping in this type of Z-shaped profile. The results are presented in the form of stability maps. Throughout the work, comparisons between numerical and experimental results are also made for several levels of mesh refinement and various turbulence models. ABSTRACT Aeroelastic effects are relatively common in the design of modern civil constructions such as office blocks, airport terminal buildings, and factories. Typical flexible structures exposed to the action of wind are shading devices, normally slats or louvers. A typical cross-section for such elements is a Z-shaped profile, made out of a central web and two side wings.
Galloping instabilities are often determined in practice using the Glauert-Den Hartog criterion. This criterion relies on accurate predictions of the dependence of the aerodynamic force coefficients on the angle of attack. The results of a parametric analysis, based on both experimental and numerical work and performed on different Z-shaped louvers to determine translational galloping instability regions, are presented in this thesis. These numerical results have been validated against a parametric analysis of Z-shaped profiles based on static wind-tunnel tests. In order to perform this validation, the DLR TAU Code, which is a standard code within the European aeronautical industry, has been used. The study focuses on the numerical prediction of galloping, the results of which are presented visually through stability maps. Comparisons between numerical and experimental data are presented for various meshes and turbulence models.
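The Glauert-Den Hartog criterion cited above predicts transverse galloping at angles of attack where dCl/dα + Cd < 0, evaluated from the measured force-coefficient curves. A minimal sketch of applying it to tabulated coefficients (the polar data below are invented for illustration, not the thesis's measurements):

```python
import numpy as np

# Glauert-Den Hartog criterion: an angle of attack is prone to
# transverse galloping when dCl/dalpha + Cd < 0.
def galloping_unstable(alpha_deg, cl, cd):
    """Return a boolean mask of angles violating the Den Hartog criterion."""
    alpha = np.radians(alpha_deg)
    dcl_dalpha = np.gradient(cl, alpha)   # finite-difference lift-curve slope
    return dcl_dalpha + cd < 0.0

# Illustrative polar for a bluff section with a negative lift slope
# around zero incidence (values are hypothetical).
alpha_deg = np.array([-10.0, -5.0, 0.0, 5.0, 10.0])
cl = np.array([0.4, 0.3, 0.0, -0.3, -0.4])
cd = np.array([0.05, 0.04, 0.03, 0.04, 0.05])
print(galloping_unstable(alpha_deg, cl, cd))
```

Sweeping such masks over a family of profile geometries is one way a stability map like those in the thesis can be assembled.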
Abstract:
One of the most important topics in the contemporary debate concerns the long-term sustainability of society as we understand it today. Human beings are recovering the lost sensitivity that conceived us as one more piece within the natural cycle of life. We have finally understood that we cannot be self-sufficient and independent of the natural environment that surrounds us. Beyond respect and care lies the open door to the infinite knowledge that nature offers us at all levels and at all scales. Within the architectural discipline there have been examples such as Antoni Gaudí or Frei Otto who have grounded their work in the natural world, finding in it the strategies and bases for architectural design. However, they have been a minority within the enormous cast of architects who champion the right angle. In recent decades, the trend is changing. We do not refer so much to the growing sensitivity towards greater energy efficiency, which has led to a revaluation of vernacular architecture, transferring its wisdom to bioclimatic strategies. We refer to a specific case within the wide range of architectural forms that have appeared thanks to the incorporation of computational tools into design and production. The architectures that interest us are those that exploit these techniques to analyse and interpret the complex and highly efficient strategies we find in nature, and transfer them to the architectural discipline. This trend, framed within Biomimicry or Biomimetics, is known as Bioarchitecture. This thesis deals with morphology and, above all, with morphogenesis.
The term morphology refers to the study of a concrete form that allows us to understand a specific case; our focus, however, is on morphogenesis, that is, on the study of the processes that generate those forms, in order to reproduce patterns and generate ranges of adaptable and reconfigurable cases. The fact that we study form does not mean that this is a "formalist" thesis with the pejorative, gestural connotation usually attributed to that term. The research conceives the concept of form as the natural world does: form as a synthesis of efficiency. There is no gratuitous natural form that does not fulfil a specific function and does not develop with the minimum material and the minimum possible expenditure of energy. This drive to find the "efficient form" is what takes us beyond the frontier of formalist architecture. The path of morphological research is traced, as the title of the thesis indicates, following the concrete guiding thread of the radiolaria. These unicellular microorganisms possess skeletons so complex that understanding their morphology requires a broad journey spanning more than 4,000 years of human knowledge: from the discovery of the Platonic solids, polyhedra that configure many of the global shapes of these skeletons, to the application of generative algorithms, which make it possible to understand and reproduce the behavioural patterns behind the packing and irregular tessellation systems of radiolarian skeletons. The thesis does not aim to pose the problem from a biological or palaeontological point of view, although the first chapter inevitably provides a referenced analysis of the current state of scientific knowledge.
Morphological questions are analysed in greater depth, and the different standpoints from which these microorganisms have served as references in the architectural discipline are discussed. We also find it necessary to analyse other natural patterns that share generative strategies with radiolarian skeletons. As already noted, the second chapter follows a route from the most basic geometries to the most complex ones related to the strategies that generate the forms detected in the microorganisms. In turn, the analysis of these geometries is interleaved with examples of applications in architecture, design and art, ending with a timeline that synthesises and relates the three lines of research addressed: natural, geometric and architectural. After the two central chapters, the final chapter recapitulates the strategies analysed and applies the knowledge acquired in the thesis through the production of different prototypes, ranging from traditional analytical drawing to digital fabrication and parametric design, by way of analogue models in plaster, metal bars, resin, silicone, latex, etc. ABSTRACT One of the most important issues in the contemporary debate is the one concerning the long-term sustainability of society as we understand it today. The human being is recovering the lost sensitivity that conceived us as part of the natural cycle of life. We have finally understood that we cannot be self-sufficient and independent of the natural environment which surrounds us. Beyond respect and care, we find that the gateway to the infinite knowledge that nature provides us at all levels and at all scales is open. Within the architectural discipline, there have been remarkable examples such as Antoni Gaudí or Frei Otto, who drew inspiration for their work from the natural world. Both found in nature the strategies and basis of their architectural designs.
However, they have been a minority within the huge cast of architects who champion the right angle. In recent decades, the trend is changing. We are not referring to the growing sensitivity towards energy efficiency that has led to an enhancement of vernacular architecture, transferring its wisdom to bioclimatic strategies. We refer to a specific case within the wide range of architectural forms that have appeared thanks to the integration of computational tools in both design and production processes. We are interested in architectures that exploit these techniques to analyse and interpret the complex and highly efficient strategies found in nature, and transfer them to the discipline of architecture. This trend, framed within Biomimicry or Biomimetics, is called Bioarchitecture. This thesis deals with morphology and, more specifically, with morphogenesis. Morphology is the study of a concrete form that allows us to understand a specific case. Our focus, however, is centered on morphogenesis or, in other words, the study of the processes of generation of these forms, in order to replicate patterns and generate a range of adaptable and reconfigurable cases. The fact of studying shapes does not mean that this is a “formalistic” thesis with the pejorative connotation that is often attributed to this term. This study conceives the concept of shape as nature does: as a synthesis of efficiency. There is no meaningless form in nature; forms and shapes in nature play a particular role and are developed with minimum material and energy consumption. This quest to find the efficient shape is what makes us go beyond formalistic architecture. The road of morphological investigation is traced, as the title of the thesis suggests, following the thread of the radiolaria.
These single-cell microorganisms possess such complex skeletons that, to understand their morphology, we must establish a wide itinerary spanning more than 4,000 years of human knowledge: from the discovery of the Platonic solids, polyhedra which configure a huge range of the global shapes of these skeletons, to the application of generative algorithms, which allow us to understand and recreate the behavioural patterns behind the packing and irregular tessellation systems of the radiolarian skeletons. The thesis does not intend to pose the problem from a biological or paleontological standpoint, although the first chapter is inevitably developed through an analysis referenced to the current state of the science. Morphological aspects are analysed in greater depth, together with the different standpoints from which these microorganisms have served as references in the architectonic discipline. In addition, we find it necessary to analyse other natural patterns which share generative strategies with radiolarian skeletons. As mentioned above, the second chapter addresses an itinerary from the most basic geometries to the more complex ones, related to the generative strategies of the shapes found in the microorganisms. At the same time, the analysis of these geometries is interleaved with examples of applications in the fields of architecture, design and the arts. Finally, a time chart synthesises and relates the three investigation paths addressed: natural, geometrical and architectonic. After the two central chapters, the final chapter summarises the strategies analysed and applies the knowledge acquired throughout the thesis. This final chapter is shaped by the realization of different prototypes, which range from traditional analytical drawings to digital fabrication and parametric design, by way of analogue models in plaster, metal bars, resin, silicone, latex, etc.
Abstract:
The development of multi-target drugs for treating complex multifactorial diseases constitutes an active research field. This kind of drug has gained much importance as an alternative strategy to combination therapy (“cocktail drugs”).1 A common way to design them brings together two different pharmacophores in one single molecule (so-called dyads). Following this idea, and being aware that xanthones2 and 1,2,3-triazoles3 possess important pharmacological properties, we combined these two heterocycles in one molecule to create new dyads with improved therapeutic potential. In this work, new xanthone-1,2,3-triazole dyads were prepared from novel (E)-2-(4-arylbut-1-en-3-yn-1-yl)chromones by two different approaches, in order to evaluate their efficiency and sustainability. Both methodologies involved Diels-Alder reactions to build the xanthone core, which were optimized using microwave irradiation as an alternative heating method, and 1,3-dipolar cycloadditions to insert the 1,2,3-triazole moiety (Figure 1).4 All final and intermediate compounds were fully characterized by 1D and 2D NMR techniques.
Abstract:
This dissertation was primarily engaged in the study of linear and organic perspective applied to the drawing of landscape, considering perspective a fundamental tool for graphically materializing the sensory experiences offered by the landscape/place to be drawn. The methodology consisted initially in the investigation of perspective theories and perspective representation methods applied to landscape drawing, followed by practical application to a specific case. Thus, within linear perspective, the following were analysed and explained: visual framing, the methods of representation based on descriptive geometry, and the construction of shadows and of reflections within shadows. In the context of organic perspective, techniques using depth of field, colour, fading, overlapping and light-dark contrast to add depth to the drawing were analysed and described. A set of materials, printing techniques and resources was also explained which, by means of practical examples executed by different artists over time, shows perspective drawings and the application of the theory. Finally, a set of original drawings was prepared in order to represent a place in a specific case, using for this purpose the theories and methods of linear and organic perspective, with different materials and printing techniques. The drawings were framed under the "project design", starting with the horizontal and vertical projections of a landscape architecture design, to provide different views of the proposed space. It can be concluded that the techniques and methods described and exemplified were suitable, with some adjustments, for the purpose intended, in particular in landscape design conception, bringing to the drawing the pictorial sense of the world perceived by the human eye.
Resumo:
Bioactive paper is obtained by modifying a paper substrate with biomolecules and reagents. This type of paper is used in the development of new biosensors that are portable, disposable and inexpensive, aimed at capturing, detecting and, in some cases, deactivating pathogens. Bioactive papers are generally made by incorporating biomolecules such as enzymes and antibodies onto the paper surface. The immobilization of these biomolecules on solid surfaces is widely used for various diagnostic applications such as immunosensors and immunoassays, but because of the sensitive nature of enzymes, their large-scale integration into paper has faced several difficulties, especially under industrial conditions. Meanwhile, microcapsules are an attractive platform for enzyme immobilization and are efficient enough to enable large-scale paper functionalization, since paper can easily be coated with a layer of such microcapsules. In this study, we developed a generic platform based on alginate microcapsules that can be applied to conventional papermaking processes to produce bioactive, antibacterial paper capable of capturing pathogens on its surface and deactivating them through the production of an anti-pathogenic reagent. The design of this antibacterial platform is based on the constant production of hydrogen peroxide as an antibacterial agent inside the alginate microcapsules. This hydrogen peroxide production is achieved by the oxidation of glucose, catalyzed by glucose oxidase encapsulated inside the alginate beads.
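The hydrogen-peroxide-generating step described above is the standard glucose oxidase reaction (not spelled out in the abstract itself), which can be written as:

```latex
\beta\text{-D-glucose} + \mathrm{O_2}
  \xrightarrow{\text{glucose oxidase}}
  \text{D-glucono-1,5-lactone} + \mathrm{H_2O_2}
```

As long as glucose and dissolved oxygen are supplied, the encapsulated enzyme thus provides a continuous source of H2O2 at the capsule surface.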
The stages of this study included entrapping glucose oxidase inside alginate microcapsules, activating and reinforcing the microcapsule surface by adding an extra chitosan layer, verifying the feasibility of immobilizing antibodies (human immunoglobulin G as a model antibody) on the microcapsule surface and, finally, evaluating the antibacterial properties of this platform against Escherichia coli K-12 (E. coli K-12) as a representative pathogen. After each stage, measurements and observations were made using various analytical methods and techniques, such as the Bradford protein assay, oxygen electroanalysis, optical and confocal laser scanning microscopy (CLSM), and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS). Appropriate tests were performed to validate the successful modification of the microcapsules and to confirm that the glucose oxidase remained active after each modification step. The specific enzymatic activity of glucose oxidase after encapsulation was evaluated at 120±30 U/g. Efforts were also made to immobilize glucose oxidase on gold nanoparticles of two different diameters (10.9 nm and 50 nm) in order to improve enzymatic activity and increase encapsulation efficiency. The results obtained in this study demonstrate the successful modification of the alginate microcapsules as well as a favorable response of this antibacterial platform in deactivating E. coli K-12. The effective concentration of enzymatic activity required to deactivate this model pathogen was determined to be 1.3×10⁻² U/ml for a bacterial concentration of 6.7×10⁸ cells/ml.
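As a worked example using only the figures reported above, the effective enzyme dose per bacterial cell can be estimated by dividing the effective activity concentration by the bacterial concentration; the variable names below are illustrative, not from the study:

```python
# Figures reported in the study
effective_activity_u_per_ml = 1.3e-2   # glucose oxidase activity, U/ml
cell_density_per_ml = 6.7e8            # E. coli K-12 cells/ml

# Derived (illustrative) figure: enzyme activity available per bacterial cell
u_per_cell = effective_activity_u_per_ml / cell_density_per_ml
print(f"{u_per_cell:.2e} U/cell")      # roughly 1.9e-11 U per cell
```

This back-of-the-envelope ratio is one way to compare the platform's dose against other antibacterial systems reported at different cell densities.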
Further studies are needed to evaluate the effectiveness of the immobilized antibody in deactivating pathogens, as well as to integrate the platform onto paper and validate the efficiency of the system once it is deposited on paper.