21 results for synergy

at Universidad Politécnica de Madrid


Relevance: 20.00%

Abstract:

Systematic data on the effect of irradiation with swift ions (Zn at 735 MeV and Xe at 929 MeV) on NaCl single crystals have been analysed in terms of a synergetic two-spike approach (thermal and excitation spikes). The coupling of the two spikes, simultaneously generated by the irradiation, contributes to the operation of a non-radiative exciton decay model as proposed for purely ionization damage. Using this scheme, we have accounted for the π-emission yield of self-trapped excitons and its temperature dependence under ion-beam irradiation. Moreover, the initial production rates of F-centre growth have also been reasonably simulated for irradiation at low temperatures (<100 K), where colour centre annealing and aggregation can be neglected.
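As a hedged illustration of the kind of temperature dependence involved, thermally activated non-radiative decay of luminescence is commonly described by a Mott–Seitz expression (the paper's specific model may differ):

```latex
\eta(T) = \frac{\eta_{0}}{1 + C\,\exp\!\left(-\frac{\Delta E}{k_{\mathrm{B}} T}\right)}
```

where $\eta_{0}$ is the low-temperature emission yield, $\Delta E$ the activation energy of the competing non-radiative channel, and $C$ the ratio of the attempt frequencies of the non-radiative and radiative channels.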

Relevance: 10.00%

Abstract:

Computed Tomography imaging is a non-invasive alternative for observing soil structures, mainly the pore space. In soil data, the pore space corresponds to empty or free space, in the sense that no solid material is present there, only fluids; since fluid transport in soil depends on the pore space, it is important to identify the regions that correspond to pore zones. In this paper we present a methodology to detect pore space and solid soil based on the synergy of image processing, pattern recognition and artificial intelligence. Mathematical morphology is an image processing technique used here for image enhancement. In order to find groups of pixels with similar gray-level intensity (more or less homogeneous groups), a novel image sub-segmentation based on a Possibilistic Fuzzy c-Means (PFCM) clustering algorithm was used. Artificial Neural Networks (ANNs) are very efficient for demanding, large-scale and generic pattern recognition applications; for this reason, a classifier based on an artificial neural network is finally applied to classify soil images into two classes, pore space and solid soil.
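The clustering step can be sketched in a few lines. The following is a minimal illustration using standard fuzzy c-means on gray levels as a stand-in for the paper's PFCM variant (the synthetic intensities and cluster count are assumptions, not the paper's data):

```python
import numpy as np

def fuzzy_c_means(x, c=2, m=2.0, n_iter=100, seed=0):
    """Standard fuzzy c-means on 1-D gray levels (a stand-in for the paper's PFCM)."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                       # fuzzy memberships sum to 1 per pixel
    p = 2.0 / (m - 1.0)
    for _ in range(n_iter):
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)  # membership-weighted cluster centres
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        inv = d ** (-p)
        u = inv / inv.sum(axis=0)            # standard FCM membership update
    return centers, u

# Synthetic "soil image" intensities: dark pore pixels (~30), bright solid pixels (~200)
rng = np.random.default_rng(1)
pixels = np.concatenate([rng.normal(30, 5, 500), rng.normal(200, 10, 500)])
centers, u = fuzzy_c_means(pixels)
labels = u.argmax(axis=0)                    # hard pore/solid assignment
```

In the paper's pipeline the fuzzy memberships would then feed a neural-network classifier; here the hard `argmax` assignment already separates the two gray-level populations.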

Relevance: 10.00%

Abstract:

We describe lpdoc, a tool which generates documentation manuals automatically from one or more logic program source files, written in Ciao, ISO-Prolog, and other (C)LP languages. It is particularly useful for documenting library modules, for which it automatically generates a rich description of the module interface. However, it can also be used quite successfully to document full applications. A fundamental advantage of using lpdoc is that it helps maintain a true correspondence between the program and its documentation, and identify precisely which version of the program a given printed manual corresponds to. The quality of the documentation generated can be greatly enhanced by including within the program text assertions (declarations with types, modes, etc.) for the predicates in the program, and machine-readable comments. One of the main novelties of lpdoc is that these assertions and comments are written using the Ciao system assertion language, which is also the language of communication between the compiler and the user and between the components of the compiler. This allows a significant synergy among specification, debugging, documentation, optimization, etc. A simple compatibility library allows conventional (C)LP systems to ignore these assertions and comments and treat programs documented in this way normally.
The documentation can be generated interactively from emacs or from the command line, in many formats including texinfo, dvi, ps, pdf, info, ascii, html/css, Unix nroff/man, Windows help, etc., and can include bibliographic citations and images. lpdoc can also generate "man" pages (Unix man page format), nicely formatted plain ASCII "readme" files, installation scripts useful when the manuals are included in software distributions, brief descriptions in html/css or info formats suitable for inclusion in on-line indices of manuals, and even complete WWW and info sites containing on-line catalogs of documents and software distributions. The lpdoc manual, all other Ciao system manuals, and parts of this paper are generated by lpdoc.

Relevance: 10.00%

Abstract:

We describe lpdoc, a tool which generates documentation manuals automatically from one or more logic program source files, written in ISO-Prolog, Ciao, and other (C)LP languages. It is particularly useful for documenting library modules, for which it automatically generates a rich description of the module interface. However, it can also be used quite successfully to document full applications. A fundamental advantage of using lpdoc is that it helps maintain a true correspondence between the program and its documentation, and identify precisely which version of the program a given printed manual corresponds to. The quality of the documentation generated can be greatly enhanced by including within the program text assertions (declarations with types, modes, etc.) for the predicates in the program, and machine-readable comments. One of the main novelties of lpdoc is that these assertions and comments are written using the Ciao system assertion language, which is also the language of communication between the compiler and the user and between the components of the compiler. This allows a significant synergy among specification, documentation, optimization, etc. A simple compatibility library allows conventional (C)LP systems to ignore these assertions and comments and treat programs documented in this way normally. The documentation can be generated in many formats including texinfo, dvi, ps, pdf, info, html/css, Unix nroff/man, Windows help, etc., and can include bibliographic citations and images. lpdoc can also generate "man" pages (Unix man page format), nicely formatted plain ASCII "readme" files, installation scripts useful when the manuals are included in software distributions, brief descriptions in html/css or info formats suitable for inclusion in on-line indices of manuals, and even complete WWW and info sites containing on-line catalogs of documents and software distributions.
The lpdoc manual, all other Ciao system manuals, and parts of this paper are generated by lpdoc.

Relevance: 10.00%

Abstract:

Como consecuencia del proceso de desalación, se produce el vertido al mar de un agua de rechazo hipersalino o salmuera. La salinidad de este vertido es variable, dependiendo del origen de la captación y del proceso de tratamiento. Muchos de los hábitats y biocenosis de los ecosistemas marinos se encuentran adaptados a ambientes de salinidad casi constante y son muy susceptibles a los incrementos de salinidad originados por estos vertidos. Junto con el vertido de salmuera otro de los principales inconvenientes que plantean las plantas desaladoras es el alto consumo energético, con todas las desventajas que esto supone: alto coste del agua desalada para los consumidores, contaminación del medio... El desarrollo de los métodos de vertido, herramientas de gestión de la salmuera, estudios del comportamiento de la pluma salina… ha buscado la mitigación de estos efectos sobre los ecosistemas marinos. El desarrollo en membranas de ósmosis inversa, diseño de bombas y sistemas de recuperación de energía ha permitido también la reducción del consumo energético en las plantas de desalación. Sin embargo, estos campos parecen haber encontrado un techo tecnológico difícil de rebasar en los últimos tiempos. La energía osmótica se plantea como uno de los caminos a investigar aplicado al campo de la reducción del consumo energético en desalación de agua de mar, a través del aprovechamiento energético de la salmuera. Con esta tesis se pretende cumplir principalmente con los siguientes objetivos: reducción del consumo energético en desalación, mitigar el impacto del vertido sobre el medio y ser una nueva herramienta en la gestión de la salmuera. En el presente documento se plantea el desarrollo de un nuevo proceso que utiliza el fenómeno de la ósmosis directa a través de membranas semipermeables, y busca la sinergia desalación depuración, integrando ambos, en un único proceso de tratamiento dentro del ciclo integral del agua. 
Para verificar los valores de producción, calidad y rendimiento del proceso, se proyecta y construye una planta piloto ubicada en la Planta Desaladora de Alicante II, escalada de tal manera que permite la realización de los ensayos con equipos comerciales de tamaño mínimo. El objetivo es que el resultado final sea extrapolable a tamaños superiores sin que el escalado afecte a la certeza y fiabilidad de las conclusiones obtenidas. La planta se proyecta de forma que el vertido de una desaladora de ósmosis inversa junto con el vertido de un terciario convencional, se pasan por una ósmosis directa y a continuación por una ósmosis inversa otra vez, ésta última con el objeto de abrir la posibilidad de incrementar la producción de agua potable. Ambas ósmosis están provistas de un sistema de pretratamiento físico-químico (para adecuar la calidad del agua de entrada a las condiciones requeridas por las membranas en ambos casos), y un sistema de limpieza química. En todos los ensayos se usa como fuente de disolución concentrada (agua salada), el rechazo de un bastidor de ósmosis inversa de una desaladora convencional de agua de mar. La fuente de agua dulce marca la distinción entre dos tipos de ensayos: ensayos con el efluente del tratamiento terciario de una depuradora convencional, con lo que se estudia el comportamiento de la membrana ante el ensuciamiento; y ensayos con agua permeada, que permiten estudiar el comportamiento ideal de la membrana. Los resultados de los ensayos con agua salobre ponen de manifiesto problemas de ensuciamiento de la membrana, el caudal de paso a través de la misma disminuye con el tiempo y este efecto se ve incrementado con el aumento de la temperatura del agua. Este fenómeno deriva en una modificación del pretratamiento de la ósmosis directa añadiendo un sistema de ultrafiltración que ha permitido que la membrana presente un comportamiento estable en el tiempo. 
Los ensayos con agua permeada han hecho posible estudiar el comportamiento "ideal" de la membrana y se han obtenido las condiciones óptimas de operación y a las que se debe tender, consiguiendo tasas de recuperación de energía de 1,6; lo que supone pasar de un consumo de 2,44 kWh/m3 de un tren convencional de ósmosis a 2,28 kWh/m3 al añadir un sistema de ósmosis directa. El objetivo de futuras investigaciones es llegar a tasas de recuperación de 1,9, lo que supondría alcanzar consumos inferiores a 2 kWh/m3. Con esta tesis se concluye que el proceso propuesto permite dar un paso más en la reducción del consumo energético en desalación, además de mitigar los efectos del vertido de salmuera en el medio marino puesto que se reduce tanto el caudal como la salinidad del vertido, siendo además aplicable a plantas ya existentes y planteando importantes ventajas económicas a plantas nuevas, concebidas con este diseño.

ABSTRACT

As a consequence of the desalination process, a hypersaline reject water, or brine, is discharged into the sea. The salinity of these discharges varies depending on the type of intake and the treatment process. Many of the habitats and biocoenoses of marine ecosystems are adapted to an almost constant-salinity environment and are very susceptible to the salinity increases caused by these discharges. Besides the brine discharge, another problem posed by desalination plants is their high energy consumption, with all the disadvantages this involves: high cost of desalinated water for consumers, environmental pollution... The development of disposal methods, brine management tools, studies of the saline plume... has sought to mitigate these effects on marine ecosystems. The development of reverse osmosis membranes, pump design and energy recovery systems has also enabled the reduction of energy consumption in desalination plants. However, these fields seem to have reached a technological ceiling that has been difficult to exceed in recent times.
Osmotic power is proposed as a new way to achieve the reduction of energy consumption in seawater desalination, through energy recovery from the brine. This thesis mainly pursues the following objectives: to reduce energy consumption in desalination, to mitigate the impact of the brine discharge on the environment, and to provide a new tool for brine management. It proposes the development of a new process that uses the phenomenon of forward osmosis through semipermeable membranes and seeks a desalination-wastewater reuse synergy, combining both into a single treatment process within the integral water cycle. To verify the production, quality and performance of the process, a pilot plant was designed and built at the Alicante II desalination plant, scaled so that the tests can be carried out with minimum-size commercial equipment. The aim is that the results can be extrapolated to larger sizes without the scale-up affecting the accuracy and reliability of the conclusions. In the plant, the discharge of a reverse osmosis desalination plant and the effluent of a conventional tertiary treatment of a wastewater plant go through a forward osmosis module, and then through a reverse osmosis module, the latter in order to open the possibility of increasing potable water production. Both osmosis systems are provided with a physicochemical pretreatment (to bring the feed water to the conditions required by the membranes in both cases) and a chemical cleaning system. In all tests, the reject stream from a reverse osmosis rack of a conventional seawater desalination plant is used as the source of concentrated solution (salt water).
The source of fresh water distinguishes two types of tests: tests with the effluent from the tertiary treatment of a conventional wastewater treatment plant, which study the behavior of the membrane under fouling, and tests with permeate, which allow the ideal behavior of the membrane to be studied. The results of the tests with brackish water show fouling problems: the flow rate through the membrane decreases with time, and this effect increases with water temperature. This phenomenon made it necessary to modify the pretreatment of the forward osmosis module; an ultrafiltration system was added, enabling the membrane to present a stable behavior. The tests with permeate have made it possible to study the ideal behavior of the membrane and to obtain the optimum operating conditions. Energy recovery rates of 1.6 have been achieved, which makes it possible to move from a consumption of 2.44 kWh/m3 in a conventional reverse osmosis train to 2.28 kWh/m3 when the forward osmosis system is added. The goal of future research is to achieve recovery rates of 1.9, which would allow consumption below 2 kWh/m3. This thesis concludes that the proposed process allows a further step in the reduction of energy consumption in desalination, in addition to mitigating the effects of the brine discharge on the marine environment, since both the flow and the salinity of the discharge are reduced. It is also applicable to existing plants, and offers important economic benefits to new plants built with this design.
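The reported figures can be checked with a line of arithmetic; a minimal sketch, using only the kWh/m3 values quoted in the abstract:

```python
baseline = 2.44   # kWh/m3, conventional reverse osmosis train (reported)
with_fo = 2.28    # kWh/m3, with forward-osmosis energy recovery (reported)

saving = baseline - with_fo            # absolute saving per cubic metre
pct = 100 * saving / baseline          # relative saving
print(f"saving: {saving:.2f} kWh/m3 ({pct:.1f} %)")

target = 2.0      # kWh/m3, goal for the 1.9 recovery rate
print(f"reaching {target} kWh/m3 requires saving more than {baseline - target:.2f} kWh/m3")
```

So the 1.6 recovery rate corresponds to a saving of 0.16 kWh/m3 (about 6.6 %), while the 1.9 target would require roughly triple that saving relative to the conventional train.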

Relevance: 10.00%

Abstract:

Social software tools have become an integral part of students' personal lives and their primary communication medium. Likewise, these tools are increasingly entering the enterprise world (within the recent trend known as Enterprise 2.0) and becoming a part of everyday work routines. Aiming to keep pace with job requirements and also to position learning as an integral part of students' lives, the field of education is challenged to embrace social software. Personal Learning Environments (PLEs) emerged as a concept that makes use of social software to facilitate collaboration, knowledge sharing, group formation around common interests, active participation and reflective thinking in online learning settings. Furthermore, social software allows for establishing and maintaining one's presence in the online world. By being aware of a student's online presence, a PLE is better able to personalize the learning settings, e.g., through recommendation of content to use or people to collaborate with. Aiming to explore the potential of online presence for the provision of recommendations in PLEs, in the scope of the OP4L project, we have developed a software solution that is based on a synergy of Semantic Web technologies, online presence and socially-oriented learning theories. In this paper we present the current results of this research work.

Relevance: 10.00%

Abstract:

Over the last decade, Grid computing paved the way for a new level of large scale distributed systems. This infrastructure made it possible to securely and reliably take advantage of widely separated computational resources that are part of several different organizations. Resources can be incorporated into the Grid, building a theoretical virtual supercomputer. In time, cloud computing emerged as a new type of large scale distributed system, inheriting and expanding the expertise and knowledge obtained so far. Some of the main characteristics of Grids naturally evolved into clouds, others were modified and adapted, and others were simply discarded or postponed. Regardless of these technical specifics, Grids and clouds together can be considered one of the most important advances in large scale distributed computing of the past ten years; however, this step in distributed computing has come along with a completely new level of complexity. Grid and cloud management mechanisms play a key role, and correct analysis and understanding of the system behavior are needed. Large scale distributed systems must be able to self-manage, incorporating autonomic features capable of controlling and optimizing all resources and services. Traditional distributed computing management mechanisms analyze each resource separately and adjust specific parameters of each one of them. When trying to adapt the same procedures to Grid and cloud computing, the vast complexity of these systems can make this task extremely complicated. But the complexity of large scale distributed systems could be only a matter of perspective. It could be possible to understand the Grid or cloud behavior as a single entity, instead of a set of resources. This abstraction could provide a different understanding of the system, describing large scale behavior and global events that probably would not be detected by analyzing each resource separately.
In this work we define a theoretical framework that combines both ideas, multiple resources and single entity, to develop large scale distributed systems management techniques aimed at system performance optimization, increased dependability and Quality of Service (QoS). The resulting synergy could be the key to addressing the most important difficulties of Grid and cloud management.

Relevance: 10.00%

Abstract:

This paper reports on an innovative approach that aims to reduce information management costs in data-intensive and cognitively-complex biomedical environments. Recognizing the importance of prominent high-performance computing paradigms and large data processing technologies as well as collaboration support systems to remedy data-intensive issues, it adopts a hybrid approach by building on the synergy of these technologies. The proposed approach provides innovative Web-based workbenches that integrate and orchestrate a set of interoperable services that reduce the data-intensiveness and complexity overload at critical decision points to a manageable level, thus permitting stakeholders to be more productive and concentrate on creative activities.

Relevance: 10.00%

Abstract:

Tool wear detection is a key issue for tool condition monitoring. The maximization of useful tool life is frequently related to the optimization of machining processes. This paper presents two model-based approaches for tool wear monitoring on the basis of neuro-fuzzy techniques. The use of a neuro-fuzzy hybridization to design a tool wear monitoring system aims at exploiting the synergy of neural networks and fuzzy logic, combining human reasoning with learning and a connectionist structure. The turning process, a well-known machining process, is selected for this case study. A four-input (time, cutting forces, vibrations and acoustic emission signals), single-output (tool wear rate) model is designed and implemented on the basis of three neuro-fuzzy approaches (inductive, transductive and evolving neuro-fuzzy systems). The tool wear model is then used for monitoring the turning process. The comparative study demonstrates that the transductive neuro-fuzzy model provides better error-based performance indices for detecting tool wear than either the inductive or the evolving neuro-fuzzy model.
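As a hedged sketch of how a fuzzy-rule model maps sensor signals to a wear rate (the membership functions, input ranges and rule consequents below are illustrative assumptions, not the paper's identified model):

```python
import numpy as np

def gauss(x, c, s):
    """Gaussian membership function with centre c and width s."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def tsk_wear_rate(force, vibration):
    """Zero-order Takagi-Sugeno inference: two fuzzy sets per input, product firing."""
    low_f, high_f = gauss(force, 100, 40), gauss(force, 300, 40)      # cutting force [N]
    low_v, high_v = gauss(vibration, 0.2, 0.1), gauss(vibration, 0.8, 0.1)  # vibration [g]
    rules = [
        (low_f * low_v, 0.01),   # low force & low vibration  -> low wear rate
        (low_f * high_v, 0.05),
        (high_f * low_v, 0.06),
        (high_f * high_v, 0.12), # high force & high vibration -> high wear rate
    ]
    w = np.array([r[0] for r in rules])      # rule firing strengths
    y = np.array([r[1] for r in rules])      # constant rule consequents
    return float((w * y).sum() / w.sum())    # weighted-average defuzzification

print(tsk_wear_rate(120, 0.25))  # close to 0.01: the low-wear rule dominates
```

In the neuro-fuzzy approaches of the paper, the membership parameters and consequents would be learned from the measured signals rather than set by hand as here.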

Relevance: 10.00%

Abstract:

Collaborative efforts between the Neutronics and Target Design Group at the Instituto de Fusión Nuclear and the Molecular Spectroscopy Group at the ISIS Pulsed Neutron and Muon Source date back to 2012, in the context of the ESS-Bilbao project. The rationale for these joint activities was twofold: to assess the realm of applicability of the low-energy neutron source proposed by ESS-Bilbao, and to explore instrument capabilities for pulsed-neutron techniques in the range 0.05-3 ms, a time range where ESS-Bilbao and ISIS could offer a significant degree of synergy and complementarity. As part of this collaboration, J.P. de Vicente spent a three-month period within the ISIS Molecular Spectroscopy Group, gaining hands-on experience of the practical aspects of neutron-instrument design and the requisite neutron-transport simulations. To date, these activities have resulted in a joint MEng thesis as well as a number of publications and contributions to national and international conferences. Building upon these previous works, the primary aim of this report is to provide a self-contained discussion of general criteria for instrument selection at ESS-Bilbao, the first accelerator-driven, low-energy neutron source designed in Spain. To this end, Chapter 1 provides a brief overview of the current design parameters of the accelerator and target station. Neutron moderation is covered in Chapter 2, where we take a closer look at two possible target-moderator-reflector configurations and pay special attention to the spectral and temporal characteristics of the resulting neutron pulses. This discussion provides a necessary starting point to assess the operation of ESS-Bilbao in short- and long-pulse modes. These considerations are further explored in Chapter 3, dealing with the primary characteristics of ESS-Bilbao as a short- or long-pulse facility in terms of accessible dynamic range and spectral resolution.
Other practical aspects, including background suppression and the use of fast choppers, are also discussed. The guiding principles introduced in the first three chapters are put to use in Chapter 4, where we analyse in some detail the capabilities of a small-angle scattering instrument, as well as how specific scientific requirements can be mapped onto the optimal use of ESS-Bilbao for condensed-matter research. Part 2 of the report contains additional supporting documentation, including a description of the ESSB McStas component, a detailed characterisation of moderator response and neutron pulses, and estimates of parameters associated with the design and operation of neutron choppers. In closing this brief foreword, we wish to thank both ESS-Bilbao and ISIS for their continuing encouragement and support along the way.

Relevance: 10.00%

Abstract:

Los arquitectos se han sentido progresivamente inclinados a incorporar superficies de vidrio cada vez mayores en sus proyectos de arquitectura, en correspondencia con una percepción socio-cultural del vidrio vinculada al progreso, la contemporaneidad y el bienestar, así como por la versatilidad de este material para expresar aspectos de la identidad del proyecto, establecer comunicación con el entorno y actuar como un escaparate para las tecnologías emergentes. A pesar de esta receptividad para acoger los sistemas tecnológicos más avanzados, la envolvente de vidrio contemporánea muy raramente integra tecnología avanzada para el control de la luz natural. Desde la arquitectura, el proyecto de la luz natural a través de la superficie de vidrio se ha explorado muy escasamente, aún cuando en las últimas tres décadas se haya producido una gran diversidad de soluciones tecnológicas para este propósito. Uno de los motivos principales para esta falta de sinergia es la inconsistencia conceptual que impulsa a los procesos proyectuales de la arquitectura y a los desarrollos tecnológicos para la sostenibilidad. Por un lado, las especificaciones de las tecnologías del control de la luz natural se determinan fundamentalmente desde una perspectiva científica de la eficiencia, que no tiene en consideración otros intereses y preocupaciones arquitectónicos. Por otro lado, la práctica arquitectónica no ha asimilado un pensamiento técnico en torno a la luz natural que lo determine como un agente clave del proceso proyectual, incluso cuando la sostenibilidad se perfile como la fuerza que ha de liderar la arquitectura del futuro y, en este sentido, sea una prioridad absoluta minimizar las consecuencias económicas y ecológicas del impacto negativo del vidrio. 
Por medio del escrutinio de valores culturales, proyectuales, funcionales y ecológicos, esta tesis aborda el estudio del precario diálogo transdisciplinar entre la evolución de la envolvente de vidrio en la arquitectura contemporánea y el desarrollo de soluciones tecnológicas para el proyecto de la luz natural, e identifica sus principales puntos de divergencia como los temas centrales desde los que proyectar con vidrio en una arquitectura sostenible futura. Desde una perspectiva energética, este ejercicio es un paso crítico para concienciar sobre la gravedad de la situación presente y establecer los cimientos para líneas de intervención esenciales para hacer a ambos mundos converger. Desde la óptica arquitectónica, este estudio representa, además de una oportunidad para entender los potenciales proyectuales de estas tecnologías y reflexionar sobre la relación vidrio-luz, un escenario desde el que comprender el estatus incongruente de la sostenibilidad tecnológica en la arquitectura actual, contribuyendo a que se genere una contextualización recíproca entre la investigación en energía y la práctica de la arquitectura futura.

ABSTRACT

Architects are increasingly demanded to incorporate extensive glazed areas in buildings, in correspondence with a socio-cultural perception of glass linked with progress, contemporaneity and welfare, as well as for this material's versatility to express identity features, establish communication with its environment, and perform as a showroom for emergent technologies. Despite this disposition to take cutting-edge technology in, the contemporary glass envelope very scarcely integrates advanced daylight control technology. From an architectural standpoint, the exploration of the manipulation of natural light through the glass surface has been very shallow, even though a wide range of technical solutions has been produced for this purpose in the last three decades.
One of the core issues behind this inconsistency is the lack of established synergy between architectural design processes and sustainable technological developments. On one side, the specifications of daylighting technologies are primarily determined by a scientific perspective of efficiency and disregard fundamental architectural concerns and interests. On the other, architectural practice does not conceive sustainable technologies as key active agents in the design process, despite the fact that the concept of sustainability is constantly regarded as the driving force of the leading-edge architecture of the future; in this sense, it becomes an absolute priority to minimize the ecological and economic consequences of glass's decisive impact in buildings. Through the scrutiny of cultural, functional and ecological values, this thesis analyses the precarious transdisciplinary dialogue between the evolution of the glass envelope in contemporary architecture and the development of daylighting technological solutions, and identifies the core issues necessary for a sustainable integration of glass facades into future architecture. From an energy point of view, this exercise is a critical step to raise awareness about the severity of the present situation, and to establish the underpinnings for new lines of intervention essential to make both worlds efficiently converge. Architecturally speaking, in addition to the opportunity to understand the design potentials of these technologies and reflect on the glass-light relationship, this study contributes a scenario from which to generate the reciprocal contextualization of energy building research and future architectural practice.

Relevance: 10.00%

Abstract:

La investigación para el conocimiento del cerebro es una ciencia joven; su inicio se remonta a Santiago Ramón y Cajal en 1888. Desde esa fecha hasta nuestro tiempo la neurociencia ha avanzado mucho en el desarrollo de técnicas que permiten su estudio. Desde la neurociencia cognitiva hoy se explican muchos modelos que nos permiten acercarnos al entendimiento de capacidades cognitivas complejas. Aun así hablamos de una ciencia casi en pañales que tiene un largo recorrido por delante. Una de las claves del éxito en los estudios de la función cerebral ha sido convertirse en una disciplina que combina conocimientos de diversas áreas: de la física, de las matemáticas, de la estadística y de la psicología. Esta es la razón por la que a lo largo de este trabajo se entremezclan conceptos de diferentes campos con el objetivo de avanzar en el conocimiento de un tema tan complejo como el que nos ocupa: el entendimiento de la mente humana. Concretamente, esta tesis ha estado dirigida a la integración multimodal de la magnetoencefalografía (MEG) y la resonancia magnética ponderada en difusión (dMRI). Estas técnicas son sensibles, respectivamente, a los campos magnéticos emitidos por las corrientes neuronales y a la microestructura de la materia blanca cerebral. A lo largo de este trabajo hemos visto que la combinación de estas técnicas permite descubrir sinergias estructuro-funcionales en el procesamiento de la información en el cerebro sano y en el curso de patologías neurológicas. Más específicamente, en este trabajo se ha estudiado la relación entre la conectividad funcional y estructural y cómo fusionarlas. Para ello, se ha cuantificado la conectividad funcional mediante el estudio de la sincronización de fase o la correlación de amplitudes entre series temporales; de esta forma se ha conseguido un índice que mide la similitud entre grupos neuronales o regiones cerebrales.
Adicionalmente, la cuantificación de la conectividad estructural a partir de imágenes de resonancia magnética ponderadas en difusión ha permitido hallar índices de la integridad de la materia blanca o de la fuerza de las conexiones estructurales entre regiones. Estas medidas fueron combinadas en los capítulos 3, 4 y 5 de este trabajo siguiendo tres aproximaciones que iban desde el nivel más bajo al más alto de integración. Finalmente se utilizó la información fusionada de MEG y dMRI para la caracterización de grupos de sujetos con deterioro cognitivo leve; la detección de esta patología resulta relevante en la identificación precoz de la enfermedad de Alzheimer. Esta tesis está dividida en seis capítulos. En el capítulo 1 se establece un contexto para la introducción de la conectómica dentro de los campos de la neuroimagen y la neurociencia. Posteriormente en este capítulo se describen los objetivos de la tesis y los objetivos específicos de cada una de las publicaciones científicas que resultaron de este trabajo. En el capítulo 2 se describen los métodos para cada técnica que fue empleada: conectividad estructural, conectividad funcional en resting state, redes cerebrales complejas y teoría de grafos; finalmente se describe la condición de deterioro cognitivo leve y el estado actual en la búsqueda de nuevos biomarcadores diagnósticos. En los capítulos 3, 4 y 5 se han incluido los artículos científicos que fueron producidos a lo largo de esta tesis. Estos han sido incluidos en el formato de la revista en que fueron publicados, estando divididos en introducción, materiales y métodos, resultados y discusión. Todos los métodos que fueron empleados en los artículos están descritos en el capítulo 2 de la tesis. Finalmente, en el capítulo 6 se presentan las conclusiones generales de la tesis y se discuten de forma específica los resultados de cada artículo.

ABSTRACT

In this thesis I apply concepts from mathematics, physics and statistics to the neurosciences.
This field benefits from the collaborative work of multidisciplinary teams in which physicians, psychologists, engineers and other specialists work toward a common goal: the understanding of the brain. Research in this field is still in its early years, its birth being attributed to the neuronal theory of Santiago Ramón y Cajal in 1888. In more than one hundred years only a small fraction of how the brain functions has been uncovered, and much more remains to be explored. Techniques used in isolation aim at unraveling the system that supports our cognition; nevertheless, to provide solid evidence in such a field, multimodal techniques have arisen, and with them we will be able to improve current knowledge about human cognition. Here we focus on the multimodal integration of magnetoencephalography (MEG) and diffusion-weighted magnetic resonance imaging. These techniques are sensitive to the magnetic fields emitted by neuronal currents and to the white-matter microstructure, respectively. The combination of such techniques could bring evidence about structural-functional synergies in brain information processing, and about which part of this synergy fails in specific neurological pathologies. In particular, we are interested in the relationship between functional and structural connectivity, and in how to integrate this information. We quantify functional connectivity by studying the phase synchronization or the amplitude correlation between time series obtained by MEG, thereby obtaining an index of the similarity between neuronal entities, i.e. brain regions. In addition, we quantify structural connectivity by performing diffusion tensor estimation from the diffusion-weighted images, thus obtaining an indicator of the integrity of the white matter or, if preferred, of the strength of the structural connections between regions.
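As an illustration of the phase-synchronization index described above, a minimal sketch of the phase-locking value (PLV) between two time series follows; this is not the thesis code, and the signal parameters (sampling, frequency, noise level) are invented for the example:

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV = |mean(exp(i*(phi_x - phi_y)))|, bounded in [0, 1].

    Instantaneous phases are taken from the analytic signal
    (Hilbert transform) of each series.
    """
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

# Illustrative signals: two noisy 10 Hz oscillations with a constant
# phase offset remain phase-locked; independent noise does not.
rng = np.random.default_rng(0)
t = np.linspace(0, 2, 1000, endpoint=False)          # 2 s at 500 Hz
x = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.5) + 0.3 * rng.standard_normal(t.size)

plv_locked = phase_locking_value(x, y)                          # close to 1
plv_random = phase_locking_value(x, rng.standard_normal(t.size))  # much lower
print(plv_locked, plv_random)
```

Because the PLV is bounded in [0, 1], it can serve directly as the similarity index between pairs of brain regions mentioned in the abstract.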
These quantifications are then combined following three different approaches, from the lowest to the highest level of integration, in chapters 3, 4 and 5. We finally apply the fused information to the characterization or prediction of mild cognitive impairment, a clinical entity considered an early step in the pathological continuum leading to dementia. The dissertation is divided into six chapters. In chapter 1 I introduce connectomics within the fields of neuroimaging and neuroscience. Later in this chapter I describe the objectives of this thesis, and the specific objectives of each of the scientific publications that were produced as a result of this work. In chapter 2 I describe the methods for each of the techniques that were employed, namely structural connectivity, resting-state functional connectivity, complex brain networks and graph theory; finally, I describe the clinical condition of mild cognitive impairment and the current state of the art in the search for early biomarkers. In chapters 3, 4 and 5 I have included the scientific publications that were generated during this work. They are included in their original format and contain introduction, materials and methods, results and discussion. All methods employed in these papers have been described in chapter 2. Finally, in chapter 6 I summarize all the results of this thesis, both locally for each of the scientific publications and globally for the whole work.


Resumo:

The European chestnut (Castanea sativa Mill.) is a multipurpose species that has been widely cultivated around the Mediterranean basin since ancient times. New varieties were brought to the Iberian Peninsula during the Roman Empire, and they have coexisted since then with native populations that survived the last glaciation. The relevance of chestnut cultivation grew steadily from the Middle Ages until the rural decline of the past century put a stop to this trend; forest fires and diseases were also major factors. Chestnut cultivation is gaining momentum again owing to its economic (wood, fruits) and ecological relevance, and currently represents an important asset in many rural areas of Europe. In this Thesis we apply different molecular tools to help improve current management strategies. For this study we have chosen El Bierzo (Castile and Leon, NW Spain), which has a centuries-old tradition of chestnut cultivation and management, and also presents several unique features from a genetic perspective (see below). Moreover, its nuts are widely appreciated in Spain and abroad for their organoleptic properties. We have focused our experimental work on two major problems faced by breeders and the industry: the lack of a fine-grained genetic characterization and the need for new strategies to control blight disease. To characterize in sufficient detail the genetic diversity and structure of El Bierzo orchards, we analyzed DNA from 169 trees grafted for nut production, covering the entire region. We also analyzed 62 nuts from all traditional varieties. El Bierzo constitutes an outstanding scenario for studying chestnut genetics and the influence of human management because: (i) it is located at one extreme of the distribution area; (ii) it is a major glacial refuge for the native species; (iii) it has a long tradition of human management (since Roman times, at least); and (iv) its geographical setting ensures an unusual degree of genetic isolation.
Thirteen microsatellite markers provided enough information content and discrimination power to genotype at the individual level. Together with an unexpectedly high level of genetic variability, we found evidence of genetic structure, with three major gene pools giving rise to the current population. High levels of genetic differentiation between groups supported this organization. Interestingly, the genetic structure does not match spatial boundaries, suggesting that the exchange of material and cultivation practices have strongly influenced natural gene flow. The microsatellite markers selected for this study were also used to classify a set of 62 samples belonging to all traditional varieties. We identified several cases of synonymies and homonymies, evidencing the need to replace traditional classification systems with new tools for genetic profiling. Management and conservation strategies should also benefit from these tools. The advent of high-throughput sequencing technologies, combined with the development of bioinformatics tools, has paved the way to study transcriptomes without the need for a reference genome. We took advantage of RNA sequencing and de novo assembly tools to determine the transcriptional landscape of chestnut in response to blight disease. In addition, we have selected a set of candidate genes with high potential for developing resistant varieties via genetic engineering. Our results evidenced a deep transcriptional reprogramming upon fungal infection. The plant hormones ethylene (ET) and jasmonic acid (JA) appear to orchestrate the defensive response. Interestingly, our results also suggest a role for auxins in modulating such a response. Many transcription factors that interact with promoters of genes involved in disease resistance were identified in this work. Among these genes, we have conducted a functional characterization of two major thaumatin-like proteins (TLPs) belonging to the PR5 family.
Two genes encoding chestnut cotyledon TLPs, termed CsTL1 and CsTL2, have been previously characterized. We substantiate here, for the first time, their protective role against blight disease, drawing on in silico, in vitro and in vivo evidence. The synergy between TLPs and other antifungal proteins, particularly endo-β-1,3-glucanases, bolsters their interest for future control strategies based on biotechnological approaches.
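The genetic-profiling step behind detecting synonymies, i.e. different variety names sharing one and the same multilocus microsatellite profile, can be sketched as follows. The variety names, loci and allele sizes below are invented for illustration and are not the study's data:

```python
from collections import defaultdict

# Hypothetical genotypes: accession name -> one (allele_a, allele_b)
# pair per microsatellite locus. Identical multilocus profiles under
# different names are synonymies; one name covering distinct profiles
# would be a homonymy.
genotypes = {
    "Parede":  ((182, 190), (101, 101), (210, 214)),
    "Courela": ((190, 182), (101, 101), (214, 210)),  # same profile -> synonym
    "Negral":  ((178, 190), (101, 105), (210, 210)),
}

def find_synonymies(genotypes):
    """Group accession names that share an identical multilocus profile."""
    by_profile = defaultdict(list)
    for name, profile in genotypes.items():
        # Sort alleles within each locus so (a, b) matches (b, a).
        key = tuple(tuple(sorted(locus)) for locus in profile)
        by_profile[key].append(name)
    return [names for names in by_profile.values() if len(names) > 1]

print(find_synonymies(genotypes))  # [['Parede', 'Courela']]
```

A real pipeline would also tolerate missing loci and genotyping error, but the grouping-by-profile idea is the core of replacing name-based classification with genetic profiles.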


Resumo:

The study brings new insights into the effect of hydrogen-assisted stress corrosion on the damage tolerance of a high-strength duplex stainless steel wire, in view of its potential use as active reinforcement for concrete prestressing. The adopted procedure was to establish experimentally the effect of hydrogen on the damage tolerance of cylindrical smooth and precracked wire specimens exposed to stress corrosion cracking in the aggressive medium of the standard test developed by FIP (International Prestressing Federation). Stress corrosion testing, mechanical fracture tests and scanning electron microscopy analysis allowed the damage to be assessed and explain the synergy between mechanical loading and environmental action in the failure sequence of the wire. In the presence of prior damage, hydrogen affects the wire's behavior qualitatively, consistent with the fracture anisotropy attributable to cold drawing, but it produces no quantitative changes, since the steel fully preserves its damage tolerance.


Resumo:

This doctoral thesis establishes, on scientific and technical criteria and as a first approximation, a methodology for evaluating the protection against natural hazards that the hydrological-forest restoration of mountain watersheds provides to their inhabitants and to those passing through them. The research is organized into three sections, which analyze: 1) the protection provided by forest covers, whether naturally regenerated or established by reforestation; 2) the protection achieved by the works executed in the watersheds themselves and their drainage channels, which in the context of hydrological-forest restoration are linked to the reforestations, so the latter are taken into account in their evaluation; and 3) the protection obtained from the synergies that arise as the reforestations and the works executed in the watershed are consolidated, in fulfillment of the hydrological-forest restoration project; these synergies are estimated according to the degree of accomplishment of the project's specific objectives. The role of forest covers in controlling natural hazards in the mountains was evaluated by: a) taking into account the research on the subject carried out over the last decade in the Alpine area; and b) analyzing the dasocratic (stand-management) characteristics of the forest covers under study and, on that basis, identifying the most representative parameters involved in the control of the main natural mountain hazards (torrential floods, avalanches, landslides and rockfalls).
The protection provided by the correction works was evaluated by treating the watersheds in which they are located as specific correction units, analyzing their behavior in the face of as many torrential events as possible (defined from all the precipitation records of the meteorological stations with the longest historical series located within, or closest to, the watershed in question), and then verifying the incidents that occurred in the watershed and the state in which the works were left. The evaluation of the synergies arising during the consolidation of the restoration project aimed to determine the degree of accomplishment of its main objectives, bearing in mind that the project's results, by their own dynamics, emerge in the medium and long term, an interval in which various imponderables may arise. In any case, the restoration of mountain watersheds does not imply the disappearance of all hazards, but rather their control and the consequent reduction of their effects. It is therefore necessary to maintain the reforestations and the works executed in them, so that they preserve the protective conditions originally designed. The methodology was applied in five settings in the Aragonese Pyrenees: three where hydrological-forest restoration works were carried out in the past (the watersheds draining into the Arratiecho and Arás torrents, and the Los Arañones site) and two with no intervention (the right-bank slope draining into the Canal Roya channel and the south-facing slope at the head of the Fondo de Pineta watershed), which serve as a contrast to the former.
ABSTRACT This Thesis establishes, as a first approximation and on scientific and technical criteria, a methodology to assess the protection against natural hazards that the hydrological-forest restoration of mountain watersheds provides to the people in them. The research is planned in three sections, which analyze: 1) the protection provided by the forest cover itself, whether it comes from natural regeneration or from reforestation; 2) the protection provided by the works executed within the watersheds and in the drainage channels which, being bound together with the reforestations of hydrological-forest restoration, are assessed jointly with them; and 3) the protection provided by the synergies that arise during the consolidation of the reforestations and of the works executed in the watersheds, as envisaged by the hydrological-forest restoration project; this last contribution is estimated according to the degree of accomplishment of the project's specific objectives. The role of forest covers in the control of natural hazards in the mountains has been assessed by: a) taking into account the research on the topic developed over the last decade in the Alpine area, and b) analyzing the dasocratic (stand-management) characteristics of the forest covers and identifying the most representative parameters involved in the control of the main natural mountain hazards (torrential floods, avalanches, landslides and rockfalls). The protection supplied by the correction works has been assessed by considering each watershed as a specific correction unit and analyzing its behavior in the face of as many torrential events as possible. These events were defined from the precipitation recorded at the meteorological stations with the longest historical series located within, or closest to, the watershed. The incidents that occurred in the watershed and the state of the works were then verified.
The degree of accomplishment of the main objectives has been determined through the evaluation of the synergies that arose during the restoration project. It has to be taken into account that the project has its own dynamics and that its results appear in the medium and long term, a period in which unforeseen events may arise. In any case, the restoration of mountain watersheds does not imply the disappearance of all hazards, but rather their control and the reduction of their effects. Maintenance of the reforestations and of the executed works is therefore necessary to preserve the protective conditions originally designed. The methodology has been applied in five settings in the Aragonese Pyrenees: three in which hydrological-forest restoration works were executed in the past (the watersheds of the Arratiecho and Arás torrents, and the Los Arañones site), and two others without any intervention, which serve as a contrast (the right-bank slope of Canal Roya and the south-facing slope at the head of the Pineta valley).