557 results for Ensembles semilinéaires


Relevance:

10.00%

Publisher:

Abstract:

Modern cloud-based applications and infrastructures may include resources and services (components) from multiple cloud providers; they are heterogeneous by nature and require adjustment, composition and integration. Current static, predefined cloud integration architectures and models can meet specific application requirements only with difficulty. In this paper, we propose the Intercloud Operations and Management Framework (ICOMF) as part of the more general Intercloud Architecture Framework (ICAF), which provides a basis for building and operating a dynamically manageable multi-provider cloud ecosystem. The proposed ICOMF enables dynamic resource composition and decomposition, with a main focus on translating business models and objectives into ensembles of cloud services. Our model is user-centric and focuses on the specific application execution requirements, leveraging incubating virtualization techniques. From a cloud provider perspective, the ecosystem provides more insight into how best to customize the offerings of virtualized resources.

Relevance:

10.00%

Publisher:

Abstract:

A key to understanding the dynamic response to large tropical volcanic eruptions is the warming of the tropical lower stratosphere and the concomitant intensification of the polar vortices. Although this mechanism is reproduced by most general circulation models today, most models still fail to produce an appropriate winter warming pattern in the Northern Hemisphere. In this study, ensemble sensitivity experiments were carried out with a coupled atmosphere-ocean model to assess the influence of different ozone climatologies on atmospheric dynamics and, in particular, on the Northern Hemisphere winter warming. The ensemble experiments were perturbed by a single Tambora-like eruption. Larger meridional gradients in lower-stratospheric ozone favor the coupling of zonal wind anomalies between the stratosphere and the troposphere after the eruption. The associated sea level pressure, temperature and precipitation patterns are more pronounced, and the Northern Hemisphere winter warming is highly significant. Conversely, weaker meridional ozone gradients lead to a weaker winter warming response and weaker associated patterns. The differences in the number of stratosphere-troposphere coupling events between the ensemble experiments indicate a nonlinear response of the dynamics with respect to the ozone and the volcanic forcing.

Relevance:

10.00%

Publisher:

Abstract:

Clays and claystones are used as backfill and barrier materials in the design of waste repositories, because they act as hydraulic barriers and retain contaminants. Transport through such barriers occurs mainly by molecular diffusion. There is thus an interest in relating the diffusion properties of clays to their structural properties. In previous work, we developed a concept for up-scaling pore-scale molecular diffusion coefficients using a grid-based model of the sample pore structure. Here we present an operational algorithm that can generate such model pore structures of polymineral materials. The obtained pore maps match the rock's mineralogical components and its macroscopic properties such as porosity, grain and pore size distributions. Representative ensembles of grains in 2D or 3D are created by a lattice Monte Carlo (MC) method, which minimizes the interfacial energy of grains starting from an initial grain distribution. Pores are generated at grain boundaries and/or within grains. The method is general and allows anisotropic structures to be generated with grains of approximately predetermined shapes, or with mixtures of different grain types. A specific focus of this study was the simulation of clay-like materials. The generated clay pore maps were then used to derive upscaled effective diffusion coefficients for non-sorbing tracers using a homogenization technique. The large number of generated maps allowed us to check the relations between the microstructural features of clays and their effective transport parameters, as is required to explain and extrapolate experimental diffusion results. As examples, we present a set of 2D and 3D simulations and investigate the effects of nanopores within particles (interlayer pores) and micropores between particles. Archie's simple power law is followed in systems with only micropores. When nanopores are present, additional parameters are required; the data reveal that effective diffusion coefficients can be described by a sum of two power functions, related to the micro- and nanoporosity. We further used the model to investigate the relationship between particle orientation and the effective transport properties of the sample.
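The scaling relations mentioned above can be written compactly. The following is a hedged illustration only: the prefactors a, b and exponents m, n stand for generic fitted parameters and are not values reported by the study.

```latex
% Archie-type scaling of the effective diffusion coefficient D_e for a
% non-sorbing tracer with aqueous diffusivity D_0 (illustrative notation).
\[
  D_e \;=\; D_0\,\phi_{\mathrm{micro}}^{\,m}
  \qquad\text{(micropores only, Archie's law)}
\]
\[
  D_e \;=\; D_0\left(a\,\phi_{\mathrm{micro}}^{\,m} + b\,\phi_{\mathrm{nano}}^{\,n}\right)
  \qquad\text{(micro- and nanoporosity, sum of two power functions)}
\]
```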

Relevance:

10.00%

Publisher:

Abstract:

While the analysis and interpretation of structural epileptogenic lesions is an essential task for the neuroradiologist in clinical practice, a substantial body of epilepsy research has shown that focal lesions influence brain areas beyond the epileptogenic lesion, across ensembles of functionally and anatomically connected brain areas. In this review article, we aim to provide an overview of altered network composition in epilepsy, as measured with advanced multimodal noninvasive neuroimaging techniques that characterize the initiation and spread of epileptic activity in the brain. We focus on resting-state functional magnetic resonance imaging (fMRI) and simultaneous electroencephalography/fMRI, and contrast the findings in idiopathic generalized versus focal epilepsies. These data indicate that circumscribed epileptogenic lesions can have extended effects on many brain systems. Although epileptic seizures may involve various brain areas, seizure activity does not spread diffusely throughout the brain but propagates along specific anatomic pathways that characterize the underlying epilepsy syndrome. Such a functionally oriented approach may help to better understand a range of clinical phenomena, such as the type of cognitive impairment, the development of pharmacoresistance, the propagation pathways of seizures, or the success of epilepsy surgery.

Relevance:

10.00%

Publisher:

Abstract:

Sustainable water resources management depends on sound information about the impacts of climate change. This information is, however, not easily derived, because natural runoff variability interferes with the climate change signal. This study presents a procedure that yields robust estimates of the magnitude and Time Of Emergence (TOE) of climate-induced hydrological change while accounting for the natural variability contained in the time series. First, the natural variability of 189 mesoscale catchments in Switzerland is sampled for 10 ENSEMBLES scenarios for the control period (1984–2005) and two scenario periods (near future: 2025–2046, far future: 2074–2095) by applying a bootstrap procedure. Then, the sampling distributions of mean monthly runoff are tested for significant differences with the Wilcoxon-Mann-Whitney test and for effect size with Cliff's delta d. Finally, the TOE of a climate-change-induced hydrological change is declared when at least eight out of the ten hydrological projections differ significantly from natural variability. The results show that the TOE occurs in the near-future period, except for high-elevation catchments in late summer. The significant hydrological projections in the near future correspond, however, to only minor runoff changes. In the far future, hydrological change is statistically significant and runoff changes are substantial. Temperature change is the most important factor determining hydrological change in this mountainous region; hydrological change therefore depends strongly on a catchment's mean elevation. The finding that hydrological changes are already robust in the near future highlights the importance of accounting for these changes in water resources planning.
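A minimal sketch of the detection step described above, assuming synthetic bootstrap samples: only the eight-out-of-ten rule and the choice of test and effect size are taken from the text; the function names, data and the effect-size cutoff are illustrative placeholders.

```python
# Hedged sketch: test whether at least 8 of 10 hydrological projections differ
# significantly (and with non-negligible effect size) from natural variability.
import numpy as np
from scipy.stats import mannwhitneyu

def cliffs_delta(x, y):
    """Cliff's delta: P(x > y) - P(x < y), a nonparametric effect size."""
    x, y = np.asarray(x), np.asarray(y)
    greater = np.sum(x[:, None] > y[None, :])
    less = np.sum(x[:, None] < y[None, :])
    return (greater - less) / (x.size * y.size)

def change_emerged(control, projections, alpha=0.05, min_significant=8):
    """control: bootstrap sample of mean monthly runoff (natural variability).
    projections: list of 10 bootstrap samples, one per climate scenario."""
    n_sig = 0
    for proj in projections:
        _, p = mannwhitneyu(proj, control, alternative="two-sided")
        # 0.33 is an illustrative "non-negligible" effect-size cutoff
        if p < alpha and abs(cliffs_delta(proj, control)) > 0.33:
            n_sig += 1
    return n_sig >= min_significant

# Illustrative use with synthetic runoff samples (m^3/s)
rng = np.random.default_rng(0)
control = rng.normal(10.0, 1.0, 500)
projections = [rng.normal(8.5, 1.0, 500) for _ in range(10)]
print(change_emerged(control, projections))
```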

Relevance:

10.00%

Publisher:

Abstract:

Gravity field parameters are usually determined from observations of the GRACE satellite mission together with arc-specific parameters in a generalized orbit determination process. When the estimation of gravity field parameters is separated from the determination of the satellites' orbits, correlations between orbit parameters and gravity field coefficients are ignored and the latter are biased towards the a priori force model. We are thus confronted with a kind of hidden regularization. To decipher the underlying mechanisms, the Celestial Mechanics Approach is complemented by tools to modify the impact of the pseudo-stochastic arc-specific parameters at the normal-equation level and to efficiently generate ensembles of solutions. By introducing a time-variable a priori model and solving for hourly pseudo-stochastic accelerations, a significant reduction of noisy striping in the monthly solutions can be achieved. Setting up more frequent pseudo-stochastic parameters results in a further reduction of the noise, but also in a notable damping of the observed geophysical signals. To quantify the effect of the a priori model on the monthly solutions, the process of fixing the orbit parameters is replaced by an equivalent introduction of special pseudo-observations, i.e., by explicit regularization. The contribution of the a priori information introduced in this way is determined by a contribution analysis. The presented mechanism is universally valid: it may be used to separate any subset of parameters by pseudo-observations of a special design and to quantify the damage imposed on the solution.
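The replacement of fixed parameters by explicit pseudo-observations can be illustrated on a toy normal-equation system. This is a hedged numpy sketch under generic least-squares assumptions; the matrices, weights and the contribution measure shown are illustrative and not the actual Celestial Mechanics Approach implementation.

```python
# Hedged sketch: constrain a parameter subset via pseudo-observations added to
# the normal equations, then measure how much of each parameter is determined
# by that a priori information (contribution analysis, values in [0, 1]).
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(200, 12))                 # design matrix (obs x parameters)
y = A @ rng.normal(size=12) + 0.01 * rng.normal(size=200)

N = A.T @ A                                    # normal-equation matrix
b = A.T @ y

# Pseudo-observations: pull the last 4 parameters (e.g. pseudo-stochastic
# accelerations) towards an a priori value x0 with weight w (illustrative).
idx = np.arange(8, 12)
x0 = np.zeros(12)
w = 1e3
P = np.zeros_like(N)
P[idx, idx] = w                                # weight matrix of pseudo-observations

N_tot = N + P
x_hat = np.linalg.solve(N_tot, b + P @ x0)     # regularized solution

# Share of each parameter determined by the pseudo-observations rather than data
contribution = np.diag(np.linalg.solve(N_tot, P))
print(np.round(contribution, 3))
```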

Relevance:

10.00%

Publisher:

Abstract:

The Arctic sea ice cover declined over the last few decades and reached a record minimum in 2007, with a slight recovery thereafter. Inspired by this, the authors investigate the response of atmospheric and oceanic properties to a 1-yr period of reduced sea ice cover. Two ensembles of equilibrium and transient simulations are produced with the Community Climate System Model. A sea ice change is induced through an albedo change applied for one year. The sea ice area and thickness recover in both ensembles after 3 and 5 yr, respectively. The sea ice anomaly leads to changes in ocean temperature and salinity to a depth of about 200 m in the Arctic Basin. Furthermore, the salinity and temperature changes in the surface layer trigger a “Great Salinity Anomaly” in the North Atlantic that takes roughly 8 yr to travel across the North Atlantic back to high latitudes. In the atmosphere, the changes induced by the sea ice anomaly do not last as long as in the ocean. The responses in the transient and equilibrium simulations, while similar overall, differ in specific regional and temporal details. The surface air temperature increases over the Arctic Basin, and the anomaly extends through the whole atmospheric column, changing the geopotential height fields and thus the storm tracks. The patterns of warming, and thus the position of the geopotential height changes, vary between the two ensembles: the equilibrium simulation shifts the storm tracks south over the eastern North Atlantic and Europe, whereas the transient simulation shifts them south over the western North Atlantic and North America. The authors propose that the overall reduction in sea ice cover is important for producing ocean anomalies, whereas for atmospheric anomalies the regional location of the sea ice anomalies is more important. While observed trends in Arctic sea ice are large and exceed those simulated by comprehensive climate models, there is little evidence, based on this particular model, that a seasonal loss of sea ice (e.g., as occurred in 2007) would constitute a threshold after which the Arctic would exhibit nonlinear, irreversible, or strongly accelerated sea ice loss. Caution should be exercised when extrapolating short-term trends to future sea ice behavior.

Relevance:

10.00%

Publisher:

Abstract:

Point Distribution Models (PDMs) are among the most popular shape description techniques, and their usefulness has been demonstrated in a wide variety of medical imaging applications. However, to adequately characterize the underlying modeled population it is essential to have a representative number of training samples, which is not always possible. This problem becomes especially relevant as the complexity of the modeled structure increases, with the modeling of ensembles of multiple 3D organs being one of the most challenging cases. In this paper, we introduce a new GEneralized Multi-resolution PDM (GEM-PDM) in the context of multi-organ analysis, able to efficiently characterize the different inter-object relations as well as the particular locality of each object separately. Importantly, unlike previous approaches, the configuration of the algorithm is automated thanks to a new agglomerative landmark clustering method proposed here, which also allows us to identify smaller, anatomically significant regions within organs. The significant advantage of GEM-PDM over two previous approaches (PDM and hierarchical PDM), in terms of shape modeling accuracy and robustness to noise, has been verified on two different databases of sets of multiple organs: six subcortical brain structures and seven abdominal organs. Finally, we propose the integration of the new shape modeling framework into an active-shape-model-based segmentation algorithm. The resulting algorithm, named GEMA, provides better overall performance than the two classical approaches tested, ASM and hierarchical ASM, when applied to the segmentation of 3D brain MRI.
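For orientation, here is a minimal sketch of the classical single-resolution PDM that GEM-PDM generalizes (pre-aligned landmarks, PCA modes of variation). The data, the variance cutoff and the function names are illustrative, and the multi-resolution landmark clustering of GEM-PDM itself is not shown.

```python
# Minimal Point Distribution Model: stack aligned landmark sets into vectors,
# run a PCA, and synthesize new shapes as mean + P @ b.
import numpy as np

def build_pdm(shapes, variance_kept=0.95):
    """shapes: array (n_samples, n_landmarks, 3) of pre-aligned landmarks."""
    X = shapes.reshape(shapes.shape[0], -1)          # (n_samples, 3*n_landmarks)
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    var = s**2 / (X.shape[0] - 1)                    # eigenvalues of the covariance
    k = np.searchsorted(np.cumsum(var) / var.sum(), variance_kept) + 1
    return mean, Vt[:k].T, var[:k]                   # mean shape, modes P, eigenvalues

def synthesize(mean, P, b):
    """New shape instance x = mean + P @ b for mode weights b."""
    return (mean + P @ b).reshape(-1, 3)

# Illustrative use with random "shapes"
rng = np.random.default_rng(0)
shapes = rng.normal(size=(40, 100, 3))
mean, P, lam = build_pdm(shapes)
new_shape = synthesize(mean, P, np.sqrt(lam) * rng.normal(size=lam.size))
```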

Relevance:

10.00%

Publisher:

Abstract:

We show how a test of macroscopic realism based on Leggett-Garg inequalities (LGIs) can be performed in a macroscopic system. Using a continuous-variable approach, we consider quantum nondemolition (QND) measurements applied to atomic ensembles undergoing magnetically driven coherent oscillation. We identify measurement schemes requiring only Gaussian states as inputs and giving a significant LGI violation with realistic experimental parameters and imperfections. The predicted violation is shown to be due to true quantum effects rather than to a classical invasivity of the measurement. Using QND measurements to tighten the “clumsiness loophole” forces the stubborn macrorealist to recreate quantum backaction in his or her account of measurement.
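The abstract does not state which LGI is used; for orientation, the simplest three-time Leggett-Garg inequality for a dichotomic observable is shown below. The continuous-variable scheme above may use a different form and different measurement times.

```latex
% Simplest three-time Leggett-Garg inequality for Q(t) = ±1, with two-time
% correlators C_ij = <Q(t_i) Q(t_j)> (shown for orientation only).
\[
  K_3 \;\equiv\; C_{21} + C_{32} - C_{31} \;\le\; 1 ,
  \qquad C_{ij} = \langle Q(t_i)\,Q(t_j)\rangle .
\]
```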

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND Tight spatio-temporal signaling of cytoskeletal and adhesion dynamics is required for the localized membrane protrusion that drives directed cell migration. Different ensembles of proteins are therefore likely to be recruited and phosphorylated in membrane protrusions in response to specific cues. RESULTS Here, we use an assay that allows us to biochemically purify extending protrusions of cells migrating in response to three prototypical receptors: integrins, receptor tyrosine kinases and G-protein-coupled receptors. Using quantitative proteomics and phospho-proteomics approaches, we provide evidence for the existence of cue-specific, spatially distinct protein networks in the different cell migration modes. CONCLUSIONS The integrated analysis of the large-scale experimental data with protein information from databases allows us to understand some emergent properties of the spatial regulation of signaling during cell migration. This provides the cell migration community with a large-scale view of the distribution of proteins and phospho-proteins regulating directed cell migration.

Relevance:

10.00%

Publisher:

Abstract:

The use of laser beams as excitation sources for the characterization of semiconductor nanowires (NWs) is widespread; Raman spectroscopy and photoluminescence (PL) are routinely applied to the study of NWs. However, NWs are systems with poor thermal conductivity and poor heat dissipation, which results in unintentional heating under excitation with a focused laser beam of microscopic size, such as those usually used in micro-Raman and micro-PL experiments. In addition, NWs have a subwavelength diameter, which changes the optical absorption with respect to that of bulk materials. Furthermore, the NW diameter is smaller than the laser beam spot, which means that the optical power absorbed by the NW depends on its position inside the laser spot. A detailed analysis of the interaction between a microscopically focused laser beam and semiconductor NWs is therefore necessary for understanding experiments involving laser excitation of NWs. We present in this work a numerical analysis of thermal transport in Si NWs, where the heat source is the laser energy locally absorbed by the NW. This analysis takes into account the optical absorption, the thermal conductivity, the dimensions (diameter and length) of the NWs, and the immersion medium. Both free-standing and heat-sunk NWs are considered, and the temperature distribution in ensembles of NWs is also discussed. This analysis is intended to serve as a tool for understanding the thermal phenomena induced by laser beams in semiconductor NWs.
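A hedged one-dimensional sketch of the heat balance just described, for a heat-sunk nanowire with a Gaussian laser source. The material constants, geometry and the assumption of a constant (non-size-dependent) thermal conductivity are illustrative only, not parameters from the study.

```python
# Steady-state 1-D conduction along a nanowire, -k A d^2T/dx^2 = q(x), with the
# absorbed laser power modeled as a Gaussian line source (illustrative values).
import numpy as np

L = 5e-6                     # NW length (m)
d = 100e-9                   # NW diameter (m)
A = np.pi * (d / 2) ** 2     # cross-section (m^2)
k = 50.0                     # thermal conductivity (W/m/K), illustrative
P_abs = 50e-6                # absorbed laser power (W), illustrative
x0, w = 2.5e-6, 0.5e-6       # laser spot position and radius (m)

n = 500
x = np.linspace(0.0, L, n)
dx = x[1] - x[0]
q = np.exp(-((x - x0) / w) ** 2)
q *= P_abs / np.trapz(q, x)          # absorbed power per unit length (W/m)

# Finite differences: base clamped at the heat sink, insulated free tip.
M = np.zeros((n, n))
rhs = np.zeros(n)
M[0, 0] = 1.0; rhs[0] = 300.0        # heat-sunk base at 300 K
for i in range(1, n - 1):
    M[i, i - 1] = M[i, i + 1] = 1.0
    M[i, i] = -2.0
    rhs[i] = -q[i] * dx**2 / (k * A)
M[-1, -1], M[-1, -2] = 1.0, -1.0     # zero-flux tip (dT/dx = 0)
rhs[-1] = 0.0

T = np.linalg.solve(M, rhs)
print(f"maximum temperature rise: {T.max() - 300.0:.1f} K")
```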

Relevance:

10.00%

Publisher:

Abstract:

It is well known that winter chilling is necessary for the flowering of temperate trees, and the chilling requirement is a criterion for choosing a species or variety at a given location. Chemical products can also be used to reduce chilling-hour requirements, but they make production more expensive. This study first analysed the observed values of chilling hours for some representative agricultural locations in Spain over the last three decades and their projected changes under climate change scenarios. Chilling is usually measured and calculated as chilling hours, and different methods have been used to calculate them (e.g. Richardson et al., 1974, among others) according to the species considered. For our purposes, the North Carolina method (Shaltout and Unrath, 1983) was applied to apples, the Utah method (Richardson et al., 1974) to peach and grapevine, and the approach of De Melo-Abreu et al. (2004) to olive trees. The influence of climate change on temperate trees was studied by calculating projections of chilling hours with climate data from Regional Climate Models (RCMs) at high resolution (25 km) from the European project ENSEMBLES (http://www.ensembles-eu.org/). These projections allow the modelled variations of chilling hours between the second half of the 20th century and the first half of the 21st century to be analysed at the study locations.
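As a minimal illustration, the sketch below counts chilling hours with the simplest generic definition (hours between 0 and 7.2 °C). The Utah and North Carolina models cited above instead assign weighted, partly negative chill units per temperature band, so the thresholds here are generic assumptions, not model-specific values.

```python
# Hedged sketch of a basic chilling-hour count over one dormancy season.
import numpy as np

def chilling_hours(hourly_temp_c, t_low=0.0, t_high=7.2):
    """hourly_temp_c: array of hourly air temperatures (degC) for one season."""
    t = np.asarray(hourly_temp_c)
    return int(np.sum((t >= t_low) & (t <= t_high)))

# Illustrative use: one synthetic winter of hourly temperatures
rng = np.random.default_rng(0)
hours = 24 * 120                                    # roughly Nov-Feb
temps = 6 + 6 * np.sin(np.linspace(0, 240 * np.pi, hours)) + rng.normal(0, 2, hours)
print(chilling_hours(temps), "chilling hours")
```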

Relevance:

10.00%

Publisher:

Abstract:

This thesis focuses on the application of statistical mechanics to the study of static and jammed packings of soft granular media. Such an approach lies between micro- and macromechanics: it tries to establish what the expected macroscopic properties of a granular system are, starting from a micromechanical analysis of the features of the particles and the interactions between them, and considering the macroscopic constraints of the system. To do that, statistics is used together with some principles, concepts and definitions of continuum mechanics (e.g. stress and strain fields, elastic potential energy, etc.) as well as some homogenization techniques. The interaction between the particles of a granular system is also examined, and theories of contact and capillary forces (when the media are wet) are revisited. The basic idea of statistical mechanics is that among the solutions of a physical problem (e.g. the static arrangement of particles in mechanical equilibrium) there is a class that is compatible with our macroscopic knowledge of the system (volume, stress, elastic potential energy, ...). This class still contains an enormous number of solutions. In the absence of further information there is no a priori reason for favoring any one of them over the others. Hence we naturally construct the equilibrium function by assigning equal statistical weights to all the solutions compatible with our requirements. This procedure leads to the most probable statistical distribution of some quantities, but it is necessary to guarantee that all the solutions are equally accessible by the assembling procedure or protocol. This approach was originally set up for the study of ideal gases, but it can be extended to non-thermal systems as well. In this connection, the first attempt for granular systems was the volume ensemble, developed about 20 years ago. Since then, this model has been followed and improved upon by many researchers around the world, while two other ensembles have also been set up: the energy and the force-moment (i.e. stress multiplied by volume) ensembles. Each ensemble is described by different macroscopic constraints, but all of them result in a Maxwell-Boltzmann statistical distribution controlled precisely by the respective constraints. Building on this previous work, this thesis introduces the classical statistical mechanics approach and adapts it to the case of soft granular media. A general framework, which includes these three ensembles and uses a force-moment phase space and a density-of-states function, is proposed. This theoretical development is complemented by molecular dynamics (MD, or DEM) simulations of the cyclic compression of 2D granular systems. The simulations consider spring-dashpot mechanical interactions and, in some cases, the attractive capillary forces produced by water menisci. They were run on single and parallel processors. The results not only show that the statistical distributions of the force-moment components obtained with a specific protocol appear to be universal, but also that there are many computational issues that can determine which packings, or solutions, are actually attained.
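For orientation, the Edwards-type ensembles discussed above share a common Maxwell-Boltzmann-like form. The notation below (density of states Ω, tensorial temperature-like parameter α̂ conjugate to the force-moment Σ̂) follows the general literature on granular ensembles and is shown only as an illustration, not as the thesis' exact formulation.

```latex
% Generic force-moment (stress x volume) ensemble; the volume ensemble is
% recovered by replacing tr(alpha Sigma) with V/X, X being the compactivity.
\[
  P(\hat{\Sigma}) \;=\;
  \frac{\Omega(\hat{\Sigma})\,
        e^{-\operatorname{tr}(\hat{\alpha}\,\hat{\Sigma})}}
       {Z(\hat{\alpha})},
  \qquad
  Z(\hat{\alpha}) \;=\; \sum_{\text{packings}}
  \Omega(\hat{\Sigma})\, e^{-\operatorname{tr}(\hat{\alpha}\,\hat{\Sigma})} .
\]
```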

Relevance:

10.00%

Publisher:

Abstract:

The analysis of large amounts of data is a field with many years of research behind it. It is centred on extracting significant values so that data become easier to understand and interpret. The analysis of interdependence between time series is an important field of research, mainly as a result of advances in the characterization of dynamical systems from the signals they produce. In medicine, many studies try to understand the behaviour of the brain, its mode of operation and its internal connections. The human brain comprises approximately 10^11 neurons, each of which makes about 10^3 synaptic connections. This huge number of connections between individual processing elements provides the fundamental substrate for neuronal ensembles to become transiently synchronized or functionally connected. A similar complex network configuration and dynamics can also be found at the macroscopic scales of systems neuroscience and brain imaging. The emergence of dynamically coupled cell assemblies represents the neurophysiological substrate for cognitive functions such as perception, learning and thinking. Understanding the complex network organization of the brain on the basis of neuroimaging data represents one of the most impervious challenges for systems neuroscience. Brain connectivity is an elusive concept that refers to different interrelated aspects of brain organization: structural connectivity, functional connectivity (FC) and effective connectivity (EC). Structural connectivity refers to a network of physical connections linking sets of neurons; it is the anatomical structure of brain networks. FC, in contrast, refers to the statistical dependence between the signals stemming from two distinct units within a nervous system, while EC refers to the causal interactions between them. This research opens the door to addressing diseases related to the brain, such as Parkinson's disease, senile dementia and mild cognitive impairment. One of the most important projects associated with research on Alzheimer's and other diseases is the European project called Blue Brain. The Center for Biomedical Technology (CTB) of Universidad Politecnica de Madrid (UPM) is part of the project. The CTB researchers have developed a magnetoencephalography (MEG) data processing tool that allows data to be visualised and analysed in an intuitive way. This tool is named HERMES, and it is presented in this document.

Relevance:

10.00%

Publisher:

Abstract:

There are about 10^14 neuronal synapses in the human brain. This huge number of connections provides the substrate for neuronal ensembles to become transiently synchronized, producing the emergence of cognitive functions such as perception, learning or thinking. Understanding the organization of this complex brain network on the basis of neuroimaging data represents one of the most important and exciting challenges for systems neuroscience. Several measures have been proposed recently to evaluate, at various scales (single cells, cortical columns, or brain areas), how the different parts of the brain communicate. We can classify them, according to their symmetry, into two groups: symmetric measures, such as correlation, coherence or phase synchronization indexes, evaluate functional connectivity (FC); asymmetric measures, such as Granger causality or transfer entropy, are able to detect the direction of the interaction, which we call effective connectivity (EC). In modern neuroscience, interest in functional brain networks has increased strongly, largely because of the advent of new algorithms to analyse interdependence between time series, the emerging theory of complex networks, and the introduction of powerful techniques, such as magnetoencephalography (MEG), to record neurophysiological data with high resolution. However, this is a young field with several unresolved methodological questions, some of which are addressed in this thesis. First, the growing number of approaches for determining the existence of FC/EC between two or more time series, together with their mathematical complexity, makes it desirable to organize them all in an intuitive, easy-to-use software package. Here I present HERMES (http://hermes.ctb.upm.es), a MATLAB toolbox designed precisely for this purpose, which encompasses several of the most common indexes for the assessment of FC and EC. I believe that this tool will be of great help to all researchers working in the emerging field of brain connectivity analysis and will be of great value to the scientific community. The second practical issue addressed is the sensitivity to deep brain sources of two types of MEG sensors, planar gradiometers and magnetometers, combined with a methodological comparison of two phase synchronization indexes: the phase locking value (PLV) and the phase lag index (PLI), the latter being less sensitive to volume conduction effects. Comparing their behaviour when studying brain networks, magnetometers and PLV yield, respectively, more densely connected networks than planar gradiometers and PLI, owing to the artificial values created by volume conduction. However, when it comes to characterizing epileptic networks, PLV gives better results, because the networks obtained with PLI are very sparse. Complex network analysis has provided new concepts that improve the characterization of interacting dynamical systems: a network is considered to be composed of nodes, symbolizing systems, whose interactions are represented by edges, and its behaviour and topology can be characterized by a large number of measures. There is theoretical and empirical evidence that many of these measures are strongly correlated with each other; therefore, a small set that efficiently characterizes these networks and condenses the redundant information was selected. For the analysis of functional networks, selecting an appropriate threshold to decide whether a given connectivity value of the FC matrix is significant, and should be included in further analysis, becomes a crucial step. In this thesis, more accurate results were obtained by using a data-driven surrogate test to evaluate each edge individually than by setting a fixed a priori connection density. Finally, all these methodologies were applied to the study of epilepsy, analysing resting-state MEG functional networks in two groups of epileptic patients (idiopathic generalized and focal frontal epilepsy) compared with healthy control subjects. Epilepsy is one of the most common neurological disorders, with more than 55 million people affected worldwide; it is characterized by a predisposition to generate epileptic seizures of abnormal, excessive or synchronous neuronal activity, and it is therefore both a natural scenario for this type of analysis and of great interest from the clinical and research perspectives. The results reveal specific disruptions in connectivity and a change in network topology in epileptic brains, supporting the shift of emphasis from the 'focus' to the 'network' that is gaining importance in modern epilepsy research.
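A minimal sketch of the two phase-synchronization indexes compared above (PLV and PLI), using their textbook definitions computed from instantaneous phases. This is not HERMES code, and the synthetic signals are illustrative.

```python
# PLV and PLI from the instantaneous phase difference of two band-limited signals.
import numpy as np
from scipy.signal import hilbert

def plv_pli(x, y):
    """Return (PLV, PLI) for two 1-D signals of equal length."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))   # instantaneous phase difference
    plv = np.abs(np.mean(np.exp(1j * dphi)))             # PLV in [0, 1]
    pli = np.abs(np.mean(np.sign(np.sin(dphi))))         # PLI in [0, 1], discounts zero-lag coupling
    return plv, pli

# Illustrative use: two noisy signals sharing a phase-lagged 10 Hz component
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1e-3)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)
y = np.sin(2 * np.pi * 10 * t - 0.8) + 0.5 * rng.normal(size=t.size)
print(plv_pli(x, y))
```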