951 results for Complex networks. Magnetic system. Metropolis
Abstract:
The School of Industrial Engineering at Universidad Politécnica de Madrid (ETSII-UPM) has been promoting student-centred teaching-learning activities, in line with the aims of the Bologna Declaration, since well before the official establishment of the European Higher Education Area. These student-centred teaching-learning experiences have led us to the conviction that project-based learning is rewarding, both for students and academics, and should be further promoted in our new engineering programmes adapted to the Bachelor-Master structure. The level of commitment of our teachers to these activities is noteworthy: the teaching innovation experiences carried out over the last ten years have led to the foundation of 17 Teaching Innovation Groups at ETSII-UPM, which leads the ranking of teaching innovation among all UPM centres. Among other CDIO activities, our students have taken part in especially complex projects, including Formula Student, linked to the complete development of a competition car, and the Cybertech competition, aimed at the design, construction and operation of robots for different purposes. Additional project-based learning teamwork activities have been linked to toy design, the development of medical devices, the implementation of virtual laboratories and the design of complete industrial installations and factories, among other activities detailed in the present study. The implementation of the Bologna process will culminate at ETSII-UPM with the start of the Master’s Degree in Industrial Engineering in academic year 2014-15. The programme has been approved by the Spanish Agency for Accreditation (ANECA) and includes a set of subjects based on the CDIO methodology, collectively denominated “INGENIA”, from the Spanish “ingeniar” (to devise ingenious solutions), which is also related etymologically to “ingeniero” (engineer). INGENIA students will live through the complete development process of a complex product or system, with different kinds of projects covering most of the engineering majors at ETSII-UPM.
Abstract:
In recent years, the amount of real-time data being generated has increased. Sensors attached to things are transforming how we interact with our environment. Extracting meaningful information from these streams of data is essential for some application areas and requires processing systems that scale to varying conditions in data sources, complex queries, and system failures. This paper describes ongoing research on the development of a scalable RDF streaming engine.
Abstract:
Directed hypergraphs have been used in problems related to propositional logic, relational databases, computational linguistics and machine learning. They have also been used as an alternative to directed (bipartite) graphs to facilitate the study of the interactions between components of complex systems that cannot easily be modelled using binary relations alone. In this context, this type of representation is known as a hyper-network. A directed hypergraph is a generalization of a directed graph that is especially well suited to representing many-to-many relationships. While an edge in a directed graph defines a relation between two of its nodes, a hyperedge in a directed hypergraph defines a relation between two sets of its nodes. Strong connectivity is an equivalence relation that partitions the set of nodes of a directed hypergraph, and each part defines an equivalence class known as a strongly-connected component. Studying the strongly-connected components of a directed hypergraph can help us better understand the structure of this type of hypergraph when its size is considerable. In the case of directed graphs, there are very efficient algorithms for computing the strongly-connected components of large graphs. Thanks to these algorithms, it has been shown that the structure of the WWW has a “bow-tie” shape, in which more than 70% of the nodes are distributed over three large sets, one of which is a strongly-connected component. This type of structure has also been observed in complex networks in other areas such as biology. Studies of a similar nature had not been possible for directed hypergraphs because no algorithms existed for computing the strongly-connected components of this type of hypergraph. In this doctoral thesis, we have investigated how to compute the strongly-connected components of a directed hypergraph. Specifically, we have developed two algorithms for this problem, proved their correctness and determined their computational complexity. Both algorithms have been evaluated empirically to compare their running times. For the evaluation, we produced a selection of randomly generated directed hypergraphs inspired by well-known random graph models such as Erdős-Rényi, Newman-Watts-Strogatz and Barabási-Albert. Several optimizations for both algorithms have been implemented and analysed in the thesis. In particular, collapsing the strongly-connected components of the directed graph that can be constructed by removing certain complex hyperedges from the original directed hypergraph notably improves the running times of the algorithms for several of the hypergraphs used in the evaluation. Apart from the application examples mentioned above, directed hypergraphs have also been used in the area of knowledge representation. Specifically, this type of hypergraph has been used to compute modules of ontologies. An ontology can be defined as a set of axioms that formally specifies a set of symbols and their relationships, while a module can be understood as a subset of the axioms of the ontology that captures all of the knowledge the ontology holds about a specific set of symbols and their relationships.
In the thesis we have focused only on modules computed using the syntactic locality technique. Since ontologies can be very large, computing modules can facilitate the reuse and maintenance of those ontologies. However, analysing all possible modules of an ontology is, in general, very costly because the number of modules grows exponentially with the number of symbols and axioms of the ontology. Fortunately, the axioms of an ontology can be divided into partitions known as atoms. Each atom represents a maximal set of axioms that always appear together in a module. The atomic decomposition of an ontology is defined as a directed graph in which each node corresponds to an atom and each edge defines a dependency between a pair of atoms. In this thesis we introduce the concept of an “axiom dependency hypergraph”, which generalizes the atomic decomposition of an ontology. A module of an ontology corresponds to a connected component in this type of hypergraph, and an atom of an ontology to a strongly-connected component. We have adapted the implementation of our algorithms so that they also work with axiom dependency hypergraphs and can thus compute the atoms of an ontology. To demonstrate the viability of this idea, we have incorporated our algorithms into an application we have developed for extracting modules and computing the atomic decomposition of ontologies. We have called the application HyS, and we have studied its running times using a selection of well-known ontologies from the biomedical domain, most of which are available in the NCBO BioPortal. The results of the evaluation show that the running times of HyS are much better than those of the fastest known implementations. ABSTRACT Directed hypergraphs are an intuitive modelling formalism that has been used in problems related to propositional logic, relational databases, computational linguistics and machine learning. Directed hypergraphs are also presented as an alternative to directed (bipartite) graphs to facilitate the study of the interactions between components of complex systems that cannot naturally be modelled as binary relations. In this context, they are known as hyper-networks. A directed hypergraph is a generalization of a directed graph suitable for representing many-to-many relationships. While an edge in a directed graph defines a relation between two nodes of the graph, a hyperedge in a directed hypergraph defines a relation between two sets of nodes. Strong connectivity is an equivalence relation that induces a partition of the set of nodes of a directed hypergraph into strongly-connected components. These components can be collapsed into single nodes. As a result, the size of the original hypergraph can be significantly reduced if the strongly-connected components have many nodes. This approach can contribute to a better understanding of how the nodes of a hypergraph are connected, in particular when the hypergraphs are large. In the case of directed graphs, there are efficient algorithms that can be used to compute the strongly-connected components of large graphs. For instance, it has been shown that the macroscopic structure of the World Wide Web can be represented as a “bow-tie” diagram in which more than 70% of the nodes are distributed into three large sets, one of which is a large strongly-connected component.
This particular structure has also been observed in complex networks in other fields, such as biology. Similar studies could not be conducted on directed hypergraphs because no algorithm existed for computing the strongly-connected components of such hypergraphs. In this thesis, we investigate ways to compute the strongly-connected components of directed hypergraphs. We present two new algorithms and we show their correctness and computational complexity. One of these algorithms is inspired by Tarjan’s algorithm for directed graphs. The second algorithm follows a simple approach to compute the strongly-connected components. This approach is based on the fact that two nodes of a graph that are strongly-connected also reach the same set of nodes; in other words, the connected component of each node is the same. Both algorithms are empirically evaluated to compare their performance. To this end, we have produced a selection of random directed hypergraphs inspired by existing and well-known random graph models like Erdős-Rényi and Newman-Watts-Strogatz. Besides the application examples mentioned earlier, directed hypergraphs have also been employed in the field of knowledge representation. In particular, they have been used to compute the modules of an ontology. An ontology is defined as a collection of axioms that provides a formal specification of a set of terms and their relationships; and a module is a subset of an ontology that completely captures the meaning of certain terms as defined in the ontology. In particular, we focus on modules computed using the notion of syntactic locality. As ontologies can be very large, the computation of modules facilitates the reuse and maintenance of these ontologies. Analysing all modules of an ontology, however, is in general not feasible because the number of modules grows exponentially in the number of terms and axioms of the ontology. Nevertheless, the modules can succinctly be represented using the atomic decomposition of an ontology. Using this representation, an ontology can be partitioned into atoms, which are maximal sets of axioms that co-occur in every module. The atomic decomposition is then defined as a directed graph such that each node corresponds to an atom and each edge represents a dependency relation between two atoms. In this thesis, we introduce the notion of an axiom dependency hypergraph, which is a generalization of the atomic decomposition of an ontology. A module in the ontology corresponds to a connected component in the hypergraph, and the atoms of the ontology to the strongly-connected components. We apply our algorithms for directed hypergraphs to axiom dependency hypergraphs and, in this manner, compute the atoms of an ontology. To demonstrate the viability of this approach, we have implemented the algorithms in the application HyS, which computes the modules of ontologies and calculates their atomic decomposition. In the thesis, we provide an experimental evaluation of HyS with a selection of large and prominent biomedical ontologies, most of which are available in the NCBO BioPortal. HyS outperforms state-of-the-art implementations in the tasks of extracting modules and computing the atomic decomposition of these ontologies.
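The reachability-based idea behind the second algorithm can be illustrated with a short, self-contained sketch. The Python snippet below is a naive illustration only, written under the assumption of B-reachability (a hyperedge with tail T and head H can "fire" only once every node of T has been reached); it is not the Tarjan-inspired algorithm, the optimized implementations, or HyS itself, and the toy hypergraph is hypothetical.

def b_reachable(node, hyperedges):
    """Forward B-reachability: a hyperedge (tail, head) fires only once
    every node in its tail has already been reached."""
    reached = {node}
    changed = True
    while changed:
        changed = False
        for tail, head in hyperedges:
            if tail <= reached and not head <= reached:
                reached |= head
                changed = True
    return reached

def strongly_connected_components(nodes, hyperedges):
    """Naive SCC computation: u and v share a component iff each is
    reachable from the other."""
    reach = {v: b_reachable(v, hyperedges) for v in nodes}
    components, assigned = [], set()
    for v in nodes:
        if v in assigned:
            continue
        comp = {u for u in nodes if u in reach[v] and v in reach[u]}
        components.append(comp)
        assigned |= comp
    return components

# Toy directed hypergraph: hyperedges are (tail set, head set) pairs.
nodes = {1, 2, 3, 4}
hyperedges = [({1}, {2}), ({2}, {3}), ({3}, {1}), ({1, 2}, {4})]
print(strongly_connected_components(nodes, hyperedges))
# -> [{1, 2, 3}, {4}]

This quadratic-time sketch only conveys the definition; the point of the thesis algorithms is precisely to avoid computing a full reachability set per node.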
Abstract:
The activation of cyclin-dependent kinases (cdks) has been implicated in apoptosis induced by various stimuli. We find that the Fas-induced activation of cdc2 and cdk2 in Jurkat cells is not dependent on protein synthesis, which is shut down very early during apoptosis before caspase-3 activation. Instead, activation of these kinases seems to result from both a rapid cleavage of Wee1 (an inhibitory kinase of cdc2 and cdk2) and inactivation of anaphase-promoting complex (the specific system for cyclin degradation), in which CDC27 homolog is cleaved during apoptosis. Both Wee1 and CDC27 are shown to be substrates of the caspase-3-like protease. Although cdk activities are elevated during Fas-induced apoptosis in Jurkat cells, general activation of the mitotic processes does not occur. Our results do not support the idea that apoptosis is simply an aberrant mitosis but, instead, suggest that a subset of mitotic mechanisms plays an important role in apoptosis through elevated cdk activities.
Abstract:
Intracellular transport is essential for morphogenesis and functioning of the cell. The kinesin superfamily proteins (KIFs) have been shown to transport membranous organelles and protein complexes in a microtubule- and ATP-dependent manner. More than 30 KIFs have been reported in mice. However, the nomenclature of KIFs has not been clearly established, resulting in various designations and redundant names for a single KIF. Here, we report the identification and classification of all KIFs in mouse and human genome transcripts. Previously unidentified murine KIFs were found by a PCR-based search. The identification of all KIFs was confirmed by a database search of the total human genome. As a result, there are a total of 45 KIFs. The nomenclature of all KIFs is presented. To understand the function of KIFs in intracellular transport in a single tissue, we focused on the brain. The expression of 38 KIFs was detected in brain tissue by Northern blotting or PCR using cDNA. The brain, mainly composed of highly differentiated and polarized cells such as neurons and glia, requires a highly complex intracellular transport system as indicated by the increased number of KIFs for their sophisticated functions. It is becoming increasingly clear that the cell uses a number of KIFs and tightly controls the direction, destination, and velocity of transportation of various important functional molecules, including mRNA. This report will set the foundation of KIF and intracellular transport research.
Abstract:
We propose cotunneling as the microscopic mechanism that makes possible inelastic electron tunneling spectroscopy of magnetic atoms on surfaces for a wide range of systems, including single magnetic adatoms, molecules, and molecular stacks. We describe electronic transport between the scanning tip and the conducting surface through the magnetic system (MS) with a generalized Anderson model, without making use of effective spin models. Transport and spin dynamics are described with an effective cotunneling Hamiltonian in which the correlations in the magnetic system are calculated exactly and the coupling to the electrodes is included up to second order in the tip-MS and MS-substrate coupling. In the appropriate limit our approach is equivalent to the phenomenological Kondo exchange model that successfully describes the experiments. We apply our method to study in detail inelastic transport in two systems, stacks of cobalt phthalocyanines and a single Mn atom on Cu2N. Our method accounts for both the large contribution of the inelastic spin exchange events to the conductance and the observed conductance asymmetry.
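As a rough orientation, the generic structure of such a model can be written down schematically; the notation below is illustrative only and is not the authors' exact Hamiltonian. An Anderson-type model couples the magnetic system to tip and substrate electrodes, and treating the tunnelling amplitudes to second order (Schrieffer-Wolff style) yields an effective exchange coupling of Kondo form:

\[
H \;=\; H_{\mathrm{MS}}
  + \sum_{\alpha k \sigma} \epsilon_{\alpha k}\, c^{\dagger}_{\alpha k\sigma} c_{\alpha k\sigma}
  + \sum_{\alpha k \sigma} \bigl( V_{\alpha k}\, c^{\dagger}_{\alpha k\sigma} d_{\sigma} + \mathrm{h.c.} \bigr),
\qquad \alpha \in \{\mathrm{tip},\,\mathrm{substrate}\},
\]
\[
H_{\mathrm{eff}} \;\simeq\; \sum_{\alpha \alpha'} J_{\alpha\alpha'}\, \vec{S} \cdot \vec{s}_{\alpha\alpha'},
\qquad J_{\alpha\alpha'} \;\propto\; V_{\alpha} V_{\alpha'} \left( \frac{1}{U+\epsilon_d} - \frac{1}{\epsilon_d} \right),
\]

where \(\vec{S}\) is the spin of the magnetic system, \(\vec{s}_{\alpha\alpha'}\) the electrode electron spin density, and the last expression is the textbook single-level result with level energy \(\epsilon_d<0\) and charging energy \(U\); all symbols here are generic placeholders rather than the quantities defined in the paper.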
Abstract:
Digital magnetic recording is based on the storage of a bit of information in the orientation of a magnetic system with two stable ground states. Here we address two fundamental problems that arise when this is done on a quantized spin: quantum spin tunneling and backaction of the readout process. We show that fundamental differences exist between integer and half-integer spins when it comes to both reading and recording classical information in a quantized spin. Our findings imply fundamental limits to the miniaturization of magnetic bits and are relevant to recent experiments where a spin-polarized scanning tunneling microscope reads and records a classical bit in the spin orientation of a single magnetic atom.
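A standard textbook illustration of the integer versus half-integer distinction (not the authors' specific model) is a uniaxial spin with a transverse anisotropy term:

\[
H = -D S_z^{2} + E \bigl( S_x^{2} - S_y^{2} \bigr), \qquad D > 0, \;\; |E| \ll D .
\]

For integer S the transverse term connects the two classical orientations \(|{+}S\rangle\) and \(|{-}S\rangle\) and opens a tunnel splitting, so the stored orientation is not a true stationary state; for half-integer S, Kramers' theorem keeps the ground doublet exactly degenerate at zero magnetic field and this tunnelling channel is absent.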
Abstract:
The analysis of clusters has attracted considerable interest over the last few decades. The articulation of clusters into complex networks and systems of innovation, generally known as regional innovation systems, has, in particular, been associated with the delivery of greater innovation and growth. However, despite the growing economic and policy relevance of clusters, little systematic research has been conducted into their association with other factors promoting innovation and economic growth. This article addresses this issue by looking at the relationship between innovation and economic growth in 152 regions of Europe during the period between 1995 and 2006. Using an econometric model with a static and a dynamic dimension, the results of the analysis highlight that: a) regional growth through innovation in Europe is fundamentally connected to the presence of an adequate socioeconomic environment and, in particular, to the existence of a well-trained and educated pool of workers; b) the presence of clusters matters for regional growth, but only in combination with a good ‘social filter’, and this association wanes over time; c) more traditional R&D variables have a weak initial connection to economic development, but this connection increases over time and is, once again, contingent on the existence of adequate socioeconomic conditions.
Abstract:
Consider a network of unreliable links, modelling, for example, a communication network. Estimating the reliability of the network, expressed as the probability that certain nodes in the network are connected, is a computationally difficult task. In this paper we study how the Cross-Entropy method can be used to obtain more efficient network reliability estimation procedures. Three estimation techniques are considered: Crude Monte Carlo and the more sophisticated Permutation Monte Carlo and Merge Process. We show that the Cross-Entropy method yields a speed-up over all three techniques.
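As a baseline for the comparison described above, crude Monte Carlo estimation of such a connectivity-based reliability can be sketched in a few lines. The snippet below is an illustrative baseline only: it does not show the paper's contribution (the Cross-Entropy variance-reduction layer, Permutation Monte Carlo, or the Merge Process), and all names and the toy network are hypothetical.

import random

def crude_mc_reliability(nodes, links, terminals, n_samples=100_000, seed=0):
    # Crude Monte Carlo estimate of the probability that all `terminals`
    # end up in one connected component when each link (u, v, p) is
    # independently "up" with probability p.
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        adj = {v: set() for v in nodes}
        for u, v, p in links:
            if rng.random() < p:       # sample this link's state
                adj[u].add(v)
                adj[v].add(u)
        start = next(iter(terminals))  # depth-first search from one terminal
        seen, stack = {start}, [start]
        while stack:
            w = stack.pop()
            for x in adj[w]:
                if x not in seen:
                    seen.add(x)
                    stack.append(x)
        hits += terminals <= seen      # were all terminals reached?
    return hits / n_samples

# Toy example: a 4-node ring, each link up with probability 0.9.
nodes = {1, 2, 3, 4}
links = [(1, 2, 0.9), (2, 3, 0.9), (3, 4, 0.9), (4, 1, 0.9)]
print(crude_mc_reliability(nodes, links, terminals={1, 3}))

The weakness of this estimator, which motivates the more sophisticated techniques, is that for highly reliable networks failures are rarely sampled, so the relative error of the estimate blows up.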
Abstract:
In Australia more than 300 vertebrates, including 43 insectivorous bat species, depend on hollows in habitat trees for shelter, with many species using a network of multiple trees as roosts. We used roost-switching data on white-striped freetail bats (Tadarida australis; Microchiroptera: Molossidae) to construct a network representation of day roosts in suburban Brisbane, Australia. Bats were caught from a communal roost tree with a roosting group of several hundred individuals and released with transmitters. Each roost used by the bats represented a node in the network, and the movements of bats between roosts formed the links between nodes. Despite differences in gender and reproductive stages, the bats exhibited the same behavior throughout three radiotelemetry periods and over 500 bat days of radio tracking: each roosted in separate roosts, switched roosts very infrequently, and associated with other bats only at the communal roost. This network resembled a scale-free network in which the distribution of the number of links from each roost followed a power law. Despite being spread over a large geographic area (>200 km²), each roost was connected to others by fewer than three links. One roost (the hub or communal roost) defined the architecture of the network because it had the most links. That the network showed scale-free properties has profound implications for the management of the habitat trees of this roosting group. Scale-free networks provide high tolerance against stochastic events such as random roost removals but are susceptible to the selective removal of hub nodes. Network analysis is a useful tool for understanding the structural organization of habitat tree usage and allows an informed judgment of the relative importance of individual trees, and hence the derivation of appropriate management decisions. Conservation planners and managers should emphasize the differential importance of habitat trees and think of them as being analogous to vital service centers in human societies.
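The construction described above (roosts as nodes, observed roost switches as links) is straightforward to reproduce. The sketch below uses hypothetical roost-switching records and the networkx library to build such a network and tabulate its degree distribution, whose heavy, power-law-like tail is what characterizes a scale-free topology.

import networkx as nx
from collections import Counter

# Hypothetical roost-switching records: (bat_id, roost_left, roost_entered).
switches = [
    ("bat1", "communal", "R12"), ("bat1", "R12", "communal"),
    ("bat2", "communal", "R7"),  ("bat3", "communal", "R31"),
    ("bat3", "R31", "R7"),
]

G = nx.Graph()
for _, src, dst in switches:
    G.add_edge(src, dst)            # roosts are nodes, observed movements are links

degree_counts = Counter(d for _, d in G.degree())
print(sorted(degree_counts.items()))
# In a scale-free network, the fraction of roosts with k links falls off
# roughly as a power law k**(-gamma); the communal roost acts as the hub
# with by far the highest degree.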
Abstract:
Topological measures of large-scale complex networks are applied to a specific artificial regulatory network model created through a whole genome duplication and divergence mechanism. This class of networks shares topological features with natural transcriptional regulatory networks. Specifically, these networks display scale-free and small-world topology and possess subgraph distributions similar to those of natural networks. Thus, the topologies inherent in natural networks may be in part due to their method of creation rather than being exclusively shaped by subsequent evolution under selection. The evolvability of the dynamics of these networks is also examined by evolving networks in simulation to obtain three simple types of output dynamics. The networks obtained from this process show a wide variety of topologies and numbers of genes, indicating that it is relatively easy to evolve these classes of dynamics in this model.
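To make the generative mechanism concrete, a minimal duplication-and-divergence round can be sketched as follows. This is a generic illustration under assumed parameters (a single deletion probability applied to inherited links), not the specific artificial regulatory network model analysed in the paper.

import random

def duplicate_and_diverge(edges, n_nodes, delete_prob=0.3, seed=0):
    """One round of whole-genome duplication followed by divergence:
    every gene i gets a copy i + n_nodes that inherits i's regulatory
    links, and each inherited link is then lost independently with
    probability `delete_prob` (a generic sketch, not the paper's model)."""
    rng = random.Random(seed)
    new_edges = set(edges)                     # original links are kept
    for src, dst in edges:
        for copy_edge in [(src + n_nodes, dst),
                          (src, dst + n_nodes),
                          (src + n_nodes, dst + n_nodes)]:
            if rng.random() > delete_prob:     # divergence step
                new_edges.add(copy_edge)
    return new_edges, 2 * n_nodes

# Start from a small seed regulatory network and duplicate the genome twice.
edges, n = {(0, 1), (1, 2), (2, 0)}, 3
for _ in range(2):
    edges, n = duplicate_and_diverge(edges, n)
print(n, len(edges))

Repeating such rounds and measuring degree distributions, clustering, and subgraph counts is, in spirit, how the topological comparison with natural transcriptional networks proceeds.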
Abstract:
International trade and investment economies are highly integrated and interdependent and can be exploited by organized, international terrorism. The network of interdependencies in the international economy means that a terrorist attack has the potential to disrupt the functioning of the network, so the effects can reverberate around the world. Governments can control the distributed effects of terrorism by auditing industrial networks to reveal and protect critical hubs and by promoting flexibility in the production and distribution of goods and services to improve resilience in the economy. To explain these network effects, the authors draw on the new science of complex networks, which has been applied to the physical sciences and is now increasingly being used to explain organizational and economic phenomena.
Abstract:
The virulence of Pseudomonas aeruginosa and other surface pathogens involves the coordinate expression of a wide range of virulence determinants, including type IV pili. These surface filaments are important for the colonization of host epithelial tissues and mediate bacterial attachment to, and translocation across, surfaces by a process known as twitching motility. This process is controlled in part by a complex signal transduction system whose central component, ChpA, possesses nine potential sites of phosphorylation, including six histidine-containing phosphotransfer (HPt) domains, one serine-containing phosphotransfer domain, one threonine-containing phosphotransfer domain, and one CheY-like receiver domain. Here, using site-directed mutagenesis, we show that normal twitching motility is entirely dependent on the CheY-like receiver domain and partially dependent on two of the HPt domains. Moreover, under different assay conditions, point mutations in several of the phosphotransfer domains of ChpA give rise to unusual "swarming" phenotypes, possibly reflecting more subtle perturbations in the control of P. aeruginosa motility that are not evident from the conventional twitching stab assay. Together, these results suggest that ChpA plays a central role in the complex regulation of type IV pilus-mediated motility in P. aeruginosa.
Abstract:
MICE (meetings, incentives, conventions, and exhibitions) has generated high foreign exchange revenue for economies worldwide. In Thailand, MICE tourists are recognized as ‘quality’ visitors, mainly because of their high spending potential. Having said that, Thailand’s MICE sector has been affected by a number of crises following September 11, 2001. Consequently, professionals in the MICE sector must be prepared to deal with such complex crisis phenomena in the future. While a number of studies have examined the complexity of crises in the tourism context, there has been little focus on such issues in the MICE sector. As chaos theory provides a particularly good model for crisis situations, the aim of this paper is to propose a chaos theory-based approach to understanding the complex and chaotic system of the MICE sector in times of crisis.
Abstract:
Purpose – The international nuclear community continues to face the challenge of managing both legacy waste and the new wastes that emerge from ongoing energy production. The UK is in the early stages of proposing a new convention for its nuclear industry, that is, waste minimisation through closely managing the radioactive source which creates the waste. This paper proposes a new technique, called waste and source material operability study (WASOP), to qualitatively analyse a complex, waste-producing system in order to minimise avoidable waste and thus increase the protection of the public and the environment. Design/methodology/approach – WASOP critically considers the systemic impact of upstream and downstream facilities on the minimisation of nuclear waste in a facility. Based on the principles of HAZOP, the technique structures managers' thinking on the impact of mal-operations in interlinked facilities in order to identify preventative actions that reduce the effect of those mal-operations on waste production. Findings – WASOP was tested with a small group of experienced nuclear regulators and was found to support their qualitative examination of waste minimisation and to help them work towards developing a plan of action. Originality/value – Given the newness of this convention, the wider methodology in which WASOP sits is still in development. However, this paper communicates the latest thinking from nuclear regulators on decision-making methodology for supporting waste minimisation and, it is hoped, will form part of future regulatory guidance. WASOP is believed to have widespread potential application to the minimisation of many other forms of waste, including waste from other energy sectors and household/general waste.