940 results for Graph Decomposition


Relevance: 20.00%

Abstract:

Verbal fluency is the ability to produce a satisfying sequence of spoken words during a given time interval. The core of verbal fluency lies in the capacity to manage the executive aspects of language. The standard scores of the semantic verbal fluency test are broadly used in the neuropsychological assessment of the elderly, and different analytical methods are likely to extract even more information from the data generated by this test. Graph theory, a mathematical approach to analyzing relations between items, represents a promising tool for understanding a variety of neuropsychological states. This study reports a graph analysis of data generated in the semantic verbal fluency test by cognitively healthy elderly adults (NC), patients with Mild Cognitive Impairment of the amnestic (aMCI) and amnestic multiple domain (a+mdMCI) subtypes, and patients with Alzheimer's disease (AD). Sequences of words were represented as a speech graph in which every word corresponded to a node and temporal links between consecutive words were represented by directed edges. To characterize the structure of the data we calculated 13 speech graph attributes (SGAs). The individuals were compared when divided into three (NC, MCI, AD) and four (NC, aMCI, a+mdMCI, AD) groups. When the three groups were compared, significant differences were found in the standard measure of correct words produced and in three SGAs: diameter, average shortest path, and network density. The SGAs sorted the elderly groups with good specificity and sensitivity. When the four groups were compared, they differed significantly in network density, except between the two MCI subtypes and between NC and aMCI. The diameter of the network and the average shortest path differed significantly between NC and AD, and between aMCI and AD. The SGAs sorted the elderly into their groups with good specificity and sensitivity, performing better than the standard score of the task. These findings support a new methodological framework for assessing the strength of semantic memory through the verbal fluency task, with the potential to amplify the predictive power of this test. Graph analysis is likely to become clinically relevant in neurology and psychiatry, and may be particularly useful for the differential diagnosis of the elderly.
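
As a concrete illustration of the construction described above, the following is a minimal sketch (not the authors' code) that builds such a speech graph with networkx and computes three of the reported attributes; the full set of 13 SGAs is defined in the original study.

```python
# Minimal sketch of the speech-graph construction: each word becomes a
# node and each temporal transition between consecutive words becomes a
# directed edge. Requires networkx.
import networkx as nx

def speech_graph(words):
    """Build a directed graph from a sequence of spoken words."""
    g = nx.DiGraph()
    g.add_nodes_from(set(words))
    g.add_edges_from(zip(words, words[1:]))  # consecutive-word transitions
    return g

def graph_attributes(g):
    """Three of the attributes reported above (diameter and average
    shortest path are computed on the undirected version, which is
    connected for any word chain)."""
    und = g.to_undirected()
    return {
        "density": nx.density(g),
        "diameter": nx.diameter(und),
        "avg_shortest_path": nx.average_shortest_path_length(und),
    }

# Example: a short fluency response with one repeated word ("dog").
words = ["dog", "cat", "horse", "dog", "cow", "pig"]
print(graph_attributes(speech_graph(words)))
```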

Relevance: 20.00%

Abstract:

Upgrading hydrogen to a valuable fuel is a central topic in modern research due to its high availability and low price. Because of the difficulties of hydrogen storage, different pathways are still under investigation. A promising route is liquid-phase chemical hydrogen storage materials, because they can lead to greener transformation processes with the on-line generation of hydrogen for fuel cells. The aim of my work was the optimisation of catalysts, made by the sol-immobilisation method (a typical colloidal method), for the decomposition of formic acid. Formic acid was selected because it is a versatile, renewable reagent for green synthesis studies. The first aim of my research was the synthesis and optimisation of Pd nanoparticles by sol-immobilisation to achieve better catalytic performance and to investigate the effects of particle size, oxidation state, the role of the stabiliser, and the nature of the support. Palladium was chosen because it is a well-known active metal for the catalytic decomposition of formic acid. Noble metal nanoparticles of palladium were immobilised on carbon charcoal and on titania. In the second part, the catalytic performance of the "homemade" Pd/C catalyst was compared to that of a commercial Pd/C, and the effect of different monometallic and bimetallic systems (AuxPdy) on the catalytic decomposition of formic acid was investigated. The training period for this work was carried out at the University of Cardiff (group of Dr. N. Dimitratos).

Relevance: 20.00%

Abstract:

Co-Al-Ox mixed metal oxides, partially modified with Cu or Mg as well as with Ag, were successfully prepared, characterized, and evaluated as potential catalysts for N2O decomposition. The materials were characterized by the following techniques: X-ray diffraction (XRD), thermogravimetric analysis (TGA), N2 physisorption, hydrogen temperature-programmed reduction (H2-TPR), and X-ray photoelectron spectroscopy (XPS). The Ag-modified HT-derived mixed oxides showed enhanced activity compared to the undoped materials; the optimum composition was found to be (1 wt.% Ag)CHT-Co3Al. The catalyst characterization studies suggested that the improved catalytic activity of the Ag-promoted catalysts was mainly due to the altered redox properties of the materials.

Relevance: 20.00%

Abstract:

Abstract not available

Relevance: 20.00%

Abstract:

Inverse heat conduction problems (IHCPs) appear in many important scientific and technological fields. Hence the analysis, design, implementation, and testing of inverse algorithms are also of great scientific and technological interest. The numerical simulation of 2-D and 3-D inverse (or even direct) problems involves a considerable amount of computation. Therefore, the investigation and exploitation of the parallel properties of such algorithms are becoming equally important. Domain decomposition (DD) methods are widely used to solve large-scale engineering problems, and their inherent parallelism can be exploited in the solution of such problems.
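
As an illustration of the DD idea, here is a minimal sketch, not taken from the paper, of an overlapping (alternating Schwarz) decomposition for a 1-D steady heat-conduction problem; in a parallel setting the two subdomain solves would run on separate processors.

```python
# Overlapping Schwarz sketch for u''(x) = 0 on [0, 1], u(0) = 0, u(1) = 1.
# Each subdomain solve here is exact (the solution of u'' = 0 with
# Dirichlet ends is linear); only the boundary values are exchanged.
import numpy as np

def solve_subdomain(u_left, u_right, n):
    """Exact solve of u'' = 0 on a uniform grid with Dirichlet ends."""
    return np.linspace(u_left, u_right, n)

n = 101
u = np.zeros(n)
u[-1] = 1.0
mid, overlap = n // 2, 10          # two subdomains sharing `overlap` points

for it in range(50):               # alternating Schwarz iterations
    left = solve_subdomain(u[0], u[mid + overlap], mid + overlap + 1)
    u[: mid + overlap + 1] = left
    right = solve_subdomain(u[mid - overlap], u[-1], n - (mid - overlap))
    u[mid - overlap:] = right

print(np.max(np.abs(u - np.linspace(0, 1, n))))  # error vs exact solution
```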

Relevance: 20.00%

Abstract:

Finance is one of the fastest growing areas in modern applied mathematics, with real-world applications. The interest of this branch of applied mathematics is best described by an example involving shares. Shareholders of a company receive dividends, which come from the profit made by the company. The proceeds of the company, once it is taken over or wound up, will also be distributed to shareholders. Therefore shares have a value that reflects the views of investors about the likely dividend payments and capital growth of the company. Such value is quantified by the share price on stock exchanges. Financial modelling therefore serves to understand the correlations between assets and buy/sell movements in order to reduce risk. Such activities depend on financial analysis tools being available to traders, with which they can make rapid and systematic evaluations of buy/sell contracts. There are other financial activities, and it is not the intention of this paper to discuss all of them. The main concern of this paper is to propose a parallel algorithm for the numerical solution of a European option. The paper is organised as follows. First, a brief introduction is given to a simple mathematical model for European options and possible numerical schemes for solving such a model. Second, the Laplace transform is applied to the mathematical model, which leads to a set of parametric equations whose solutions may be found concurrently. The numerical inverse Laplace transform is performed by means of the inversion algorithm developed by Stehfest. The scalability of the algorithm in a distributed environment is demonstrated. Third, the performance of the present algorithm is compared with that of a spatial domain decomposition developed particularly for the time-dependent heat equation. Finally, a number of issues are discussed and future work is suggested.
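
The Stehfest inversion formula itself is standard and easy to state; the following is a minimal sketch of it, not the paper's parallel implementation. Note that the N transform evaluations F(k ln 2 / t) are mutually independent, which is precisely what makes the method amenable to concurrent evaluation.

```python
# Gaver-Stehfest numerical inverse Laplace transform:
#   f(t) ~ (ln 2 / t) * sum_{k=1}^{N} V_k * F(k ln 2 / t),  N even.
import math

def stehfest_weights(n):
    """Stehfest coefficients V_k for even n."""
    v = []
    for k in range(1, n + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, n // 2) + 1):
            s += (j ** (n // 2) * math.factorial(2 * j)) / (
                math.factorial(n // 2 - j) * math.factorial(j)
                * math.factorial(j - 1) * math.factorial(k - j)
                * math.factorial(2 * j - k))
        v.append((-1) ** (k + n // 2) * s)
    return v

def invert(F, t, n=12):
    """Approximate f(t) from its Laplace transform F(s); the n calls to
    F(s) are independent and could be evaluated in parallel."""
    ln2 = math.log(2.0)
    v = stehfest_weights(n)
    return ln2 / t * sum(v[k - 1] * F(k * ln2 / t) for k in range(1, n + 1))

# Check against a known pair: F(s) = 1/(s+1)  <->  f(t) = exp(-t).
print(invert(lambda s: 1.0 / (s + 1.0), t=1.0), math.exp(-1.0))
```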

Relevance: 20.00%

Abstract:

Abstract not available

Relevance: 20.00%

Abstract:

A parallel method for the dynamic partitioning of unstructured meshes is described. The method introduces a new iterative optimisation technique, known as relative gain optimisation, which both balances the workload and attempts to minimise the interprocessor communication overhead. Experiments on a series of adaptively refined meshes indicate that the algorithm provides partitions of quality equivalent to or higher than those of static partitioners (which do not reuse the existing partition), and much more rapidly. Perhaps more importantly, the algorithm results in only a small fraction of the data migration required by the static partitioners.
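
The paper itself defines "relative gain" precisely; as background, the following is a minimal sketch of the classical KL/FM-style gain computation on which such refinement techniques build. The dict-of-lists graph representation is assumed purely for illustration.

```python
# Gain of migrating a vertex between subdomains: the reduction in edge
# cut if `v` moves from its current subdomain to `target`.

def gain(graph, part, v, target):
    """Edges v would gain locally in `target` minus edges it cuts at home."""
    external = sum(1 for u in graph[v] if part[u] == target)
    internal = sum(1 for u in graph[v] if part[u] == part[v])
    return external - internal

# Example: a 4-cycle split unevenly between subdomains 0 and 1.
graph = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
part = {0: 0, 1: 0, 2: 0, 3: 1}
for v in graph:
    for target in (0, 1):
        if target != part[v]:
            print(f"move {v} -> {target}: gain {gain(graph, part, v, target)}")
```

Moving vertex 3 into subdomain 0 has gain 2 here (the cut drops from two edges to zero); a refinement pass repeatedly applies the best such moves subject to the load-balance constraint.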

Relevance: 20.00%

Abstract:

The evaluation of relativistic spin networks plays a fundamental role in the Barrett-Crane state sum model of Lorentzian quantum gravity in 4 dimensions. A relativistic spin network is a graph labelled by unitary irreducible representations of the Lorentz group appearing in the direct integral decomposition of the space of L^2 functions on three-dimensional hyperbolic space. To "evaluate" such a spin network we must do an integral; if this integral converges, we say the spin network is "integrable". Here we show that a large class of relativistic spin networks are integrable, including any whose underlying graph is the 4-simplex (the complete graph on 5 vertices). This proves a conjecture of Barrett and Crane, whose validity is required for the convergence of their state sum model.
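
For concreteness, the evaluation in question can be sketched as follows, in the conventions of the Baez-Barrett treatment; the exact normalisation written here is an assumption of this sketch, not a quotation from the paper.

```latex
% Fix one vertex of the graph \Gamma and integrate the remaining
% n-1 vertices over hyperbolic space H^3:
\[
  \mathcal{I}(\Gamma)
  \;=\;
  \int_{(H^3)^{\,n-1}} \prod_{e \in E(\Gamma)}
      K_{p_e}\bigl(x_{s(e)}, x_{t(e)}\bigr)
  \, dx_2 \cdots dx_n,
  \qquad
  K_p(x,y) \;=\; \frac{\sin\bigl(p\, d(x,y)\bigr)}{p \,\sinh d(x,y)},
\]
% where each edge e carries the parameter p_e labelling a unitary
% irreducible representation, d(x,y) is the hyperbolic distance, and
% the spin network is "integrable" precisely when this integral converges.
```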

Relevance: 20.00%

Abstract:

The ultimate problem considered in this thesis is modeling a high-dimensional joint distribution over a set of discrete variables. For this purpose, we consider classes of context-specific graphical models, and the main emphasis is on learning the structure of such models from data. Traditional graphical models compactly represent a joint distribution through a factorization justified by statements of conditional independence, which are encoded by a graph structure. Context-specific independence is a natural generalization of conditional independence that only holds in a certain context, specified by the conditioning variables. We introduce context-specific generalizations of both Bayesian networks and Markov networks by including statements of context-specific independence, which can be encoded as a part of the model structures. For the purpose of learning context-specific model structures from data, we derive score functions, based on results from Bayesian statistics, by which the plausibility of a structure is assessed. To identify high-scoring structures, we construct stochastic and deterministic search algorithms designed to exploit the structural decomposition of our score functions. Numerical experiments on synthetic and real-world data show that the increased flexibility of context-specific structures can more accurately emulate the dependence structure among the variables and thereby improve the predictive accuracy of the models.
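
A minimal illustration of the notion, not drawn from the thesis: with binary variables, a single context-specific independence statement lets a conditional distribution be stored as a tree with merged leaves instead of a full table.

```python
# Context-specific independence (CSI): in the context X1 = 0, Y is
# independent of X2, so P(Y | X1, X2) can be represented as a decision
# tree whose X1 = 0 branch ignores X2 entirely.

def p_y_given(x1, x2):
    """P(Y = 1 | X1 = x1, X2 = x2) with the CSI statement Y _|_ X2 | X1 = 0."""
    if x1 == 0:
        return 0.2                       # one shared leaf: X2 is irrelevant
    return 0.7 if x2 == 0 else 0.9       # X2 only matters when X1 = 1

# A full conditional table needs 4 parameters; the CSI tree stores only 3.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(f"P(Y=1 | X1={x1}, X2={x2}) = {p_y_given(x1, x2)}")
```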

Relevance: 20.00%

Abstract:

In the past decade, systems that extract information from millions of Internet documents have become commonplace. Knowledge graphs -- structured knowledge bases that describe entities, their attributes, and the relationships between them -- are a powerful tool for understanding and organizing this vast amount of information. However, there are two significant obstacles to knowledge graph construction: the unreliability of the extracted information, due to noise and ambiguity in the underlying data or errors made by the extraction system, and the complexity of reasoning about the dependencies between these noisy extractions. My dissertation addresses these challenges by exploiting the interdependencies between facts to improve the quality of the knowledge graph in a scalable framework. I introduce a new approach called knowledge graph identification (KGI), which resolves the entities, attributes, and relationships in the knowledge graph by incorporating uncertain extractions from multiple sources, entity co-references, and ontological constraints. I define a probability distribution over possible knowledge graphs and infer the most probable knowledge graph using a combination of probabilistic and logical reasoning. Such probabilistic models are frequently dismissed due to scalability concerns, but my implementation of KGI maintains tractable performance on large problems through the use of hinge-loss Markov random fields, which have a convex inference objective. This allows knowledge graphs with 4M facts and 20M ground constraints to be inferred in 2 hours. To further scale the solution, I develop a distributed approach to the KGI problem which runs in parallel across multiple machines, reducing inference time by 90%. Finally, I extend my model to the streaming setting, where a knowledge graph is continuously updated by incorporating newly extracted facts. I devise a general approach for approximately updating inference in convex probabilistic models, and quantify the approximation error by defining and bounding inference regret for online models. Together, my work retains the attractive features of probabilistic models while providing the scalability necessary for large-scale knowledge graph construction. These models have been applied on a number of real-world knowledge graph projects, including the NELL project at Carnegie Mellon and the Google Knowledge Graph.
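
As a sketch of how hinge-loss potentials make MAP inference convex, consider the following toy example; the variable names, weights, and rule are hypothetical, and this is not the dissertation's PSL implementation.

```python
# Toy hinge-loss MRF: truth values live in [0, 1], and a relaxed logical
# rule contributes a convex hinge penalty, so MAP inference is a convex
# optimisation. Requires scipy.
import numpy as np
from scipy.optimize import minimize

# Variables: x[0] = Rel(a, c), x[1] = SameEntity(a, b), x[2] = Rel(b, c).
# Noisy extractor evidence for each atom:
evidence = np.array([0.9, 0.8, 0.1])

def objective(x):
    # Stay close to the extracted evidence...
    cost = np.sum((x - evidence) ** 2)
    # ...plus the rule Rel(a,c) & SameEntity(a,b) -> Rel(b,c), relaxed
    # with the Lukasiewicz conjunction max(0, x0 + x1 - 1):
    body = max(0.0, x[0] + x[1] - 1.0)
    cost += 5.0 * max(0.0, body - x[2]) ** 2   # squared hinge potential
    return cost

res = minimize(objective, evidence, bounds=[(0, 1)] * 3)
print(np.round(res.x, 3))   # Rel(b, c) is pulled upward by the rule
```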

Relevance: 20.00%

Abstract:

This paper presents an investigation of a simple generic hyper-heuristic approach applied to a set of widely used constructive heuristics (graph coloring heuristics) in timetabling. Within the hyper-heuristic framework, a Tabu Search approach is employed to search for permutations of graph heuristics, which are used for constructing timetables in exam and course timetabling problems. This underpins a multi-stage hyper-heuristic in which the Tabu Search employs permutations of a different number of graph heuristics in each of two stages. We study this graph-based hyper-heuristic approach in the context of fundamental issues concerning the search space of the hyper-heuristic (the heuristic space) and the solution space; such issues have not been addressed in other hyper-heuristic research. The approach is tested on both exam and course benchmark timetabling problems and compared with fine-tuned, bespoke state-of-the-art approaches. The results are within the range of the best reported in the literature, while the approach described here is significantly more generally applicable than the current state of the art. Future work will extend this hyper-heuristic framework by employing methodologies applicable to a wider range of timetabling and scheduling problems.
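
A minimal sketch of the search layer described above; the heuristic names and the dummy evaluator are hypothetical placeholders, since constructing and scoring a real timetable is problem-specific.

```python
# Tabu Search over permutations of low-level graph coloring heuristics:
# each permutation is evaluated by using its heuristics, in order, to
# construct a timetable, and the permutation with the lowest penalty wins.
import random

HEURISTICS = ["largest_degree", "saturation_degree", "largest_enrolment",
              "largest_weighted_degree", "colour_degree"]

def construct_and_score(permutation):
    """Stand-in evaluator: in a real system this would build a timetable
    with the heuristic sequence and return its soft-constraint penalty."""
    return sum(i * hash(h) % 97 for i, h in enumerate(permutation))

def tabu_search(iterations=100, tabu_len=7):
    current = HEURISTICS[:]
    best, best_score = current[:], construct_and_score(current)
    tabu = []
    for _ in range(iterations):
        # Neighbourhood move: swap two positions in the permutation.
        i, j = random.sample(range(len(current)), 2)
        move = (min(i, j), max(i, j))
        if move in tabu:
            continue                                 # move is tabu: skip it
        current[i], current[j] = current[j], current[i]
        score = construct_and_score(current)
        tabu = (tabu + [move])[-tabu_len:]           # fixed-length tabu list
        if score < best_score:
            best, best_score = current[:], score
        else:
            current[i], current[j] = current[j], current[i]  # undo the swap
    return best, best_score

print(tabu_search())
```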