49 results for Paths and cycles (Graph theory)
in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
HEMOLIA (a project under the European Community's 7th Framework Programme) is a new-generation Anti-Money Laundering (AML) intelligent multi-agent alert and investigation system which, in addition to traditional financial data, makes extensive use of modern society's huge telecom data source, thereby opening up a new dimension of capabilities to all money-laundering fighters (FIUs, LEAs) and financial institutions (banks, insurance companies, etc.). This Master's thesis project was carried out at AIA, one of the partners in the HEMOLIA project in Barcelona. The objective of the thesis is to find clusters in a network constructed from the financial data. An extensive literature survey has been carried out, and several standard network algorithms have been studied and implemented. Clustering is an NP-hard problem, and algorithms such as K-Means and hierarchical clustering have been applied to problems in sociology, evolution, anthropology, etc. However, these algorithms have certain drawbacks that make them difficult to apply. The thesis proposes (a) a possible improvement to the K-Means algorithm, (b) a novel approach to the clustering problem using genetic algorithms, and (c) a new algorithm for finding the cluster of a node using a genetic algorithm.
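To make the baseline concrete, here is a minimal Lloyd-style K-Means sketch in Python with NumPy. It illustrates the standard algorithm the thesis sets out to improve, not the thesis's proposed variant; the random seeding and the Euclidean metric are assumptions of the sketch.

```python
import numpy as np

def k_means(points, k, n_iter=100, seed=0):
    """Minimal Lloyd-style K-Means; returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    # Plain random initialization: sample k distinct points as centroids.
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: nearest centroid by Euclidean distance.
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its cluster.
        centroids_new = np.array([
            points[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(centroids_new, centroids):
            break
        centroids = centroids_new
    return centroids, labels

# Toy usage: two well-separated Gaussian blobs.
pts = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5.0])
cents, labs = k_means(pts, k=2)
```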
Abstract:
We test in the laboratory the potential of evolutionary dynamics as a predictor of actual behavior. To this end, we propose an asymmetric game, which we interpret as a borrower-lender relation, study its evolutionary dynamics in a random-matching set-up, and test its predictions. The model provides conditions for the existence of credit markets and credit cycles. The theoretical predictions seem to be good approximations of the experimental results.
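As an illustration of the kind of evolutionary dynamics being tested, below is a minimal two-population replicator-dynamics sketch in Python. The borrower and lender payoff matrices are hypothetical placeholders, not the game estimated in the paper; with these numbers trust is eventually driven out, while other payoffs can sustain credit or produce cycles.

```python
import numpy as np

# Hypothetical payoffs (placeholders, not the paper's game).
# Lender rows: Trust, Withhold; columns: borrower plays Repay, Default.
A = np.array([[2.0, -1.0],
              [0.0,  0.0]])
# Borrower rows: Repay, Default; columns: lender plays Trust, Withhold.
B = np.array([[1.0, 0.0],
              [3.0, 0.0]])

def replicator_step(x, y, dt=0.01):
    """One Euler step of two-population replicator dynamics.
    x: lender mixed strategy, y: borrower mixed strategy."""
    fx, fy = A @ y, B @ x            # per-strategy fitnesses
    x = x + dt * x * (fx - x @ fx)   # growth proportional to excess fitness
    y = y + dt * y * (fy - y @ fy)
    return x / x.sum(), y / y.sum()  # renormalize against numerical drift

x, y = np.array([0.5, 0.5]), np.array([0.5, 0.5])
for _ in range(20000):
    x, y = replicator_step(x, y)
print("lender (Trust, Withhold):  ", x.round(3))
print("borrower (Repay, Default): ", y.round(3))
```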
Abstract:
The object of this project is to schedule a fictitious European basketball competition with many teams situated at long distances from one another. The schedule must be fair, feasible and economical, meaning that the total distance traveled by every team must be the minimum possible. First, we define the sport-competition terminology and study different competition systems, focusing on the NBA and Euroleague systems. Then we define the concepts of graph theory and spherical distance that will be needed. Next we propose a competition system, explaining where the teams will be located and how the scheduling will work. Then there is a description of the programs that have been implemented, and, finally, the complete schedule is displayed and some possible improvements are mentioned.
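Since the schedule's cost is driven by spherical distances between venues, a minimal haversine implementation in Python is sketched below; the coordinates in the usage line are illustrative, not taken from the project.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def spherical_distance(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# Example: Barcelona to Athens, two plausible venues for such a competition.
print(round(spherical_distance(41.39, 2.17, 37.98, 23.73)), "km")
```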
Abstract:
The first main result of the paper is a criterion for a partially commutative group G to be a domain. It allows us to reduce the study of algebraic sets over G to the study of irreducible algebraic sets, and reduce the elementary theory of G (of a coordinate group over G) to the elementary theories of the direct factors of G (to the elementary theory of coordinate groups of irreducible algebraic sets). Then we establish normal forms for quantifier-free formulas over a non-abelian directly indecomposable partially commutative group H. Analogously to the case of free groups, we introduce the notion of a generalised equation and prove that the positive theory of H has quantifier elimination and that arbitrary first-order formulas lift from H to H * F, where F is a free group of finite rank. As a consequence, the positive theory of an arbitrary partially commutative group is decidable.
Abstract:
The dissertation accomplishes two aims: 1) to diagnose what prevents true beliefs from being knowledge; 2) to give a positive account of knowledge. Concerning the first aim, it offers an account of the notion of luck. It defends the view that luck is a form of risk and distinguishes two types of luck. Then it applies the account to the problem of epistemic luck and distinguishes, accordingly, two types of epistemic luck. It is argued that these two types of epistemic luck explain the whole range of cases of true belief that falls short of knowledge. Concerning the second aim, the dissertation advances an account of knowledge in terms of the notion of cognitive control that deals with the two forms of epistemic luck distinguished.
Abstract:
I formulate and estimate a model of externalities within countries and technological interdependence across countries. I find that external returns to scale to physical capital within countries are 8 percent; that a 10 percent increase in the total factor productivity of a country's neighbors raises its total factor productivity by 6 percent; and that a 2 percent annual growth rate of labor productivity can be explained as an endogenous response to an exogenous 0.2 percent annual growth rate of total factor productivity in the steady state.
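A back-of-the-envelope reading of how these magnitudes can fit together: if within-country externalities and cross-country spillovers compound into a steady-state multiplier, a small exogenous TFP growth rate is amplified tenfold. The multiplier form below is an illustrative assumption, not the paper's estimated equation.

```latex
% Illustrative steady-state amplification (assumed form, for exposition):
% with a compound externality/spillover parameter \beta,
\[
  g_y \;=\; \frac{g_A}{1-\beta},
  \qquad \beta = 0.9 \;\Rightarrow\; g_y = \frac{0.2\%}{0.1} = 2\% ,
\]
% matching the reported endogenous response of labor-productivity growth
% g_y to exogenous TFP growth g_A.
```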
Abstract:
We prove an arithmetic version of a theorem of Hirzebruch and Zagier saying that Hirzebruch-Zagier divisors on a Hilbert modular surface are the coefficients of an elliptic modular form of weight 2. Moreover, we determine the arithmetic self-intersection number of the line bundle of modular forms equipped with its Petersson metric on a regular model of a Hilbert modular surface, and we study Faltings heights of arithmetic Hirzebruch-Zagier divisors.
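Schematically, the Hirzebruch-Zagier statement referred to above packages the divisors into a q-series; the display below sketches its shape (level and character suppressed) and is not the paper's arithmetic refinement.

```latex
% Shape of the Hirzebruch--Zagier generating series:
\[
  \phi(\tau) \;=\; c_0 \;+\; \sum_{m \geq 1} [T_m]\, q^{m},
  \qquad q = e^{2\pi i \tau},
\]
% an elliptic modular form of weight 2, where [T_m] denotes the class of
% the Hirzebruch--Zagier divisor T_m and c_0 a constant term.
```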
Abstract:
We present a new phenomenological approach to nucleation, based on the combination of the extended modified liquid drop model and dynamical nucleation theory. The new model proposes a new cluster definition, which properly includes the effect of fluctuations, and is consistent both thermodynamically and kinetically. The model is able to successfully predict the free energy of formation of the critical nucleus, using only macroscopic thermodynamic properties. It also accounts for the spinodal and provides excellent agreement with the results of recent simulations.
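For orientation, the classical-nucleation-theory baseline that such models refine balances a bulk gain against a surface cost. The displays below are the textbook result, with the spherical shape factor noted as an assumption of the illustration.

```latex
% Free energy of an n-molecule cluster: bulk term vs. surface term,
\[
  \Delta G(n) \;=\; -\,n\,\Delta\mu \;+\; \sigma\, s\, n^{2/3},
\]
% and maximizing over n gives the nucleation barrier
\[
  \Delta G^{*} \;=\; \frac{4\,\sigma^{3} s^{3}}{27\,\Delta\mu^{2}}
  \;=\; \frac{16\pi\,\sigma^{3} v_l^{2}}{3\,\Delta\mu^{2}}
  \quad\text{for spheres, } s = (36\pi)^{1/3} v_l^{2/3},
\]
% where \Delta\mu is the driving force, \sigma the surface tension, and
% v_l the molecular volume of the liquid.
```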
Abstract:
Bimodal dispersal probability distributions with characteristic distances differing by several orders of magnitude have been derived and favorably compared to observations by Nathan [Nature (London) 418, 409 (2002)]. For such bimodal kernels, we show that two-dimensional molecular dynamics computer simulations are unable to yield accurate front speeds. Analytically, the usual continuous-space random walks (CSRWs) are applied to two dimensions. We also introduce discrete-space random walks and use them to check the CSRW results (because of the inefficiency of the numerical simulations). The physical results reported are shown to predict front speeds high enough to possibly explain Reid's paradox of rapid tree migration. We also show that, for a time-ordered evolution equation, fronts are always slower in two dimensions than in one dimension and that this difference is important both for unimodal and for bimodal kernels.
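A minimal Python sketch of the mechanism at issue: sampling jumps from a two-scale mixture kernel and tracking the furthest-forward individual shows how rare long-distance jumps dominate the spread rate. The kernel parameters and the absence of reproduction are simplifying assumptions, not the paper's CSRW calculation.

```python
import random

def bimodal_jump(p_long=0.01, d_short=1.0, d_long=100.0):
    """Sample a 1D dispersal jump from a two-scale mixture kernel:
    mostly short exponential jumps, occasionally jumps two orders of
    magnitude longer (a bimodal kernel in the spirit of the abstract)."""
    scale = d_long if random.random() < p_long else d_short
    return random.choice((-1, 1)) * random.expovariate(1.0 / scale)

def front_positions(n_individuals=1000, n_generations=50, seed=1):
    """Yield the rightmost occupied position after each generation."""
    random.seed(seed)
    xs = [0.0] * n_individuals
    for _ in range(n_generations):
        xs = [x + bimodal_jump() for x in xs]
        yield max(xs)

front = list(front_positions())
print("front speed ~", round((front[-1] - front[0]) / (len(front) - 1), 2),
      "per generation")
```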
Abstract:
Graph pebbling is a network model for studying whether or not a given supply of discrete pebbles can satisfy a given demand via pebbling moves. A pebbling move across an edge of a graph takes two pebbles from one endpoint and places one pebble at the other endpoint; the other pebble is lost in transit as a toll. It has been shown that deciding whether a supply can meet a demand on a graph is NP-complete. The pebbling number of a graph is the smallest t such that every supply of t pebbles can satisfy every demand of one pebble. Deciding if the pebbling number is at most k is $\Pi_2^{\mathrm{P}}$-complete. In this paper we develop a tool, called the Weight Function Lemma, for computing upper bounds and sometimes exact values for pebbling numbers with the assistance of linear optimization. With this tool we are able to calculate the pebbling numbers of much larger graphs than in previous algorithms, and much more quickly as well. We also obtain results for many families of graphs, in many cases by hand, with much simpler and remarkably shorter proofs than given in previously existing arguments (certificates typically of size at most the number of vertices times the maximum degree), especially for highly symmetric graphs. Here we apply the Weight Function Lemma to several specific graphs, including the Petersen, Lemke, 4th weak Bruhat, Lemke squared, and two random graphs, as well as to a number of infinite families of graphs, such as trees, cycles, graph powers of cycles, cubes, and some generalized Petersen and Coxeter graphs. This partly answers a question of Pachter et al. by computing the pebbling exponent of cycles to within an asymptotically small range. It is conceivable that this method yields an approximation algorithm for graph pebbling.
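A brute-force Python sketch of the underlying decision problem (not the Weight Function Lemma itself): it checks whether a given pebble distribution can reach a target vertex, which is feasible only for tiny graphs given the NP-completeness noted above.

```python
def pebbling_reachable(adj, dist, target):
    """Can distribution `dist` (vertex -> pebble count, every vertex
    listed) place a pebble on `target`?  A move removes two pebbles
    from u and adds one to a neighbor v.  Exhaustive search;
    exponential in general."""
    seen = set()

    def search(state):
        if state[target] >= 1:
            return True
        key = tuple(sorted(state.items()))
        if key in seen:
            return False
        seen.add(key)
        for u, n in list(state.items()):
            if n >= 2:
                for v in adj[u]:
                    nxt = dict(state)
                    nxt[u] -= 2
                    nxt[v] += 1
                    if search(nxt):
                        return True
        return False

    return search(dict(dist))

# 4-cycle a-b-c-d: 4 pebbles on `a` reach the antipode `c`; 3 do not
# (the pebbling number of the 4-cycle is 4).
C4 = {"a": ["b", "d"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c", "a"]}
print(pebbling_reachable(C4, {"a": 4, "b": 0, "c": 0, "d": 0}, "c"))  # True
print(pebbling_reachable(C4, {"a": 3, "b": 0, "c": 0, "d": 0}, "c"))  # False
```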
Abstract:
Most network operators have considered reducing Label Switched Router (LSR) label spaces (i.e. the number of labels that can be used) as a means of simplifying management of the underlying Virtual Private Networks (VPNs) and, hence, reducing operational expenditure (OPEX). This letter discusses the problem of reducing the label spaces in Multiprotocol Label Switched (MPLS) networks using label merging, better known as MultiPoint-to-Point (MP2P) connections. Because of their origins in IP, MP2P connections have been considered to have tree shapes with Label Switched Paths (LSPs) as branches. Due to this fact, previous works by many authors affirm that the problem of minimizing the label space using MP2P in MPLS, the Merging Problem, cannot be solved optimally with a polynomial algorithm (NP-complete), since it involves a hard decision problem. However, in this letter the Merging Problem is analyzed from the perspective of MPLS, and it is deduced that tree shapes in MP2P connections are irrelevant. By overriding this tree-shape consideration, it is possible to perform label merging in polynomial time. Based on how MPLS signaling works, this letter proposes an algorithm to compute the minimum number of labels using label merging: the Full Label Merging algorithm. In conclusion, we reclassify the Merging Problem as polynomial-solvable instead of NP-complete. In addition, simulation experiments confirm that without the tree-branch selection problem, more labels can be saved.
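A simplified Python counting sketch of why merging shrinks label spaces: without merging, each LSP consumes its own label at every downstream node, whereas MP2P merging lets all LSPs bound for the same egress (the same FEC) share one. This is an illustrative model only, not the letter's Full Label Merging algorithm.

```python
from collections import defaultdict

def label_space(paths, merge=True):
    """Labels allocated per LSR for a set of LSPs (each a node list
    ending at its egress).  Simplified model: one label per LSP per
    downstream node without merging; one label per (node, FEC) with
    MP2P merging."""
    labels = defaultdict(set)
    for i, path in enumerate(paths):
        fec = path[-1]                  # destination egress acts as the FEC
        for node in path[1:]:           # each downstream node allocates a label
            labels[node].add(fec if merge else (i, node))
    return {node: len(s) for node, s in labels.items()}

# Three LSPs toward the same egress E sharing the tail C -> E.
lsps = [["A", "C", "E"], ["B", "C", "E"], ["D", "C", "E"]]
print(label_space(lsps, merge=False))  # {'C': 3, 'E': 3}
print(label_space(lsps, merge=True))   # {'C': 1, 'E': 1}
```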
Abstract:
A monoclonal antibody CC92 (IgM), raised against a fraction of rat liver enriched in Golgi membranes, recognizes a novel Endo H-resistant 74-kD membrane glycoprotein (gp74). The bulk of gp74 is confined to the cis-Golgi network (CGN). Outside the Golgi, gp74 is found in tubulovesicular structures and ER foci. In cells incubated at 37 degrees C the majority of gp74 is segregated from the intermediate compartment (IC) marker p58. However, in cells treated with organelle perturbants such as low temperature, BFA, and [AlF4]-, the patterns of the two proteins become indistinguishable. Both proteins are retained in the Golgi complex at 20 degrees C and in the IC at 15 degrees C. Incubation of cells with BFA results in relocation of gp74 to p58-positive IC elements. [AlF4]- induces the redistribution of gp74 from the Golgi to p58-positive vesicles and does not retard the translocation of gp74 to IC elements in cells treated with BFA. Disruption of microtubules by nocodazole results in the rapid disappearance of the Golgi elements stained by gp74 and redistribution of the protein into vesicle-like structures. These responses of gp74 to cell perturbants are in sharp contrast with those of cis/middle- and trans-Golgi resident proteins, whose location is not affected by low temperature or [AlF4]-, which are translocated to the ER upon addition of BFA, and which stay in slowly disintegrating Golgi elements in cells treated with nocodazole. The results suggest that gp74 is an itinerant protein that resides most of the time in the CGN and cycles through the ER/IC following the pathway used by p58.
Abstract:
Two graphs with adjacency matrices $\mathbf{A}$ and $\mathbf{B}$ are isomorphic if there exists a permutation matrix $\mathbf{P}$ for which the identity $\mathbf{P}^{\mathrm{T}} \mathbf{A} \mathbf{P} = \mathbf{B}$ holds. Multiplying through by $\mathbf{P}$ and relaxing the permutation matrix to a doubly stochastic matrix leads to the linear programming relaxation known as fractional isomorphism. We show that the levels of the Sherali--Adams (SA) hierarchy of linear programming relaxations applied to fractional isomorphism interleave in power with the levels of a well-known color-refinement heuristic for graph isomorphism called the Weisfeiler--Lehman algorithm, or, equivalently, with the levels of indistinguishability in a logic with counting quantifiers and a bounded number of variables. This tight connection has quite striking consequences. For example, it follows immediately from a deep result of Grohe in the context of logics with counting quantifiers that a fixed number of levels of SA suffice to determine isomorphism of planar and minor-free graphs. We also offer applications in both finite model theory and polyhedral combinatorics. First, we show that certain properties of graphs, such as that of having a flow circulation of a prescribed value, are definable in the infinitary logic with counting with a bounded number of variables. Second, we exploit a lower bound construction due to Cai, Fürer, and Immerman in the context of counting logics to give simple explicit instances that show that the SA relaxations of the vertex-cover and cut polytopes do not reach their integer hulls for up to $\Omega(n)$ levels, where $n$ is the number of vertices in the graph.
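The color-refinement heuristic mentioned above (1-dimensional Weisfeiler-Lehman) is easy to sketch in Python. The classic example below shows a pair of non-isomorphic graphs it cannot distinguish, exactly the situation where fractional isomorphism is blind as well.

```python
def wl_colors(adj, rounds=None):
    """1-dimensional Weisfeiler--Lehman (color refinement): repeatedly
    refine each vertex color by the multiset of its neighbors' colors,
    renaming canonically; return the sorted color multiset."""
    colors = {v: 0 for v in adj}
    for _ in range(rounds or len(adj)):
        sig = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
               for v in adj}
        palette = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        new = {v: palette[sig[v]] for v in adj}
        if new == colors:
            break
        colors = new
    return sorted(colors.values())

# The 6-cycle vs. two disjoint triangles: non-isomorphic, yet color
# refinement (and hence fractional isomorphism) cannot tell them apart.
C6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
TwoK3 = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(wl_colors(C6) == wl_colors(TwoK3))  # True
```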
Abstract:
The identification of biomarkers of vascular cognitive impairment is urgent for its early diagnosis. The aim of this study was to detect and monitor changes in brain structure and connectivity, and to correlate them with the decline in executive function. We examined the feasibility of early diagnostic magnetic resonance imaging (MRI) to predict cognitive impairment before onset in an animal model of chronic hypertension: Spontaneously Hypertensive Rats. Cognitive performance was tested in an operant conditioning paradigm that evaluated learning, memory, and behavioral flexibility skills. Behavioral tests were coupled with longitudinal diffusion weighted imaging acquired with 126 diffusion gradient directions and 0.3 mm³ isotropic resolution at 10, 14, 18, 22, 26, and 40 weeks after birth. Diffusion weighted imaging was analyzed in two different ways: by regional characterization of diffusion tensor imaging (DTI) indices, and by assessing changes in structural brain network organization based on Q-Ball tractography. Already at the earliest evaluated time points, DTI scalar maps revealed significant differences in many regions, suggesting loss of integrity in the white and gray matter of spontaneously hypertensive rats when compared to normotensive control rats. In addition, graph theory analysis of the structural brain network demonstrated a significant decrease in hierarchical modularity and in global and local efficiency, with predictive value as shown by a regional three-fold cross-validation study. Moreover, these decreases were significantly correlated with the behavioral performance deficits observed at subsequent time points, suggesting that diffusion weighted imaging and connectivity studies can reveal neuroimaging alterations even before overt signs of cognitive impairment become apparent.
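The network measures cited (global and local efficiency) are standard graph-theoretic quantities. Below is a minimal Python/networkx sketch on hypothetical toy graphs; the networks are random surrogates, not connectomes derived from the study's tractography.

```python
import networkx as nx

# Small-world surrogate standing in for an intact structural network.
G = nx.watts_strogatz_graph(30, 4, 0.1, seed=42)
# Crude surrogate for loss of integrity: delete a batch of edges.
H = G.copy()
H.remove_edges_from(list(H.edges())[:15])

for name, graph in [("intact", G), ("lesioned", H)]:
    print(name,
          "global efficiency:", round(nx.global_efficiency(graph), 3),
          "local efficiency:", round(nx.local_efficiency(graph), 3))
```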
Abstract:
The decisions of many individuals and social groups, taken according to well-defined objectives, are causing serious social and environmental problems, in spite of following the dictates of economic rationality. There are many examples of serious problems for which there are as yet no appropriate solutions, such as the management of scarce natural resources, including aquifer water, or the distribution of space among incompatible uses. In order to address these problems, the paper first characterizes the resources and goods involved from an economic perspective. Then, for each case, the paper notes that there is a serious divergence between individual and collective interests and, where possible, designs a procedure for resolving the conflict of interests. With this procedure, the real opportunities for the application of economic theory are shown, especially the theory of collective goods and externalities. The limitations of conventional economic analysis are identified, and the opportunity to correct the shortfalls is examined. Many environmental problems, such as climate change, have an impact on different generations that do not participate in present decisions. The paper shows that for these cases the solutions suggested by economic theory are not valid. Furthermore, conventional methods of economic valuation (which usually help decision-makers) are unable to account for the existence of different generations and tend to overlook long-term impacts. The paper analyzes how economic valuation methods could account for the costs and benefits enjoyed by present and future generations. The paper studies an appropriate consideration of preferences for future consumption and the incorporation of sustainability as a requirement in social decisions, which implies not only more efficiency but also a fairer distribution between generations than the one implied by conventional economic analysis.
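For reference, the conventional valuation criterion at issue is the discounted net-present-value rule below; the numerical aside illustrates how exponential discounting makes century-scale impacts nearly vanish.

```latex
% Conventional discounted net-present-value criterion:
\[
  \mathrm{NPV} \;=\; \sum_{t=0}^{T} \frac{B_t - C_t}{(1+\delta)^{t}},
\]
% where B_t and C_t are the benefits and costs borne in year t and
% \delta is the discount rate.  At \delta = 3\%, a net benefit accruing
% 100 years from now is weighted by (1.03)^{-100} \approx 0.05, which is
% how long-term, intergenerational impacts all but disappear.
```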