13 results for Weighted graph matching

in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance:

20.00%

Publisher:

Abstract:

This paper addresses the functional reliability and the complexity of reconfigurable antennas using graph models. The correlation between complexity and reliability for any given reconfigurable antenna is defined. Two methods are proposed to reduce failures and improve the reliability of reconfigurable antennas; the failures are caused by the reconfiguration technique or by the surrounding environment. The proposed failure-reduction methods are tested, and examples verifying them are given.

Relevance:

20.00%

Publisher:

Abstract:

The Sznajd model is a sociophysics model that is used to model opinion propagation and consensus formation in societies. Its main feature is that its rules favor bigger groups of agreeing people. In a previous work, we generalized the bounded confidence rule in order to model biases and prejudices in discrete opinion models. In that work, we applied this modification to the Sznajd model and presented some preliminary results. The present work extends what we did in that paper. We present results linking many of the properties of the mean-field fixed points, with only a few qualitative aspects of the confidence rule (the biases and prejudices modeled), finding an interesting connection with graph theory problems. More precisely, we link the existence of fixed points with the notion of strongly connected graphs and the stability of fixed points with the problem of finding the maximal independent sets of a graph. We state these results and present comparisons between the mean field and simulations in Barabasi-Albert networks, followed by the main mathematical ideas and appendices with the rigorous proofs of our claims and some graph theory concepts, together with examples. We also show that there is no qualitative difference in the mean-field results if we require that a group of size q > 2, instead of a pair, of agreeing agents be formed before they attempt to convince other sites (for the mean field, this would coincide with the q-voter model).
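The rule that "bigger groups of agreeing people" win can be made concrete with a minimal simulation of the basic two-site Sznajd rule on a ring (this sketch does not include the bounded-confidence generalization studied in the paper; all names are illustrative):

```python
import random

def sznajd_step(ops, rng):
    """One update of the two-site Sznajd rule on a ring: if a
    neighbouring pair agrees, it imposes its shared opinion on the
    two outer neighbours of the pair."""
    n = len(ops)
    i = rng.randrange(n)
    j = (i + 1) % n
    if ops[i] == ops[j]:             # agreeing pair convinces its neighbours
        ops[(i - 1) % n] = ops[i]
        ops[(j + 1) % n] = ops[i]

def simulate(n=50, steps=100_000, seed=1):
    """Run the model from a random +/-1 opinion state until consensus
    or the step budget is exhausted."""
    rng = random.Random(seed)
    ops = [rng.choice([-1, 1]) for _ in range(n)]
    for _ in range(steps):
        sznajd_step(ops, rng)
        if abs(sum(ops)) == n:       # full consensus reached
            break
    return ops
```

Note that a consensus state is absorbing: once every agent agrees, no update can change any opinion, which is the mean-field fixed-point behaviour the abstract refers to.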

Relevance:

20.00%

Publisher:

Abstract:

In this paper we address the "skull-stripping" problem in 3D MR images. We propose a new method that employs an efficient and unique histogram analysis. A fundamental component of this analysis is an algorithm for partitioning a histogram based on the position of the maximum deviation from a Gaussian fit. In our experiments we use a comprehensive image database, including both synthetic and real MRI, and compare our method with two other well-known methods, namely BSE and BET. For all datasets we achieved superior results. Our method is also highly independent of parameter tuning and very robust across considerable variations of noise ratio.
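The histogram-partitioning idea can be sketched as follows. This is an illustration only, assuming a Gaussian fitted by the sample mean and standard deviation and a split at the bin of maximum absolute deviation; the paper's exact criterion may differ:

```python
import numpy as np

def partition_histogram(values, bins=256):
    """Split a histogram at the bin where it deviates most from a
    Gaussian fitted to the data (illustrative sketch)."""
    hist, edges = np.histogram(values, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mu, sigma = values.mean(), values.std()
    # Gaussian density with the sample moments
    gauss = np.exp(-0.5 * ((centers - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    cut = int(np.argmax(np.abs(hist - gauss)))   # position of maximum deviation
    return edges[cut + 1]                        # threshold separating the two parts
```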

Relevance:

20.00%

Publisher:

Abstract:

The expansion of sugarcane growing in Brazil, spurred particularly by increased demand for ethanol, has triggered the need to evaluate the economic, social, and environmental impacts of this process, both on the country as a whole and on the growing regions. Even though the balance of costs and benefits is positive from an overall standpoint, this may not be so in specific producing regions, due to negative externalities. The objective of this paper is to estimate the effect of growing sugarcane on the human development index (HDI) and its sub-indices in cane producing regions. In the literature on matching effects, this is interpreted as the effect of the treatment on the treated. Location effects are controlled by spatial econometric techniques, giving rise to the spatial propensity score matching model. The authors analyze 424 minimum comparable areas (MCAs) in the treatment group, compared with 907 MCAs in the control group. The results suggest that the presence of sugarcane growing in these areas is not relevant to determine their social conditions, whether for better or worse. It is thus likely that public policies, especially those focused directly on improving education, health, and income generation/distribution, have much more noticeable effects on the municipal HDI.
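The "effect of the treatment on the treated" can be sketched as nearest-neighbour matching on precomputed propensity scores. This is a minimal illustration, not the authors' spatial propensity score matching model, which additionally controls for location effects; all names are hypothetical:

```python
def att_nearest_neighbour(scores_t, outcomes_t, scores_c, outcomes_c):
    """Average treatment effect on the treated (ATT): match each
    treated unit to the control unit with the closest propensity
    score, then average the outcome differences."""
    diffs = []
    for s, y in zip(scores_t, outcomes_t):
        # index of the control unit with the nearest propensity score
        j = min(range(len(scores_c)), key=lambda i: abs(scores_c[i] - s))
        diffs.append(y - outcomes_c[j])
    return sum(diffs) / len(diffs)

# Treated areas with scores/outcomes, matched against control areas
att = att_nearest_neighbour([0.8, 0.6], [10.0, 8.0],
                            [0.79, 0.61, 0.2], [9.0, 7.0, 1.0])
```

Matching on the score rather than comparing raw group means is what prevents treated areas from being compared with structurally dissimilar controls.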

Relevance:

20.00%

Publisher:

Abstract:

Clinicians frequently have to decide when dialysis should be initiated and which modality should be used to support kidney function in critically ill patients with acute kidney injury. In most instances, these decisions are made based on the consideration of a variety of factors including patient condition, available resources and prevailing local practice experience. There is a wide variation worldwide in how these factors influence the timing of initiation and the utilization of various modalities. In this article, we review the therapeutic goals of renal support and the relative advantages and shortcomings of different dialysis techniques. We describe strategies for matching the timing of initiation to the choice of modality to individualize renal support in intensive care unit patients. Copyright (C) 2012 S. Karger AG, Basel

Relevance:

20.00%

Publisher:

Abstract:

In this paper we study germs of polynomials formed by the product of semi-weighted homogeneous polynomials of the same type, which we call semi-weighted homogeneous arrangements. It is shown how the Lê numbers of such polynomials are computed using only their weights and degree of homogeneity. A key point of the main theorem is finding the number called the polar ratio of this polynomial class. An important consequence is the description of the Euler characteristic of the Milnor fibre of such arrangements, depending only on their weights and degree of homogeneity. The constancy of the Lê numbers in families formed by such arrangements is shown, with the deformed terms having weighted degree greater than the weighted degree of the initial germ. Moreover, using the results of Massey applied to families of function germs, we obtain the constancy of the homology of the Milnor fibre in this family of semi-weighted homogeneous arrangements.

Relevance:

20.00%

Publisher:

Abstract:

Let k and l be positive integers. With a graph G, we associate the quantity c(k,l)(G), the number of k-colourings of the edge set of G with no monochromatic matching of size l. Consider the function c(k,l) : N -> N given by c(k,l)(n) = max{c(k,l)(G) : |V(G)| = n}, the maximum of c(k,l)(G) over all graphs G on n vertices. In this paper, we determine c(k,l)(n) and the corresponding extremal graphs for all large n and all fixed values of k and l.
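For small graphs, c(k,l)(G) can be computed by brute force, which makes the definition concrete (exponential in the number of edges, so illustration only):

```python
from itertools import combinations, product

def has_matching(edges, l):
    """True if the edge list contains l pairwise-disjoint edges,
    i.e. a matching of size l."""
    for combo in combinations(edges, l):
        verts = [v for e in combo for v in e]
        if len(set(verts)) == 2 * l:   # all endpoints distinct
            return True
    return False

def c_kl(edges, k, l):
    """Number of k-colourings of the edge set with no monochromatic
    matching of size l (brute-force enumeration)."""
    count = 0
    for colouring in product(range(k), repeat=len(edges)):
        classes = [[e for e, c in zip(edges, colouring) if c == col]
                   for col in range(k)]
        if not any(has_matching(cls, l) for cls in classes):
            count += 1
    return count
```

For example, a triangle has only 3 vertices, so no colouring can contain a matching of size 2, and all 2^3 = 8 two-colourings are counted.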

Relevance:

20.00%

Publisher:

Abstract:

A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC(max). The output of GC(max) coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC(max) is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst case scenario, the GC(max) algorithm runs in linear time with respect to the variable M = |C| + |Z|, where |C| is the image scene size and |Z| is the size of the allowable range, Z, of the associated weight/affinity function. For most implementations, Z is identical to the set of allowable image intensity values, and its size can be treated as small with respect to |C|, meaning that O(M) = O(|C|). In such a situation, GC(max) runs in linear time with respect to the image size |C|. We show that the output of GC(max) constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the l-infinity norm ||F(P)||_inf of the map F(P) that associates, with every element e from the boundary of an object P, its weight w(e). This formulation brings IRFC algorithms to the realm of the graph cut energy minimizers, with energy functions ||F(P)||_q for q in [1, inf]. Of these, the best known minimization problem is for the energy ||F(P)||_1, which is solved by the classic min-cut/max-flow algorithm, referred to often as the Graph Cut algorithm. We notice that a minimization problem for ||F(P)||_q, q in [1, inf), is identical to that for ||F(P)||_1 when the original weight function w is replaced by w^q. 
Thus, any algorithm GC(sum) solving the ||F(P)||_1 minimization problem also solves one for ||F(P)||_q with q in [1, inf), so just two algorithms, GC(sum) and GC(max), are enough to solve all ||F(P)||_q-minimization problems. We also show that, for any fixed weight assignment, the solutions of the ||F(P)||_q-minimization problems converge to a solution of the ||F(P)||_inf-minimization problem (the fact that ||F(P)||_inf = lim_{q -> inf} ||F(P)||_q is not enough to deduce that). An experimental comparison of the performance of the GC(max) and GC(sum) algorithms is included. This concentrates on comparing the actual (as opposed to provable worst-scenario) running times, as well as the influence of the choice of seeds on the output.
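The family of cut energies can be illustrated directly. The sketch below (with hypothetical helper names) evaluates the q-norm energy of a given binary labelling of a weighted graph; q = 1 gives the classic min-cut energy, and q = infinity gives the maximum boundary weight that GC(max) minimizes:

```python
def boundary_weights(edges, weights, in_P):
    """Weights of the edges crossing the boundary of object P,
    i.e. edges with exactly one endpoint labelled inside P."""
    return [w for (u, v), w in zip(edges, weights) if in_P[u] != in_P[v]]

def energy(edges, weights, in_P, q):
    """q-norm cut energy of the labelling: q = 1 is the classic
    min-cut energy, q = float('inf') is the max-norm energy."""
    bw = boundary_weights(edges, weights, in_P)
    if q == float('inf'):
        return max(bw, default=0.0)
    return sum(w ** q for w in bw) ** (1.0 / q)
```

This also makes the w -> w^q reduction visible: raising the weights to the power q and minimizing the 1-norm energy orders labellings the same way as minimizing the q-norm energy.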

Relevance:

20.00%

Publisher:

Abstract:

In this paper, a new algebraic-graph method for identification of islanding in power system grids is proposed. The proposed method identifies all the possible cases of islanding, due to the loss of a piece of equipment, by means of a factorization of the bus-branch incidence matrix. The main features of this new method include: (i) simple implementation, (ii) high speed, (iii) real-time adaptability, (iv) identification of all islanding cases and (v) identification of the buses that compose each island in case of island formation. The method was successfully tested on large-scale systems such as the reduced south Brazilian system (45 buses/72 branches) and the south-southeast Brazilian system (810 buses/1340 branches). (C) 2011 Elsevier Ltd. All rights reserved.
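The paper identifies islands by factorizing the bus-branch incidence matrix; an equivalent way to see the result is as the connected components of the bus graph after the lost branch is removed. A minimal union-find sketch (names are illustrative):

```python
def find_islands(n_buses, branches, lost_branch):
    """Connected components (islands) of the bus graph after one
    branch is lost, via union-find with path compression."""
    parent = list(range(n_buses))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for br in branches:
        if br != lost_branch:               # drop the lost branch
            union(*br)

    comps = {}
    for bus in range(n_buses):
        comps.setdefault(find(bus), set()).add(bus)
    return list(comps.values())             # one set of buses per island
```

More than one returned component signals island formation, and each set lists the buses composing that island, matching features (iv) and (v) of the abstract.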

Relevance:

20.00%

Publisher:

Abstract:

Background: Support for the adverse effect of high income inequality on population health has come from studies that focus on larger areas, such as the US states, while studies at smaller geographical areas (e.g., neighbourhoods) have found mixed results.

Methods: We used propensity score matching to examine the relationship between income inequality and mortality rates across 96 neighbourhoods (distritos) of the municipality of São Paulo, Brazil.

Results: Prior to matching, higher income inequality distritos (Gini >= 0.25) had slightly lower overall mortality rates (2.23 per 10 000; 95% CI -23.92 to 19.46) compared to lower income inequality areas (Gini < 0.25). After propensity score matching, higher inequality was associated with a statistically significantly higher mortality rate (41.58 per 10 000; 95% CI 8.85 to 73.3).

Conclusion: In São Paulo, the more egalitarian communities are among the poorest, with the worst health profiles. Propensity score matching was used to avoid inappropriate comparisons between the health status of unequal (but wealthy) neighbourhoods and equal (but poor) neighbourhoods. Our methods suggest that, with proper accounting for heterogeneity between areas, income inequality is associated with worse population health in São Paulo.

Relevance:

20.00%

Publisher:

Abstract:

In this article, we propose a new Bayesian flexible cure rate survival model, which generalises the stochastic model of Klebanov et al. [Klebanov LB, Rachev ST and Yakovlev AY. A stochastic-model of radiation carcinogenesis - latent time distributions and their properties. Math Biosci 1993; 113: 51-75], and has much in common with the destructive model formulated by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP. Brazil, 2009 (accepted in Lifetime Data Analysis)]. In our approach, the accumulated number of lesions or altered cells follows a compound weighted Poisson distribution. This model is more flexible than the promotion time cure model in terms of dispersion. Moreover, it possesses an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest as it includes a destructive process of tumour cells after an initial treatment or the capacity of an individual exposed to irradiation to repair altered cells that results in cancer induction. In other words, what is recorded is only the damaged portion of the original number of altered cells not eliminated by the treatment or repaired by the repair system of an individual. Markov Chain Monte Carlo (MCMC) methods are then used to develop Bayesian inference for the proposed model. Also, some discussions on the model selection and an illustration with a cutaneous melanoma data set analysed by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP. Brazil, 2009 (accepted in Lifetime Data Analysis)] are presented.
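The destructive mechanism described above (an initial number of altered cells, part of which the treatment eliminates) can be sketched as a binomially thinned Poisson count; the paper's compound weighted Poisson distribution generalizes this simple special case. All names below are illustrative:

```python
import math
import random

def poisson(rate, rng):
    """One Poisson(rate) draw via Knuth's algorithm
    (fine for moderate rates)."""
    threshold = math.exp(-rate)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p < threshold:
            return k
        k += 1

def destructive_sample(rate, survive_prob, rng):
    """Number of damaged cells remaining after treatment: the initial
    altered cells are Poisson(rate), and each survives elimination or
    repair independently with probability survive_prob."""
    initiated = poisson(rate, rng)
    return sum(1 for _ in range(initiated) if rng.random() < survive_prob)
```

What is observable is only the thinned count, mirroring the abstract's point that only the damaged portion not eliminated by treatment or repair is recorded.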

Relevance:

20.00%

Publisher:

Abstract:

Background: Recently, it was realized that the functional connectivity networks estimated from actual brain-imaging technologies (MEG, fMRI and EEG) can be analyzed by means of graph theory, that is, a mathematical representation of a network that is essentially reduced to nodes and the connections between them.

Methods: We used high-resolution EEG technology to enhance the poor spatial resolution of the EEG activity on the scalp, yielding a measure of the electrical activity on the cortical surface. We then used the Directed Transfer Function (DTF), a multivariate spectral measure for the estimation of the directional influences between any given pair of channels in a multivariate dataset. Finally, a graph theoretical approach was used to model the brain networks as graphs. These methods were used to analyze the structure of cortical connectivity during the attempt to move a paralyzed limb in a group (N=5) of spinal cord injured (SCI) patients and during movement execution in a group (N=5) of healthy subjects.

Results: Analysis performed on the cortical networks estimated from the healthy group and the SCI patients revealed that both groups present few nodes with a high out-degree value (i.e. outgoing links). This property holds in the networks estimated for all the frequency bands investigated. In particular, the cingulate motor area (CMA) ROIs act as "hubs" for the outflow of information in both groups, SCI and healthy. Results also suggest that spinal cord injuries affect the functional architecture of the cortical network sub-serving the volition of motor acts mainly in its local feature properties. In particular, a higher local efficiency El can be observed in the SCI patients for three frequency bands: theta (3-6 Hz), alpha (7-12 Hz) and beta (13-29 Hz). By taking into account all the possible pathways between different ROI couples, we were able to clearly separate the network properties of the SCI group from those of the CTRL group. In particular, we report a sort of compensatory mechanism in the SCI patients for the theta (3-6 Hz) frequency band, indicating a higher level of "activation" Ω within the cortical network during the motor task. The activation index is directly related to diffusion, a type of dynamics that underlies several biological systems, including the possible spreading of neuronal activation across several cortical regions.

Conclusions: The present study demonstrates possible applications of graph theoretical approaches in the analysis of brain functional connectivity from EEG signals. In particular, the methodological aspects of (i) estimating cortical activity from scalp EEG signals, (ii) functional connectivity estimation and (iii) graph theoretical indexes are emphasized to show their impact in a real application.
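The out-degree "hub" computation on an estimated directed connectivity matrix can be sketched as follows (illustrative thresholding only; the DTF estimation itself is not shown, and all names are hypothetical):

```python
def out_degrees(conn, threshold=0.0):
    """Out-degree of each node in a directed connectivity matrix:
    conn[i][j] > threshold is read as an estimated influence
    from node i to node j (self-connections ignored)."""
    return [sum(1 for j, w in enumerate(row) if j != i and w > threshold)
            for i, row in enumerate(conn)]

def hubs(conn, top=1, threshold=0.0):
    """Indices of the `top` nodes with the highest out-degree,
    i.e. the strongest sources of outgoing links."""
    deg = out_degrees(conn, threshold)
    order = sorted(range(len(deg)), key=lambda i: -deg[i])
    return order[:top]
```

Nodes returned by `hubs` play the role of the outflow "hubs" (such as the CMA regions) described in the results.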

Relevance:

20.00%

Publisher:

Abstract:

We study the action of a weighted Fourier–Laplace transform on the functions in the reproducing kernel Hilbert space (RKHS) associated with a positive definite kernel on the sphere. After defining a notion of smoothness implied by the transform, we show that smoothness of the kernel implies the same smoothness for the generating elements (spherical harmonics) in the Mercer expansion of the kernel. We prove a reproducing property for the weighted Fourier–Laplace transform of the functions in the RKHS and embed the RKHS into spaces of smooth functions. Some relevant properties of the embedding are considered, including compactness and boundedness. The approach taken in the paper includes two important notions of differentiability characterized by weighted Fourier–Laplace transforms: fractional derivatives and Laplace–Beltrami derivatives.