921 results for Norm-Divergence
Abstract:
Adapting to blurred images makes in-focus images look too sharp, and vice versa (Webster et al., 2002 Nature Neuroscience 5 839-840). We asked how such blur adaptation is related to contrast adaptation. Georgeson (1985 Spatial Vision 1 103-112) found that grating contrast adaptation followed a subtractive rule: the perceived (matched) contrast of a grating was fairly well predicted by subtracting some fraction k (~0.3) of the adapting contrast from the test contrast. Here we apply that rule to the responses of a set of spatial filters at different scales and orientations. Blur is encoded by the pattern of filter response magnitudes over scale. We tested two versions - the 'norm model' and the 'fatigue model' - against blur-matching data obtained after adaptation to sharpened, in-focus or blurred images. In the fatigue model, filter responses are simply reduced by exposure to the adapter. In the norm model, (a) the visual system is pre-adapted to a focused world, and (b) the discrepancy between observed and expected responses to the experimental adapter leads to additional reduction (or enhancement) of filter responses during experimental adaptation. The two models are closely related, but only the norm model gave a satisfactory account of the results across the four experiments analysed, with a single free parameter k. This model implies that the visual system is pre-adapted to focused images, that adapting to in-focus or blank images produces no change in adaptation, and that adapting to sharpened or blurred images changes the state of adaptation, leading to changes in perceived blur or sharpness.
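The contrast between the two adaptation rules can be made concrete with a minimal sketch for a single filter's response magnitude. The function names, and the explicit "expected" focused-world response, are illustrative assumptions, not the authors' exact implementation; k ~ 0.3 follows Georgeson's estimate.

```python
def fatigue_response(test, adapter, k=0.3):
    # Fatigue model: the filter response is simply reduced by a
    # fraction k of the adapter-driven response.
    return max(0.0, test - k * adapter)

def norm_response(test, adapter, expected, k=0.3):
    # Norm model: only the discrepancy between the observed adapter
    # response and the pre-adapted (focused-world) expectation shifts
    # the state. A focused or blank adapter (adapter == expected)
    # changes nothing; a sharpened adapter (adapter > expected)
    # suppresses the response, a blurred one enhances it.
    return max(0.0, test - k * (adapter - expected))
```

Under this sketch, `norm_response(1.0, 0.5, 0.5)` leaves the test response unchanged, while `fatigue_response(1.0, 0.5)` still suppresses it, which is the qualitative difference the abstract describes.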
Abstract:
In this paper we propose a quantum algorithm to measure the similarity between a pair of unattributed graphs. We design an experiment where the two graphs are merged by establishing a complete set of connections between their nodes and the resulting structure is probed through the evolution of continuous-time quantum walks. In order to analyze the behavior of the walks without causing wave function collapse, we base our analysis on the recently introduced quantum Jensen-Shannon divergence. In particular, we show that the divergence between the evolution of two suitably initialized quantum walks over this structure is maximum when the original pair of graphs is isomorphic. We also prove that under special conditions the divergence is minimum when the sets of eigenvalues of the Hamiltonians associated with the two original graphs have an empty intersection.
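The two ingredients of this construction, the evolution of a continuous-time quantum walk and the quantum Jensen-Shannon divergence between density operators, can be sketched as follows. This is a minimal sketch, not the paper's exact experiment: it assumes the walk Hamiltonian is the graph adjacency matrix (work in this line typically uses the adjacency or Laplacian matrix), and the time-averaging scheme is illustrative.

```python
import numpy as np
from scipy.linalg import expm

def von_neumann_entropy(rho):
    # S(rho) = -Tr(rho log2 rho), computed from the eigenvalues.
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

def qjsd(rho, sigma):
    # Quantum Jensen-Shannon divergence between two density operators:
    # QJSD = S((rho+sigma)/2) - (S(rho) + S(sigma)) / 2.
    mix = 0.5 * (rho + sigma)
    return von_neumann_entropy(mix) - 0.5 * (
        von_neumann_entropy(rho) + von_neumann_entropy(sigma))

def walk_density(adj, psi0, times):
    # Time-averaged density operator of a continuous-time quantum walk
    # with Hamiltonian H = adjacency matrix: psi(t) = exp(-iHt) psi0.
    n = len(adj)
    rho = np.zeros((n, n), dtype=complex)
    for t in times:
        psi = expm(-1j * adj * t) @ psi0
        rho += np.outer(psi, psi.conj())
    return rho / len(times)
```

The divergence is 0 for identical states and reaches its maximum of 1 for orthogonal pure states, which is why it can serve as a bounded (dis)similarity measure between the two walks.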
Abstract:
In this paper we investigate the connection between quantum walks and graph symmetries. We begin by designing an experiment that allows us to analyze the behavior of the quantum walks on the graph without causing the wave function collapse. To achieve this, we base our analysis on the recently introduced quantum Jensen-Shannon divergence. In particular, we show that the quantum Jensen-Shannon divergence between the evolution of two quantum walks with suitably defined initial states is maximum when the graph presents symmetries. Hence, we assign to each pair of nodes of the graph a value of the divergence, and we average over all pairs of nodes to characterize the degree of symmetry possessed by a graph. © 2013 American Physical Society.
Abstract:
One of the most fundamental problems that we face in the graph domain is that of establishing the similarity, or alternatively the distance, between graphs. In this paper, we address the problem of measuring the similarity between attributed graphs. In particular, we propose a novel way to measure the similarity through the evolution of a continuous-time quantum walk. Given a pair of graphs, we create a derived structure whose degree of symmetry is maximum when the original graphs are isomorphic, and where a subset of the edges is labeled with the similarity between the respective nodes. With this compositional structure to hand, we compute the density operators of the quantum systems representing the evolution of two suitably defined quantum walks. We define the similarity between the two original graphs as the quantum Jensen-Shannon divergence between these two density operators, and then we show how to build a novel kernel on attributed graphs based on the proposed similarity measure. We perform an extensive experimental evaluation both on synthetic and real-world data, which shows the effectiveness of the proposed approach. © 2013 Springer-Verlag.
Abstract:
The quantum Jensen-Shannon divergence kernel [1] was recently introduced in the context of unattributed graphs, where it was shown to outperform several commonly used alternatives. In this paper, we study the separability properties of this kernel and we propose a way to compute a low-dimensional kernel embedding where the separation of the different classes is enhanced. The idea stems from the observation that the multidimensional scaling embeddings of this kernel show a strong horseshoe-shaped distribution, a pattern which is known to arise when long-range distances are not estimated accurately. Here we propose to use Isomap to embed the graphs, using only local distance information, into a new vector space with higher class separability. The experimental evaluation shows the effectiveness of the proposed approach. © 2013 Springer-Verlag.
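The pipeline described above, turning a kernel matrix into distances and then embedding with only local distance information, can be sketched in a self-contained way. This is a hedged illustration, not the authors' code: the kernel-to-distance step uses the standard induced metric d_ij² = k_ii + k_jj - 2 k_ij, and the Isomap step here is a minimal from-scratch version (k-nearest-neighbor graph, geodesic distances, classical MDS) with illustrative parameter choices.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def kernel_to_distance(K):
    # Distance induced by a kernel: d_ij^2 = k_ii + k_jj - 2 k_ij.
    d = np.diag(K)
    D2 = d[:, None] + d[None, :] - 2 * K
    return np.sqrt(np.maximum(D2, 0.0))

def isomap_embed(D, n_neighbors=3, n_components=2):
    # Minimal Isomap: keep only local (k-nearest-neighbor) distances,
    # chain them into geodesics via shortest paths, then apply
    # classical MDS to the geodesic distance matrix.
    n = D.shape[0]
    G = np.full_like(D, np.inf)          # inf = no edge (dense csgraph)
    for i in range(n):
        idx = np.argsort(D[i])[: n_neighbors + 1]  # includes i itself
        G[i, idx] = D[i, idx]
    G = np.minimum(G, G.T)               # symmetrize the kNN graph
    geo = shortest_path(G, method="D", directed=False)
    # Classical MDS: double-center the squared geodesic distances.
    H = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * H @ (geo ** 2) @ H
    w, V = np.linalg.eigh(B)
    order = np.argsort(w)[::-1][:n_components]
    return V[:, order] * np.sqrt(np.maximum(w[order], 0.0))
```

Because only nearest-neighbor distances enter the graph, the unreliable long-range distances that produce the horseshoe artifact never constrain the embedding directly.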
Abstract:
2002 Mathematics Subject Classification: 35J15, 35J25, 35B05, 35B50
Abstract:
The purpose of this article is to investigate in which ways multi-level actor cooperation advances national and local implementation processes of human rights norms in weak-state contexts. Examining the cases of women’s rights in Bosnia and Herzegovina and children’s rights in Bangladesh, we comparatively point to some advantages and disadvantages cooperative relations between international organisations, national governments and local NGOs can entail. Whereas these multi-level actor constellations (MACs) usually initiate norm implementation processes reliably and compensate governmental deficits, they are not always sustainable in the long run. If international organisations withdraw support from temporary missions or policy projects, local NGOs are not able to perpetuate implementation activities if state capacities have not been strengthened by MACs. Our aim is to highlight functions of local agency within multi-level cooperation and to critically raise sustainability issues in human rights implementation to supplement norm research in International Relations.
Abstract:
Analysis of risk measures associated with price series movements and their prediction is of strategic importance in financial markets, as well as to policy makers, in particular for short- and long-term planning when setting economic growth targets. For example, oil-price risk management focuses primarily on when and how an organization can best prevent costly exposure to price risk. Value-at-Risk (VaR) is the commonly practised instrument for measuring risk and is evaluated by analysing the negative/positive tail of the probability distribution of the returns (profit or loss). In modelling applications, least-squares estimation (LSE)-based linear regression models are often employed for modelling and analysing correlated data. These linear models are optimal and perform relatively well under conditions such as errors following normal or approximately normal distributions, freedom from large outliers, and satisfaction of the Gauss-Markov assumptions. However, in practical situations the LSE-based linear regression models often fail to provide optimal results, for instance in non-Gaussian settings, especially when the errors follow distributions with fat tails, even when the error terms possess a finite variance. This is the situation in risk analysis, which involves analysing tail distributions. Thus, applications of LSE-based regression models may be questioned for appropriateness and may have limited applicability. We have carried out a risk analysis of Iranian crude oil price data based on Lp-norm regression models and have noted that the LSE-based models do not always perform best. We discuss results from the L1-, L2- and L∞-norm based linear regression models. ACM Computing Classification System (1998): B.1.2, F.1.3, F.2.3, G.3, J.2.
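The contrast between L2 (least-squares) and L1 fits on fat-tailed data can be sketched with a small Lp-norm regression via iteratively reweighted least squares (IRLS). This is a generic illustration on synthetic data, not the authors' method or the Iranian oil-price data; the function name and parameter choices are illustrative.

```python
import numpy as np

def lp_fit(X, y, p=2.0, iters=100, eps=1e-8):
    # Lp-norm linear regression by IRLS: each pass solves a weighted
    # least-squares problem with weights |r|^(p-2), so p=2 is ordinary
    # least squares and p=1 approximates the least-absolute-deviations
    # (LAD) fit, which is far less sensitive to outliers.
    X1 = np.column_stack([np.ones(len(X)), X])   # add intercept column
    beta = np.linalg.lstsq(X1, y, rcond=None)[0]
    for _ in range(iters):
        r = np.abs(y - X1 @ beta) + eps          # eps avoids 0-division
        W = np.sqrt(r ** (p - 2))[:, None]
        beta = np.linalg.lstsq(W * X1, W[:, 0] * y, rcond=None)[0]
    return beta
```

On a line y = 2x with one large positive outlier, the L1 fit stays close to slope 2 while the L2 fit is dragged upward, which is exactly why tail-focused analyses such as VaR estimation may prefer non-L2 criteria.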
Abstract:
2000 Mathematics Subject Classification: 53C42, 53C55.
Abstract:
2000 Mathematics Subject Classification: 47A10, 47A12, 47A30, 47B10, 47B20, 47B37, 47B47, 47D50.
Abstract:
In this paper, we first present a simple but effective L1-norm-based two-dimensional principal component analysis (2DPCA). The traditional L2-norm-based least-squares criterion is sensitive to outliers, whereas the newly proposed L1-norm 2DPCA is robust. Experimental results demonstrate its advantages. © 2006 IEEE.
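The L1 criterion behind this family of methods can be illustrated with a vector PCA-L1 sketch: find a unit direction w maximizing the L1 dispersion sum_i |x_i · w| by a sign-flipping fixed-point iteration (in the style of Kwak's PCA-L1). This is a simplified vector version for illustration, not the paper's 2DPCA variant, which applies the same criterion directly to image matrices.

```python
import numpy as np

def pca_l1(X, iters=100, seed=0):
    # One projection direction maximizing sum_i |x_i . w| subject to
    # ||w|| = 1. Each step sets s_i = sign(x_i . w) and re-aligns
    # w with X^T s; the objective is non-decreasing at every step.
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(iters):
        s = np.sign(X @ w)
        s[s == 0] = 1.0            # break ties consistently
        w_new = X.T @ s
        w_new /= np.linalg.norm(w_new)
        if np.allclose(w_new, w):
            break                  # fixed point reached
        w = w_new
    return w
```

Because each sample contributes |x_i · w| rather than (x_i · w)², a single large outlier influences the direction linearly instead of quadratically, which is the source of the robustness claimed in the abstract.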
Abstract:
Tensor analysis plays an important role in modern image and vision computing problems. Most of the existing tensor analysis approaches are based on the Frobenius norm, which makes them sensitive to outliers. In this paper, we propose L1-norm-based tensor analysis (TPCA-L1), which is robust to outliers. Experimental results on face and other datasets demonstrate the advantages of the proposed approach. © 2006 IEEE.
Abstract:
The aim of this paper is to analyse the political, social and economic background of the divergence of the Belarusian and Ukrainian transitions. We focus on Belarus in order to explain why Lukashenko could remain the authoritarian leader of Belarus, while in Ukraine the position of the political elite proved less stable and collapsed in 2004. Within the theoretical framework of elite sociology, we seek to determine whether internal factors (such as macroeconomic conditions, the standard of living, the oppressive nature of the political system and the structure of the political elite) play a significant role in the operation of the domino effect. This article emphasises the determining role of immanent internal factors: the political stability of Belarus can be explained by the suppressive political regime, the hindrance of democratic rights and the relatively good living conditions that followed the transformational recession, whilst in Ukraine the markedly different circumstances brought about the success of the Orange Revolution.
Abstract:
Using a panel of 21 OECD countries and 40 years of annual data, we find that countries with similar government budget positions tend to have business cycles that fluctuate more closely. That is, fiscal convergence (in the form of persistently similar ratios of government surplus/deficit to GDP) is systematically associated with more synchronized business cycles. We also find evidence that reduced fiscal deficits increase business cycle synchronization. The Maastricht "convergence criteria," used to determine eligibility for EMU, encouraged fiscal convergence and deficit reduction. They may thus have indirectly moved Europe closer to an optimum currency area, by reducing countries’ abilities to create idiosyncratic fiscal shocks. Our empirical results are economically and statistically significant, and robust.
Abstract:
Using panels of 115 countries of the world (including 21 OECD countries) and 40 years of annual data, the authors find that countries with similar government budget positions tend to have business cycles that fluctuate more closely. Thus fiscal convergence (in the form of persistently similar ratios of government surplus/deficit to GDP) is systematically associated with more strongly synchronized business cycles. Evidence is also found that reduced fiscal deficits increase business-cycle synchronization. The Maastricht "convergence criteria", used to determine eligibility for EMU, encouraged fiscal convergence and deficit reduction. They may thus have indirectly moved Europe closer to an optimum currency area, by reducing countries' abilities to create idiosyncratic fiscal shocks. The empirical results of the study are economically and statistically significant, and robust.