949 results for Curse of dimensionality
Abstract:
The modelling of nonlinear stochastic dynamical processes from data involves solving the problems of data gathering, preprocessing, model architecture selection, learning or adaptation, parametric evaluation and model validation. For a given model architecture such as associative memory networks, a common problem in nonlinear modelling is "the curse of dimensionality". A series of complementary data-based constructive identification schemes, mainly based on but not limited to operating-point-dependent fuzzy models, are introduced in this paper with the aim of overcoming the curse of dimensionality. These include (i) a mixture of experts algorithm based on a forward constrained regression algorithm; (ii) an inherently parsimonious Delaunay input-space partition based piecewise locally linear modelling concept; (iii) a neurofuzzy model constructive approach based on forward orthogonal least squares and optimal experimental design; and finally (iv) a neurofuzzy model construction algorithm based on Bézier-Bernstein polynomial basis functions and the additive decomposition. Illustrative examples demonstrate their applicability, showing that the final major hurdle in data-based modelling has almost been removed.
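A one-line illustration of the combinatorial root of this curse for lattice-based networks (the partition counts below are invented for the example, not taken from the paper): with m fuzzy sets per input axis and d inputs, a full lattice needs

\[
  N_{\text{rules}} = m^{d},
  \qquad \text{e.g. } m = 5,\; d = 6 \;\Rightarrow\; N_{\text{rules}} = 5^{6} = 15\,625
\]

basis functions or rules, which is why all four schemes above seek structures whose size does not scale with the full lattice.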
Abstract:
This volume is based upon the 2nd IEEE European Workshop on Computer-Intensive Methods in Control and Signal Processing, held in Prague, August 1996.
Abstract:
We propose a theoretical model to explain empirical regularities related to the curse of natural resources. It is an explicitly political model which emphasizes the behavior and incentives of politicians. We extend the standard voting model to give voters political control beyond elections, which introduces a new restriction into our political economy model: policies should not give rise to a revolution. Our model clarifies when resource discoveries might lead to revolutions, namely in countries with weak institutions. Natural resources may be bad for democracy by harming political turnover. Our model also suggests a non-linear dependence of human capital on natural resources: for low levels of democracy, human capital depends negatively on natural resources, while for high levels of democracy the dependence is reversed. This theoretical finding is corroborated in both cross-section and panel data regressions.
Abstract:
Constant interest rate (CIR) projections are often criticized on the grounds that they are inconsistent with the existence of a unique equilibrium in a variety of forward-looking models. This note shows how to construct CIR projections that are not subject to that criticism, using a standard New Keynesian model as a reference framework.
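The "standard New Keynesian model" used as the reference framework is not spelled out in the abstract; a minimal sketch of the textbook two-equation version (notation assumed here, not taken from the note) is

\begin{align*}
  x_t   &= \mathbb{E}_t x_{t+1} - \tfrac{1}{\sigma}\left(i_t - \mathbb{E}_t \pi_{t+1} - r_t^{n}\right), \\
  \pi_t &= \beta\,\mathbb{E}_t \pi_{t+1} + \kappa\, x_t,
\end{align*}

where x_t is the output gap, \pi_t inflation and i_t the policy rate. Pegging i_t over the projection horizon removes the feedback from inflation to the policy rate, which is the usual source of the equilibrium indeterminacy that CIR projections are criticized for.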
Abstract:
Foreign aid provides a windfall of resources to recipient countries and may result in the same rent-seeking behavior as documented in the curse of natural resources literature. In this paper we discuss this effect and document its magnitude. Using data for 108 recipient countries in the period 1960 to 1999, we find that foreign aid has a negative impact on democracy. In particular, if the foreign aid over GDP that a country receives over a period of five years reaches the 75th percentile in the sample, then a 10-point index of democracy is reduced by between 0.6 and one point, a large effect. For comparison, we also measure the effect of oil rents on political institutions. The fall in democracy if oil revenues reach the 75th percentile is smaller (0.02). Aid is a bigger curse than oil.
Abstract:
This article illustrates Angela Carter's literary practice through her use of "Sleeping Beauty" in the radio play Vampirella and its prose variation The Lady of the House of Love. It argues that she vampirised European culture as she transfused old stories into new bodies to give them new life and bite. Carter's experiments with forms, genres and mediums in her vampire fiction capture the inherent hybridity of the fairy tale and shed new light on her main source, Charles Perrault's La Belle au bois dormant, bringing to the fore the horror and terror as well as the textual ambiguities of the French conte that were gradually obscured in favor of the romance element. Carter's vampire stories thus trace the 'dark' underside of the reception of the tale in Gothic fiction and in the subculture of comic books and Hammer films so popular in the 1970s, where the Sleeping Beauty figure is revived as a femme fatale or vamp who takes her fate into her own hands.
Abstract:
This Master's thesis presents a new unsupervised approach for detecting and segmenting urban regions in hyperspectral images. The proposed method requires three steps. First, in order to reduce the computational cost of our algorithm, a colour image of the spectral content is estimated. To this end, a nonlinear dimensionality reduction step, based on two complementary but conflicting criteria of good visualization, namely accuracy and contrast, is carried out to produce a colour display of each hyperspectral image. Then, to discriminate urban regions from non-urban regions, the second step consists of extracting a few discriminant (and complementary) features from this colour hyperspectral image. To this end, we extracted a series of discriminant parameters describing the characteristics of an urban area, mainly composed of manufactured objects with simple, regular geometric shapes. We used textural features based on grey levels, gradient magnitude or parameters derived from the co-occurrence matrix, combined with structural features based on the local orientation of the image gradient and the local detection of line segments. To further reduce the computational complexity of our approach and avoid the problem of the "curse of dimensionality" that arises when clustering high-dimensional data, we decided, in the last step, to classify each textural or structural feature individually with a simple K-means procedure and then to combine these coarse, cheaply obtained segmentations with an efficient segmentation-map fusion model. The experiments reported here show that this strategy is visually effective and compares favourably with other methods for detecting and segmenting urban areas from hyperspectral images.
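The last step above lends itself to a short sketch. The Python snippet below clusters each feature map individually with K-means and then fuses the coarse segmentations; the one-hot co-association fusion rule is a simplified stand-in for the thesis' segmentation-map fusion model, and all names are illustrative.

# Cluster each texture/structure feature map separately with K-means, then
# fuse the cheap coarse segmentations instead of clustering the full
# high-dimensional feature vector at once.
import numpy as np
from sklearn.cluster import KMeans

def segment_per_feature(feature_maps, k=2, seed=0):
    """feature_maps: list of (H, W) arrays, one per texture/structure cue."""
    labelings = []
    for fmap in feature_maps:
        km = KMeans(n_clusters=k, n_init=10, random_state=seed)
        labelings.append(km.fit_predict(fmap.reshape(-1, 1)))
    return labelings  # each entry: (H*W,) coarse segmentation

def fuse_segmentations(labelings, k=2, seed=0):
    """Describe each pixel by its cluster memberships across all coarse
    segmentations, then cluster that indicator vector once more."""
    onehots = [np.eye(k)[lab] for lab in labelings]
    stacked = np.hstack(onehots)          # (H*W, k * n_features)
    km = KMeans(n_clusters=k, n_init=10, random_state=seed)
    return km.fit_predict(stacked)        # final urban / non-urban labeling

# Toy usage: two synthetic 32x32 feature maps with a bright "urban" quadrant.
rng = np.random.default_rng(0)
maps = []
for _ in range(2):
    m = rng.normal(0.0, 1.0, (32, 32))
    m[:16, :16] += 4.0                    # the quadrant both cues agree on
    maps.append(m)
labels = fuse_segmentations(segment_per_feature(maps), k=2)
print(labels.reshape(32, 32)[:4, :4])     # top-left block is one class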
Abstract:
This report explores how recurrent neural networks can be exploited for learning high-dimensional mappings. Since recurrent networks are as powerful as Turing machines, an interesting question is how recurrent networks can be used to simplify the problem of learning from examples. The main problem with learning high-dimensional functions is the curse of dimensionality, which roughly states that the number of examples needed to learn a function increases exponentially with input dimension. The report proposes a way of avoiding this problem by using a recurrent network to decompose a high-dimensional function into many lower-dimensional functions connected in a feedback loop.
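A toy sketch of the decomposition idea (the step function below is fixed by hand purely for illustration; in the report a recurrent network would learn it from examples):

# Instead of learning one n-dimensional mapping f(x_1, ..., x_n), learn a
# low-dimensional step function g(state, x_i) and apply it in a feedback
# loop, consuming one input coordinate per step.

def recurrent_apply(g, x, state=0.0):
    """Fold the inputs through the 2-D step function g."""
    for xi in x:
        state = g(state, xi)   # feedback loop: output re-enters as state
    return state

# f(x) = sum of squares, an n-dimensional function, decomposed into
# n applications of the 2-dimensional step g(s, xi) = s + xi**2.
g = lambda s, xi: s + xi ** 2
x = [1.0, 2.0, 3.0, 4.0]
print(recurrent_apply(g, x))   # 30.0 == 1 + 4 + 9 + 16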
Abstract:
In this paper we consider the problem of approximating a function belonging to some function space Φ by a linear combination of n translates of a given function G. Using a lemma by Jones (1990) and Barron (1991), we show that it is possible to define function spaces and functions G for which the rate of convergence to zero of the error is O(1/n) in any number of dimensions. The apparent avoidance of the "curse of dimensionality" is due to the fact that these function spaces are more and more constrained as the dimension increases. Examples include spaces of the Sobolev type, in which the number of weak derivatives is required to be larger than the number of dimensions. We give results both for approximation in the L2 norm and in the L∞ norm. The interesting feature of these results is that, thanks to the constructive nature of Jones' and Barron's lemma, an iterative procedure is defined that can achieve this rate.
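The lemma invoked above is usually stated as follows (this is the standard Maurey-Jones-Barron form, not necessarily the paper's exact statement): if f lies in the closure of the convex hull of a set 𝒢 ⊂ L² with ‖g‖ ≤ b for all g ∈ 𝒢, then for every n there exist g₁, …, gₙ ∈ 𝒢 with

\[
  \Bigl\| f - \frac{1}{n}\sum_{i=1}^{n} g_i \Bigr\|_{L^2}^{2}
  \;\le\; \frac{b^{2} - \|f\|_{L^2}^{2}}{n},
\]

and the bound is achieved constructively by the greedy iteration f_n = \alpha_n f_{n-1} + (1 - \alpha_n)\, g_n, which appears to be the iterative procedure the abstract refers to.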
Abstract:
As an alternative to the present system of intermediation of the German savings surplus, this paper suggests that the risk-adjusted rate of return could be improved by creating a sovereign wealth fund for Germany (designated DESWF), which could invest excess German savings globally. Such a DESWF would offer German savers a secure vehicle paying a guaranteed positive minimum real interest rate, with a top-up when real investment returns allowed. The vehicle would invest the funds in a portfolio that is highly diversified by geography and asset class. Positive real returns can be expected in the long run based on positive real global growth. Since a significant amount of funds would in this case flow outside the euro area, the euro would depreciate, which would help crisis countries presently struggling to revive growth through exports and to close their external deficits so as to recoup their international creditworthiness. Target imbalances would gradually disappear, and German claims abroad would move from nominal claims on the ECB to diversified real and nominal claims on various private and public foreign entities in a variety of asset classes.
Abstract:
A connection is established between a fuzzy neural network model and the mixture of experts network (MEN) modelling approach. Based on this linkage, two new neuro-fuzzy MEN construction algorithms are proposed to overcome the curse of dimensionality that is inherent in the majority of associative memory networks and/or other rule-based systems. The first construction algorithm employs a function selection manager module in an MEN system. The second construction algorithm is based on a new parallel learning algorithm in which each model rule is trained independently, and for which the parameter convergence property of the new learning method is established. As with the first approach, an expert selection criterion is utilised in this algorithm. These two construction methods are equivalent in their effectiveness in overcoming the curse of dimensionality by reducing the dimensionality of the regression vector, but the latter has the additional computational advantage of parallel processing. The proposed algorithms are analysed for effectiveness, followed by numerical examples illustrating their efficacy on some difficult data-based modelling problems.
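As a hedged illustration of the dimensionality reduction an MEN affords (the gating form and expert input subsets below are generic stand-ins, not the paper's construction algorithms), each expert fits a short regression vector over a subset of inputs and a softmax gate mixes their outputs:

# Minimal mixture-of-experts sketch: each expert sees only a low-dimensional
# slice of the input, so no single model needs the full regression vector.
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def men_predict(x, experts, gate_w):
    """x: (d,) input; experts: list of (index_subset, weight_vector);
    gate_w: (n_experts, d) gating weights."""
    gates = softmax(gate_w @ x)                       # mixing proportions
    outputs = np.array([w @ x[idx] for idx, w in experts])
    return gates @ outputs                            # gated combination

# Two experts over a 4-D input, each using only 2 of the 4 coordinates.
experts = [(np.array([0, 1]), np.array([1.0, -0.5])),
           (np.array([2, 3]), np.array([0.3, 2.0]))]
gate_w = np.array([[1.0, 0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0, 1.0]])
x = np.array([0.5, 1.0, -0.2, 0.8])
print(men_predict(x, experts, gate_w))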
Abstract:
A common problem in many data-based modelling algorithms, such as associative memory networks, is the curse of dimensionality. In this paper, a new two-stage neurofuzzy system design and construction algorithm (NeuDeC) for nonlinear dynamical processes is introduced to tackle this problem effectively. A new, simple preprocessing method is initially derived and applied to reduce the rule base, followed by a fine model detection process based on the reduced rule set using forward orthogonal least squares model structure detection. In both stages, new A-optimality experimental-design-based criteria are used. In the preprocessing stage, a lower bound of the A-optimality design criterion is derived and applied as a subset selection metric, while in the later stage the A-optimality design criterion is incorporated into a new composite cost function that minimises model prediction error as well as penalises model parameter variance. The use of NeuDeC leads to unbiased model parameters with low parameter variance and the additional benefit of a parsimonious model structure. Numerical examples are included to demonstrate the effectiveness of this new modelling approach for high-dimensional inputs.
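The A-optimality ingredients can be written out explicitly (these are standard experimental-design facts; the composite weighting λ below is an assumed form, since the abstract does not give the exact cost). For a least squares problem y = Pθ + e with regression matrix P and noise variance σ²,

\[
  \operatorname{cov}(\hat{\theta}) = \sigma^{2}\,(P^{\top}P)^{-1},
  \qquad
  J_{A} = \operatorname{tr}\!\bigl[(P^{\top}P)^{-1}\bigr],
\]

so a composite cost such as J = ‖y − P\hat{θ}‖² + λ·tr[(PᵀP)⁻¹] penalises model parameter variance alongside the prediction error, as described above.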
Abstract:
An input variable selection procedure is introduced for the identification and construction of multi-input multi-output (MIMO) neurofuzzy operating-point-dependent models. The algorithm is an extension of a forward modified Gram-Schmidt orthogonal least squares procedure for a linear model structure, modified to accommodate nonlinear system modeling by incorporating piecewise locally linear model fitting. The proposed input node selection procedure effectively tackles the curse of dimensionality associated with lattice-based modeling algorithms such as radial basis function neurofuzzy networks, enabling the resulting neurofuzzy operating-point-dependent model to be widely applied in control and estimation. Some numerical examples are given to demonstrate the effectiveness of the proposed construction algorithm.
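A minimal sketch of the forward orthogonal least squares selection underlying such procedures (the classical error-reduction-ratio form; the paper's piecewise locally linear, operating-point-dependent extension is not reproduced here):

# Forward OLS subset selection: greedily pick the candidate regressor whose
# component orthogonal to the already-selected set explains the largest
# share of the output energy (error reduction ratio).
import numpy as np

def forward_ols(P, y, n_select):
    """P: (N, M) candidate regressors; y: (N,) output; returns selected indices."""
    selected, basis = [], []          # basis: orthogonalized chosen columns
    yty = y @ y
    for _ in range(n_select):
        best_err, best_j, best_w = -1.0, None, None
        for j in range(P.shape[1]):
            if j in selected:
                continue
            w = P[:, j].astype(float).copy()
            for q in basis:           # modified Gram-Schmidt step
                w -= (q @ w) / (q @ q) * q
            if w @ w < 1e-12:
                continue              # numerically dependent, skip
            err = (w @ y) ** 2 / ((w @ w) * yty)   # error reduction ratio
            if err > best_err:
                best_err, best_j, best_w = err, j, w
        selected.append(best_j)
        basis.append(best_w)
    return selected

# Toy usage: y depends on columns 0 and 3 of six candidates.
rng = np.random.default_rng(1)
P = rng.normal(size=(200, 6))
y = 2.0 * P[:, 0] - 1.5 * P[:, 3] + 0.05 * rng.normal(size=200)
print(forward_ols(P, y, n_select=2))   # expected: [0, 3] in some order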