19 results for Complex variable theory

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance:

100.00%

Publisher:

Abstract:

Economies are open complex adaptive systems far from thermodynamic equilibrium, and neo-classical environmental economics does not seem to be the best way to describe the behaviour of such systems. Standard econometric analysis (i.e. time series) takes a deterministic and predictive approach, which encourages the search for predictive policy to ‘correct’ environmental problems. Because of the characteristics of economic systems, however, an ex-post analysis seems more appropriate: one that describes the emergence of such systems’ properties and that sees policy as a social steering mechanism. With this background, some of the recent empirical work published in the field of ecological economics that follows the approach defended here is presented. Finally, the conclusion is reached that a predictive use of econometrics (i.e. time series analysis) in ecological economics should be limited to cases in which uncertainty decreases, which is not the normal situation when analysing the evolution of economic systems. That does not mean, however, that we should not use empirical analysis. On the contrary, it is to be encouraged, but from a structural and ex-post point of view.

Relevance:

90.00%

Publisher:

Abstract:

We describe one of the research lines of the Grup de Teoria de Funcions de la UAB UB, which deals with sampling and interpolation problems in signal analysis and their connections with complex function theory.
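A classical prototype of such a sampling and interpolation problem, quoted here only as an illustration and not as the group's specific result, is the Whittaker-Shannon expansion for band-limited signals in the Paley-Wiener space:

    \[
    f(t) = \sum_{n\in\mathbb{Z}} f\!\left(\tfrac{n}{2W}\right)\,\operatorname{sinc}\bigl(2Wt - n\bigr),
    \qquad \hat{f}(\xi) = 0 \ \text{for } |\xi| > W,
    \]

where the (suitably scaled) integers form both a sampling and an interpolation sequence. The general theory asks which other sequences have these properties, a question closely tied to the growth and zero distribution of entire functions of exponential type.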

Relevance:

80.00%

Publisher:

Abstract:

The main argument developed here is the proposal of the concept of “Social Multi-Criteria Evaluation” (SMCE) as a possible useful framework for the application of social choice to the difficult policy problems of our millennium, where, as stated by Funtowicz and Ravetz, “facts are uncertain, values in dispute, stakes high and decisions urgent”. This paper starts from the following main questions: 1. Why “Social” Multi-Criteria Evaluation? 2. How should such an approach be developed? The foundations of SMCE are set up by referring to concepts coming from complex system theory and philosophy, such as reflexive complexity, post-normal science and incommensurability. To give some operational guidelines on the application of SMCE, the basic questions to be answered are: 1. How is it possible to deal with technical incommensurability? 2. How can we deal with the issue of social incommensurability? Answering these questions, by drawing on theoretical considerations and lessons learned from real-world case studies, is the main objective of the present article.

Relevance:

80.00%

Publisher:

Abstract:

In this paper I review a series of theoretical concepts that are relevant for the integrated assessment of agricultural sustainability but that are not generally included in the curriculum of the various scientific disciplines dealing with quantitative analysis of agriculture. I first illustrate with plain narratives and concrete examples that sustainability is an extremely complex issue requiring the simultaneous consideration of several aspects, which cannot be reduced to a single indicator of performance. Next, I justify this need for multi-criteria analysis with theoretical concepts dealing with the epistemological predicament of complexity, starting from classic philosophical lessons and arriving at recent developments in complex system theory, in particular Rosen's theory of the modelling relation, which is essential to analyze the quality of any quantitative representation. The implications of these theoretical concepts are then illustrated with applications of multi-criteria analysis to the sustainability of agriculture. I wrap up by pointing out the crucial difference between "integrated assessment" and "integrated analysis". An integrated analysis is a set of indicators and analytical models generating an analytical output. An integrated assessment is much more than that. It is about finding an effective way to deal with three key issues: (i) legitimacy: how to handle the unavoidable existence of legitimate but contrasting points of view about the different meanings given by social actors to the word "development"; (ii) pertinence: how to handle in a coherent way scientific analyses referring to different scales and dimensions; and (iii) credibility: how to handle the unavoidable existence of uncertainty and genuine ignorance when dealing with the analysis of future scenarios.

Relevance:

80.00%

Publisher:

Abstract:

In this paper we study the disability transition probabilities (as well as the mortality probabilities) due to factors concurrent with age, such as income, gender and education. Although it is well known that ageing and socioeconomic status influence the probability of developing functional disorders, surprisingly little attention has been paid to the combined effect of those factors over individuals' lives and to how this affects the transition from one degree of disability to another. The assumption that tomorrow's disability state is only a function of today's state is very strong, since disability is a complex variable that depends on several elements other than time. This paper contributes to the field in two ways: (1) by attending to the distinction between the initial disability level and the process that leads to its course, and (2) by addressing whether and how education, age and income differentially affect the disability transitions. Using a discrete Markov chain model and a survival analysis, we estimate the probability, by year and individual characteristics, of a change in the state of disability, and the duration that its progression takes in each case. We find that people with an initial state of disability have a higher propensity to change and take less time to transit between stages. Men do so more frequently than women. Education and income have negative effects on transitions. Moreover, we consider the disability benefits associated with those changes along different stages of disability and therefore offer some clues on the potential savings of preventive actions that may delay or avoid those transitions. On pure cost considerations, preventive programmes aimed at improvement show higher benefits than those aimed at preventing deterioration and, in general terms, those focusing on individuals below 65 should go first. Finally, the trend of disability in Spain does not seem to change across years, and regional differences are not found.
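The Markov chain component can be illustrated with a minimal sketch (hypothetical data and state labels; the paper's actual estimation also conditions on covariates and combines the chain with survival analysis):

    import numpy as np

    # Hypothetical yearly disability states per individual:
    # 0 = no disability, 1 = moderate, 2 = severe.
    sequences = [
        [0, 0, 1, 1, 2],
        [0, 1, 1, 2, 2],
        [1, 1, 1, 1, 2],
    ]

    n_states = 3
    counts = np.zeros((n_states, n_states))
    for seq in sequences:
        for s_from, s_to in zip(seq[:-1], seq[1:]):
            counts[s_from, s_to] += 1   # count observed one-year transitions

    # Row-normalise to get the estimated transition matrix:
    # P[i, j] = Pr(state j next year | state i this year).
    P = counts / counts.sum(axis=1, keepdims=True)
    print(P)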

Relevance:

40.00%

Publisher:

Abstract:

Bimodal dispersal probability distributions with characteristic distances differing by several orders of magnitude have been derived and favorably compared to observations by Nathan [Nature (London) 418, 409 (2002)]. For such bimodal kernels, we show that two-dimensional molecular dynamics computer simulations are unable to yield accurate front speeds. Analytically, the usual continuous-space random walks (CSRWs) are applied to two dimensions. We also introduce discrete-space random walks and use them to check the CSRW results (because of the inefficiency of the numerical simulations). The physical results reported are shown to predict front speeds high enough to possibly explain Reid's paradox of rapid tree migration. We also show that, for a time-ordered evolution equation, fronts are always slower in two dimensions than in one dimension, and that this difference is important both for unimodal and for bimodal kernels.
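As an illustration (not the specific kernel fitted by Nathan and co-workers), a bimodal dispersal kernel of the kind discussed can be written as a mixture of two single-scale kernels with characteristic distances differing by orders of magnitude:

    \[
    \varphi(r) \;=\; p\,\varphi_{1}(r; d_{1}) \;+\; (1-p)\,\varphi_{2}(r; d_{2}),
    \qquad d_{1} \ll d_{2}, \quad 0 < p < 1,
    \]

where the small weight on the long-distance mode is what produces the fast fronts invoked to explain Reid's paradox.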

Relevance:

30.00%

Publisher:

Abstract:

Inductive learning aims at finding general rules that hold true in a database. Targeted learning seeks rules for the prediction of the value of a variable based on the values of others, as in the case of linear or non-parametric regression analysis. Non-targeted learning finds regularities without a specific prediction goal. We model the product of non-targeted learning as rules that state that a certain phenomenon never happens, or that certain conditions necessitate another. For all types of rules, there is a trade-off between the rule's accuracy and its simplicity. Thus rule selection can be viewed as a choice problem among pairs of degree of accuracy and degree of complexity. However, one cannot, in general, tell what the feasible set in the accuracy-complexity space is. Formally, we show that finding out whether a point belongs to this set is computationally hard. In particular, in the context of linear regression, finding a small set of variables that obtain a certain value of R^2 is computationally hard. Computational complexity may explain why a person is not always aware of rules that, if asked, she would find valid. This, in turn, may explain why one can change other people's minds (opinions, beliefs) without providing new information.
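The hard subproblem mentioned for linear regression can be sketched as follows (a brute-force illustration on hypothetical data; the point of the hardness result is precisely that no efficient general algorithm is expected, so exhaustive search over variable subsets blows up with the number of predictors):

    from itertools import combinations
    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 200, 12                      # observations, candidate predictors
    X = rng.normal(size=(n, m))
    y = X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=n)

    def r_squared(X_sub, y):
        """R^2 of an OLS fit of y on the columns of X_sub (plus an intercept)."""
        A = np.column_stack([np.ones(len(y)), X_sub])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

    # Decision problem: is there a set of at most k variables reaching R^2 >= target?
    k, target = 2, 0.8
    feasible = any(
        r_squared(X[:, list(S)], y) >= target
        for S in combinations(range(m), k)   # C(m, k) subsets: explodes with m
    )
    print(feasible)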

Relevance:

30.00%

Publisher:

Abstract:

A new algorithm for the diagonalization of matrices with a dominant diagonal is presented. Its effectiveness is shown in the treatment of non-symmetric matrices, with elements defined over the complex field, and even of large dimensions. The simplicity of the method is highlighted, as well as the ease of implementing it as programming code. Its advantages and limiting characteristics are discussed, together with some of the improvements that could be made. Finally, some numerical examples are shown.
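The algorithm itself is not detailed in this abstract; as a point of reference only, a baseline diagonalization of a diagonally dominant, non-symmetric complex matrix with standard library routines might look like this (hypothetical test matrix):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 6
    # Hypothetical non-symmetric complex test matrix, made diagonally dominant
    # by inflating the diagonal moduli above the off-diagonal row sums.
    A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    off_diag = np.abs(A).sum(axis=1) - np.abs(np.diag(A))
    A[np.diag_indices(n)] = (off_diag + 1.0) * np.exp(1j * np.angle(np.diag(A)))

    # Reference diagonalization A = V diag(w) V^{-1} via a standard solver.
    w, V = np.linalg.eig(A)
    print(np.allclose(A @ V, V * w))    # check the eigen-decomposition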

Relevance:

30.00%

Publisher:

Abstract:

Most studies analysing the impact of infrastructure on regional growth find a positive relationship between the two variables. However, the public capital elasticity estimated in a Cobb-Douglas function, which is the most common specification in these works, is sometimes too large to be credible, so the results have been partially dismissed. In the present paper we provide some new evidence on the real link between public capital and productivity for the Spanish regions in the period 1964-1991. Firstly, we find that the association between the two variables is smaller when controlling for regional effects, with industry being the sector that reaps the most benefit from an increase in infrastructure endowment. Secondly, the rigidity of the Cobb-Douglas function is overcome by using the variable expansion method. The expanded functional form reveals both the absence of a direct effect of infrastructure and the fact that the link between infrastructure and growth depends on the level of the existing stock (threshold level) and on the way infrastructure is articulated in its location relative to other factors. Finally, we analyse the importance of the spatial dimension of the infrastructure impact, due to spillover effects. In this sense, the paper provides evidence of the existence of spatial autocorrelation processes that may invalidate previous results.
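The Cobb-Douglas specification referred to typically augments the production function with the public capital stock G, and the elasticity in question is the exponent on G (a standard textbook form, shown here for orientation rather than as the paper's exact equation):

    \[
    Y_{it} = A\, K_{it}^{\alpha}\, L_{it}^{\beta}\, G_{it}^{\gamma}
    \;\;\Longrightarrow\;\;
    \ln Y_{it} = \ln A + \alpha \ln K_{it} + \beta \ln L_{it} + \gamma \ln G_{it} + \varepsilon_{it},
    \]

where \(\gamma\) is the public capital elasticity whose implausibly large estimates motivate the paper's alternative specifications.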

Relevance:

30.00%

Publisher:

Abstract:

It is very well known that the first successful valuation of a stock option was done by solving a deterministic partial differential equation (PDE) of the parabolic type with some complementary conditions specific to the option. In this approach, the randomness in the option value process is eliminated through a no-arbitrage argument. An alternative approach is to construct a replicating portfolio for the option. From this viewpoint the payoff function for the option is a random process which, under a new probabilistic measure, turns out to be of a special type, a martingale. Accordingly, the value of the replicating portfolio (equivalently, of the option) is calculated as an expectation, with respect to this new measure, of the discounted value of the payoff function. Since the expectation is, by definition, an integral, its calculation can be made simpler by resorting to powerful methods already available in the theory of analytic functions. In this paper we use precisely two of those techniques to find the well-known value of a European call.
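For reference, the martingale-pricing route described leads to the familiar closed form for the European call (the standard Black-Scholes result, quoted here for context; the article's contribution lies in the complex-analytic techniques used to evaluate the expectation):

    \[
    C \;=\; e^{-rT}\,\mathbb{E}^{\mathbb{Q}}\!\bigl[(S_T - K)^{+}\bigr]
    \;=\; S_0\,N(d_1) - K e^{-rT} N(d_2),
    \qquad
    d_{1,2} \;=\; \frac{\ln(S_0/K) + \bigl(r \pm \tfrac{1}{2}\sigma^{2}\bigr)T}{\sigma\sqrt{T}},
    \]

with N the standard normal cumulative distribution function.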

Relevance:

30.00%

Publisher:

Abstract:

We investigate adsorption of helium in nanoscopic polygonal pores at zero temperature using a finite-range density functional theory. The adsorption potential is computed by means of a technique denoted as the elementary source method. We analyze a rhombic pore with Cs walls, where we show the existence of multiple interfacial configurations at some linear densities, which correspond to metastable states. Shape transitions and hysteretic loops appear in patterns which are richer and more complex than in a cylindrical tube with the same transverse area.

Relevance:

30.00%

Publisher:

Abstract:

We discuss reality conditions and the relation between spacetime diffeomorphisms and gauge transformations in Ashtekar's complex formulation of general relativity. We produce a general theoretical framework for the stabilization algorithm for the reality conditions, which is different from Dirac's method of stabilization of constraints. We solve the problem of the projectability of the diffeomorphism transformations from configuration-velocity space to phase space, linking them to the reality conditions. We construct the complete set of canonical generators of the gauge group in the phase space which includes all the gauge variables. This result proves that the canonical formalism has all the gauge structure of the Lagrangian theory, including the time diffeomorphisms.
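For orientation (a standard textbook statement, up to sign and convention choices, rather than this paper's specific formulation), the complex Ashtekar connection is built from the spin connection compatible with the densitized triad and the extrinsic curvature, and the reality conditions require the metric variables to be real:

    \[
    A_a^i \;=\; \Gamma_a^i(E) \;-\; i\,K_a^i,
    \qquad
    \widetilde{E}^a_i \in \mathbb{R},
    \qquad
    A_a^i + \overline{A_a^i} \;=\; 2\,\Gamma_a^i(E),
    \]

conditions whose preservation under evolution is what a stabilization algorithm of the kind discussed must guarantee.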

Relevance:

30.00%

Publisher:

Abstract:

We demonstrate that the self-similarity of some scale-free networks with respect to a simple degree-thresholding renormalization scheme finds a natural interpretation in the assumption that network nodes exist in hidden metric spaces. Clustering, i.e., cycles of length three, plays a crucial role in this framework as a topological reflection of the triangle inequality in the hidden geometry. We prove that a class of hidden-variable models with underlying metric spaces is able to accurately reproduce the self-similarity properties that we measured in the real networks. Our findings indicate that hidden geometries underlying these real networks are a plausible explanation for their observed topologies and, in particular, for their self-similarity with respect to the degree-based renormalization.
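The degree-thresholding renormalization referred to can be sketched as follows (a minimal illustration on a synthetic scale-free graph using networkx; the paper works with real networks and hidden-metric-space models):

    import networkx as nx

    # Synthetic scale-free network as a stand-in for a real one.
    G = nx.barabasi_albert_graph(n=2000, m=3, seed=42)

    def degree_threshold(G, k_T):
        """Subgraph induced by the nodes of degree greater than k_T."""
        keep = [v for v, k in G.degree() if k > k_T]
        return G.subgraph(keep).copy()

    # Increasing the threshold produces a nested hierarchy of subgraphs;
    # self-similarity means their degree distributions (suitably rescaled)
    # collapse onto that of the original network.
    for k_T in (0, 3, 6, 12):
        H = degree_threshold(G, k_T)
        degrees = [k for _, k in H.degree()]
        print(k_T, H.number_of_nodes(), sum(degrees) / max(len(degrees), 1))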
