41 results for Analytical approximations
at the Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This technical report is a deliverable [D4.3 Report of the Interlinkages and forecasting prototype tool] of an EU project – DECOIN Project No. 044428 - FP6-2005-SSP-5A. The text is divided into 4 sections: (1) this short introductory section explains the purpose of the report; (2) the second section provides a general discussion of a systemic problem found in existing quantitative analyses of sustainability. It addresses the epistemological implications of complexity, which entail the need to deal with multiple scales and non-equivalent narratives (multiple dimensions/attributes) when defining sustainability issues. There is an unavoidable tension between a “steady-state view” (= the perception of what is going on now – reflecting a PAST → PRESENT view of reality) and an “evolutionary view” (= the unknown transformation to be expected in the process of becoming of the observed reality and of the observer – reflecting a PRESENT → FUTURE view of reality). The section ends by listing the implications of these points for the choice of integrated packages of sustainability indicators; (3) the third section illustrates the potential of the DECOIN toolkit for the study of sustainability trade-offs and linkages across indicators, using quantitative examples taken from case studies of another EU project (SMILE). In particular, this section starts by addressing the existence of internal constraints on sustainability (economic versus social aspects). The narrative chosen for this discussion focuses on the dark side of ageing and immigration for the economic viability of social systems. The section then explores external constraints on sustainability (economic development versus the environment).
The narrative chosen for this discussion focuses on the dark side of the current strategy of economic development based on externalization and the “bubbles disease”; (4) the last section presents a critical appraisal of the quality of the data found in energy statistics. It starts with a discussion of the general goal of statistical accounting. It then introduces the concept of multipurpose grammars. The second part uses the experience gained in the activities of the DECOIN project to answer the question: how useful are EUROSTAT energy statistics? The answer starts with an analysis of basic epistemological problems associated with the accounting of energy. This discussion leads to the acknowledgment of an important epistemological problem: the unavoidable bifurcations in the mechanism of accounting needed to generate energy statistics. Using numerical examples, the text deals with the following issues: (i) the pitfalls of the current system of accounting in energy statistics; (ii) a critical appraisal of the current system of accounting in BP statistics; (iii) a critical appraisal of the current system of accounting in Eurostat statistics. The section ends by proposing an innovative way to represent energy statistics that can be more useful for those wishing to develop sustainability indicators.
Abstract:
We show that nuclear C*-algebras have a refined version of the completely positive approximation property, in which the maps that approximately factorize through finite-dimensional algebras are convex combinations of order zero maps. We use this to show that a separable nuclear C*-algebra A which is closely contained in a C*-algebra B embeds into B.
Abstract:
We derive analytical expressions for the propagation speed of downward combustion fronts of thin solid fuels with a background flow initially at rest. The classical combustion model for thin solid fuels, which consists of five coupled reaction-convection-diffusion equations, is reduced here to a single equation with the gas temperature as the only variable. To do so we apply a two-zone combustion model that divides the system into a preheating region and a pyrolyzing region. The speed of the combustion front is obtained after matching the temperature and its derivative at the location that separates the two regions. We also derive a simplified version of this analytical expression, expected to be valid for a wide range of cases. Flame front velocities predicted by our analytical expressions agree well with experimental data found in the literature for a large variety of cases, and substantially improve on the results obtained from a previous well-known analytical expression.
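Schematically, and with illustrative notation not taken from the paper itself, the matching procedure described above works as follows: in the preheating region the steady convection-diffusion balance admits an exponential temperature profile, and equating the temperature and its derivative at the boundary with the pyrolyzing-region solution yields an implicit equation for the front speed U:

```latex
% Preheating region (x >= 0, front at x = 0, fresh material at x -> +infinity):
\alpha\, T'' + U\, T' = 0
\quad\Longrightarrow\quad
T(x) = T_{\infty} + (T_p - T_{\infty})\, e^{-(U/\alpha)\,x}, \qquad x \ge 0 .
% Matching the temperature and the heat flux at x = 0 with the
% pyrolyzing-region solution T_pyr gives the implicit equation for U:
T_{\mathrm{pyr}}(0) = T_p,
\qquad
T_{\mathrm{pyr}}'(0) = -\frac{U}{\alpha}\,\bigl(T_p - T_{\infty}\bigr) .
```

Here α is the thermal diffusivity, T∞ the ambient temperature and T_p the pyrolysis temperature; the paper's actual model is richer, but the matching step has this structure.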
Abstract:
A model-based approach for fault diagnosis is proposed, where fault detection is based on checking the consistency of the Analytical Redundancy Relations (ARRs) using an interval tool. The tool takes into account the uncertainty in the parameters and the measurements using intervals. Faults are explicitly included in the model, which allows additional information to be exploited. This information is obtained from partial derivatives computed from the ARRs. The signs of the residuals are used to prune the candidate space when performing the fault diagnosis task. The method is illustrated using a two-tank example, in which these aspects are shown to have an impact on diagnosis and fault discrimination, since the proposed method goes beyond structural methods.
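A minimal sketch of the interval-based consistency test described in this abstract, for a single hypothetical ARR of the form r = y − k·u with an uncertain gain k ∈ [k_lo, k_hi]; the names and numbers are illustrative, not taken from the paper. The residual interval follows from the interval endpoints, a fault is flagged when the interval excludes zero, and the residual sign remains available to prune fault candidates:

```python
def residual_interval(y_meas, u, k_lo, k_hi):
    """Interval of the residual r = y_meas - k*u for k in [k_lo, k_hi]."""
    candidates = [y_meas - k_lo * u, y_meas - k_hi * u]
    return min(candidates), max(candidates)

def consistent(interval, tol=0.0):
    """The ARR is consistent (no fault detected) if 0 lies in the interval."""
    lo, hi = interval
    return lo - tol <= 0.0 <= hi + tol

# Nominal behaviour: the measurement is explained by some admissible k.
iv = residual_interval(y_meas=2.1, u=1.0, k_lo=2.0, k_hi=2.2)
print(consistent(iv))   # True: 0 lies in [-0.1, 0.1]

# Faulty behaviour: no k in [2.0, 2.2] explains the measurement, and the
# strictly positive residual sign can be used to discard fault candidates.
iv = residual_interval(y_meas=3.0, u=1.0, k_lo=2.0, k_hi=2.2)
print(consistent(iv))   # False: interval is [0.8, 1.0]
```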
Abstract:
This paper derives approximations allowing the estimation of outage probability for standard irregular LDPC codes and full-diversity Root-LDPC codes used over nonergodic block-fading channels. Two separate approaches are discussed: a numerical approximation, obtained by curve fitting, for both code ensembles, and an analytical approximation for Root-LDPC codes, obtained under the assumption that the slope of the iterative threshold curve of a given code ensemble matches the slope of the outage capacity curve in the high-SNR regime.
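As an illustration of the curve-fitting idea mentioned above (not the paper's actual fit): in a block-fading channel with diversity order d, the outage probability behaves like P_out ∝ SNR^(−d) at high SNR, so log10 P_out is approximately linear in the SNR expressed in dB, and fitting a line to outage points recovers the slope and hence the diversity order. The data below are synthetic:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Synthetic high-SNR outage points for P_out = SNR^-2 (diversity order 2):
# log10(P_out) = -2 * snr_db / 10, i.e. a line of slope -0.2 per dB.
snr_db = [10.0, 15.0, 20.0, 25.0, 30.0]
log_pout = [-2.0 * s / 10.0 for s in snr_db]

slope, intercept = fit_line(snr_db, log_pout)
diversity = -10.0 * slope
print(diversity)  # 2.0
```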
Abstract:
We introduce a variation of the proof for weak approximations that is suitable for studying the densities of stochastic processes which are evaluations of the flow generated by a stochastic differential equation on a random variable that may be anticipating. Our main assumption is that the process and the initial random variable have to be smooth in the Malliavin sense. Furthermore, if the inverse of the Malliavin covariance matrix associated with the process under consideration is sufficiently integrable, then approximations for densities and distributions can also be achieved. We apply these ideas to the case of stochastic differential equations with boundary conditions and the composition of two diffusions.
Abstract:
In applied regional analysis, statistical information is usually published at different territorial levels with the aim of providing information of interest to different potential users. When using this information, there are two choices: first, to use normative regions (towns, provinces, etc.) or, second, to design analytical regions directly related to the phenomena under analysis. In this paper, provincial time series of unemployment rates in Spain are used to compare the results obtained by applying two analytical regionalisation models (a two-stage procedure based on cluster analysis and a procedure based on mathematical programming) with the normative regions available at two different scales: NUTS II and NUTS I. The results show that more homogeneous regions were designed when applying both analytical regionalisation tools. Two other interesting results are that the analytical regions were also more stable over time, and that scale affects the regionalisation process.
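A toy illustration of the homogeneity comparison reported above (the data and groupings are invented, not from the paper): candidate regionalisations can be scored by the average within-region variance of their provincial unemployment rates, and an analytical grouping of similar provinces scores lower (more homogeneous) than an administrative one:

```python
def within_variance(series_by_region):
    """Average within-region variance of unemployment rates.
    Lower values mean more homogeneous regions."""
    total, count = 0.0, 0
    for region in series_by_region:
        m = sum(region) / len(region)
        total += sum((x - m) ** 2 for x in region)
        count += len(region)
    return total / count

# Hypothetical provincial unemployment rates (%) grouped in two ways.
normative = [[8.0, 15.0], [9.0, 16.0]]    # administrative grouping
analytical = [[8.0, 9.0], [15.0, 16.0]]   # grouping by similarity

print(within_variance(normative))   # 12.25
print(within_variance(analytical))  # 0.25 -> far more homogeneous
```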
Abstract:
In this article we study buy-and-hold strategies for problems of optimizing final wealth in a multi-period setting. Since final wealth is a sum of dependent random variables, each corresponding to an amount of capital invested in a particular asset at a particular date, we first consider approximations that reduce the multivariate randomness to the univariate case. These approximations are then used to determine the buy-and-hold strategies that optimize, for a given probability level, the VaR and the CLTE of the distribution function of final wealth. This article complements the work of Dhaene et al. (2005), where constant rebalancing strategies were considered.
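One standard way to reduce the multivariate randomness of a sum to a univariate problem, of the kind alluded to above, is the comonotonic approximation: for comonotonic terms, the p-quantile (VaR) of the sum equals the sum of the individual p-quantiles. A small self-contained sketch with invented sample data, comonotonic by construction (this is a generic illustration, not the paper's method):

```python
def empirical_var(samples, p):
    """Crude empirical p-quantile (Value-at-Risk) of a sample."""
    s = sorted(samples)
    idx = min(int(p * len(s)), len(s) - 1)
    return s[idx]

# Two hypothetical asset-value samples, scenario-aligned so that both move
# up and down together (comonotonic by construction).
a = [1.0, 1.2, 0.9, 1.5, 1.1, 0.8, 1.3, 1.4, 1.05, 0.95]
b = [2.0, 2.5, 1.8, 3.0, 2.2, 1.6, 2.6, 2.8, 2.1, 1.9]

p = 0.9
# Univariate reduction: sum of individual p-quantiles.
approx = empirical_var(a, p) + empirical_var(b, p)
# Direct p-quantile of the scenario-wise sum; equal here by comonotonicity.
exact = empirical_var([x + y for x, y in zip(a, b)], p)
print(approx, exact)  # 4.5 4.5
```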
Abstract:
The real part of the optical potential for heavy-ion elastic scattering is obtained by double folding of the nuclear densities with a density-dependent nucleon-nucleon effective interaction which was successful in describing the binding, size, and nucleon separation energies of spherical nuclei. A simple analytical form is found to differ from the resulting potential by considerably less than 1% throughout the important region. This analytical potential is used so that only a few points of the folding need to be computed. With an imaginary part of the Woods-Saxon type, this potential predicts elastic scattering angular distributions in very good agreement with experimental data, and little renormalization (unity in most cases) is needed.
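For reference, the Woods-Saxon form mentioned for the imaginary part has a simple closed expression; the sketch below evaluates it with illustrative (not fitted) parameters, the defining feature being that the potential falls to half its central depth at the radius R0:

```python
import math

def woods_saxon(r, V0, R0, a):
    """Woods-Saxon form factor: V(r) = -V0 / (1 + exp((r - R0)/a))."""
    return -V0 / (1.0 + math.exp((r - R0) / a))

# Illustrative parameters: depth 50 MeV, radius 6 fm, diffuseness 0.6 fm.
V0, R0, a = 50.0, 6.0, 0.6
print(woods_saxon(0.0, V0, R0, a))  # close to -V0 (about -49.998 MeV)
print(woods_saxon(R0, V0, R0, a))   # exactly -V0/2 = -25.0 at r = R0
```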