16 results for subgrid-scale models

in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance: 100.00%

Publisher:

Abstract:

Postprint (published version)

Relevance: 80.00%

Publisher:

Abstract:

The Alt Pirineu Natural Park (PNAP) was created in 2003. The park is currently developing a network of Environmental Education (EA) itineraries. The Ecomuseu de la Vall d'Àneu (EVA) offers guided routes along the trail to the Monastery of Sant Pere del Burgal, already signposted and equipped by the PNAP, since it lies within the park's territory. The trail is easy to access and to walk, very wide along its first stretch, and offers a great variety of features of interest. The main objective of this project is to design a sensory EA itinerary adapted to groups with reduced mobility and to blind people. To this end, educational contents and materials are developed and the actions that will articulate them are defined. First, the feasibility of the itinerary was analysed by applying the assessment protocol designed by the Edukamb group. The score obtained was 74 points out of 100, confirming the suitability of the route for visitors. In designing the itinerary, the elements and processes of interest in the surroundings were identified; four sensory stops and a wooden walkway were proposed for the first stretch, and four stops with an anthropological component, together with the installation of a perimeter rope, for the second stretch. Finally, the installation of tactile three-dimensional models was also proposed: one at the start of the trail describing the features and stops of the itinerary, and an architectural model of the monastery's surroundings on arrival there. Improvements to the existing signage, alternatives and complements to the educational materials considered in the project, and the design of assessment protocols for itineraries adapted to all kinds of groups were also proposed.

Relevance: 80.00%

Publisher:

Abstract:

In this paper we examine the effect of tax policy on the relationship between inequality and growth in a two-sector non-scale model. In non-scale models, the long-run equilibrium growth rate is determined by technological parameters and is independent of macroeconomic policy instruments. However, this does not imply that fiscal policy is unimportant for long-run economic performance: it has important effects on the levels of key economic variables such as the per capita stock of capital and output. Hence, although the economy grows at the same rate across steady states, the bases for economic growth may differ. The model has three essential features. First, we explicitly model skill accumulation; second, we introduce government finance into the production function; and third, we introduce an income tax to mirror the fiscal events of the 1980s and 1990s in the US. Because the non-scale model is associated with higher-order dynamics, it replicates the distinctly non-linear nature of inequality in the US with relative ease. The results derived in this paper draw attention to the fact that the non-scale growth model not only fits the US data well in the long run (Jones, 1995b) but also possesses unique abilities in explaining short-term fluctuations of the economy. It is shown that during the transition the response of the relative simulated wage to changes in the tax code is rather non-monotonic, quite in accordance with the US inequality pattern of the 1980s and early 1990s. More specifically, we have analysed in detail the dynamics following the simulation of an isolated tax decrease and an isolated tax increase. After a tax decrease, the skill premium follows a lower trajectory than the one it would follow without the tax decrease; hence inequality is reduced for several periods after the fiscal shock.
On the contrary, following a tax increase, the skill premium remains above the trajectory it would follow with no tax increase. Consequently, a tax increase implies a higher level of inequality in the economy.


Relevance: 30.00%

Publisher:

Abstract:

We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
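The weighted log-ratio idea can be sketched in a few lines of NumPy: log-transform the table, double-center it with the row and column masses as weights, and take a weighted SVD. This is a minimal illustration under those assumptions; the function name and example data are hypothetical, not from the paper.

```python
import numpy as np

def weighted_log_ratio(N):
    """Weighted log-ratio analysis (spectral mapping) of a positive table N.

    Rows and columns are weighted by their margins, which restores
    distributional equivalence while keeping subcompositional coherence.
    """
    P = N / N.sum()                       # correspondence matrix
    r = P.sum(axis=1)                     # row masses (weights)
    c = P.sum(axis=0)                     # column masses (weights)
    L = np.log(P)                         # log transform (data must be > 0)
    # weighted double-centering of the log matrix
    Z = (L
         - np.outer(np.ones(len(r)), r @ L)
         - np.outer(L @ c, np.ones(len(c)))
         + r @ L @ c)
    # weighted SVD: scale by sqrt of the masses before decomposing
    S = np.sqrt(r)[:, None] * Z * np.sqrt(c)[None, :]
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    F = U / np.sqrt(r)[:, None] * s       # principal row coordinates
    return F, s
```

Because the log matrix is double-centered with the masses, one singular value is always (numerically) zero, exactly as in correspondence analysis.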

Relevance: 30.00%

Publisher:

Abstract:

The aim of this paper is twofold: firstly, to carry out a theoretical review of the most recent stated preference techniques used for eliciting consumers' preferences and, secondly, to compare the empirical results of two different stated preference discrete choice approaches. They differ in the measurement scale for the dependent variable and, therefore, in the estimation method, despite both using a multinomial logit. One of the approaches uses a complete ranking of full profiles (contingent ranking), that is, individuals must rank a set of alternatives from the most to the least preferred; the other uses a first-choice rule in which individuals must select the most preferred option from a choice set (choice experiment). From the results we realize how important the measurement scale for the dependent variable becomes and to what extent procedure invariance is satisfied.

Relevance: 30.00%

Publisher:

Abstract:

The intensity correlation functions C(t) for the colored-gain-noise model of dye lasers are analyzed and compared with those for the loss-noise model. For correlation times τ larger than the deterministic relaxation time td, we show with the use of the adiabatic approximation that C(t) values coincide for both models. For small correlation times we use a method that provides explicit expressions of non-Markovian correlation functions, approximating simultaneously short- and long-time behaviors. Comparison with numerical simulations shows excellent results simultaneously for short- and long-time regimes. It is found that, when the correlation time of the noise increases, differences between the gain- and loss-noise models tend to disappear. The decay of C(t) for both models can be described by a time scale that approaches the deterministic relaxation time. However, in contrast with the loss-noise model, a secondary time scale remains for large times for the gain-noise model, which could allow one to distinguish between both models.

Relevance: 30.00%

Publisher:

Abstract:

We consider the classical stochastic fluctuations of spacetime geometry induced by quantum fluctuations of massless nonconformal matter fields in the early Universe. To this end, we supplement the stress-energy tensor of these fields with a stochastic part, which is computed along the lines of the Feynman-Vernon and Schwinger-Keldysh techniques; the Einstein equation is therefore upgraded to a so-called Einstein-Langevin equation. We consider in some detail the conformal fluctuations of flat spacetime and the fluctuations of the scale factor in a simple cosmological model introduced by Hartle, which consists of a spatially flat isotropic cosmology driven by radiation and dust.

Relevance: 30.00%

Publisher:

Abstract:

We numerically study the dynamical properties of fully frustrated models in two and three dimensions. The results obtained support the hypothesis that the percolation transition of the Kasteleyn-Fortuin clusters corresponds to the onset of stretched exponential autocorrelation functions in systems without disorder. This dynamical behavior may be due to the large scale effects of frustration, present below the percolation threshold. Moreover, these results are consistent with the picture suggested by Campbell et al. [J. Phys. C 20, L47 (1987)] in the space of configurations.

Relevance: 30.00%

Publisher:

Abstract:

Uncorrelated random scale-free networks are useful null models to check the accuracy and the analytical solutions of dynamical processes defined on complex networks. We propose and analyze a model capable of generating random uncorrelated scale-free networks with no multiple and self-connections. The model is based on the classical configuration model, with an additional restriction on the maximum possible degree of the vertices. We check numerically that the proposed model indeed generates scale-free networks with no two- and three-vertex correlations, as measured by the average degree of the nearest neighbors and the clustering coefficient of the vertices of degree k, respectively.
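A rough sketch of such a generator: degrees are drawn from a truncated power law with the structural cutoff k_max = sqrt(N) that the abstract's restriction amounts to, and stubs are matched while rejecting self-loops and multiple edges. Function names, defaults, and the retry scheme are illustrative assumptions, not the paper's implementation.

```python
import random

def ucm_graph(n, gamma=3.0, kmin=2, max_tries=500):
    """Uncorrelated configuration model sketch: power-law degrees
    P(k) ~ k**-gamma truncated at the structural cutoff sqrt(n), then
    stub matching that rejects self-loops and multiple edges."""
    kmax = max(kmin + 1, int(n ** 0.5))   # structural cutoff on the degree
    ks = list(range(kmin, kmax + 1))
    w = [k ** -gamma for k in ks]
    for _ in range(max_tries):
        deg = random.choices(ks, weights=w, k=n)
        if sum(deg) % 2:                   # stub count must be even
            continue
        stubs = [i for i, k in enumerate(deg) for _ in range(k)]
        edges, attempts = set(), 20 * len(stubs)
        while stubs and attempts:
            attempts -= 1
            i, j = random.sample(range(len(stubs)), 2)
            a, b = stubs[i], stubs[j]
            e = (min(a, b), max(a, b))
            if a == b or e in edges:       # reject self-loop / multi-edge
                continue
            edges.add(e)
            for idx in sorted((i, j), reverse=True):
                stubs.pop(idx)
        if not stubs:                      # all stubs paired successfully
            return deg, edges
    raise RuntimeError("no valid matching found; retry or lower kmin")
```

A dead-ended matching (only invalid pairs left) is simply restarted, which is acceptable for moderate sizes; large-scale use would call for a smarter rewiring step.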

Relevance: 30.00%

Publisher:

Abstract:

The prediction of rockfall travel distance below a rock cliff is an indispensable activity in rockfall susceptibility, hazard and risk assessment. Although the size of the detached rock mass may differ considerably at each specific rock cliff, small rockfall (<100 m³) is the most frequent process. Empirical models may provide us with suitable information for predicting the travel distance of small rockfalls over an extensive area at a medium scale (1:100 000–1:25 000). "Solà d'Andorra la Vella" is a rocky slope located close to the town of Andorra la Vella, where the government has been documenting rockfalls since 1999. This documentation consists of mapping the release point and the individual fallen blocks immediately after the event. The documentation of historical rockfalls by morphological analysis, eye-witness accounts and historical images serves to increase the available information. In total, data from twenty small rockfalls have been gathered, comprising about a hundred individual fallen rock blocks. The data acquired have been used to check the reliability of the main empirical models widely adopted (reach and shadow angle models) and to analyse the influence of parameters affecting the travel distance (rockfall size, height of fall along the rock cliff and volume of the individual fallen rock block). For predicting travel distances in maps at medium scales, a method has been proposed based on the "reach probability" concept. The accuracy of the results has been tested against the line joining the farthest fallen boulders, which represents the maximum travel distance of past rockfalls. The paper concludes with a discussion of the application of both empirical models to other study areas.
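The shadow-angle model mentioned above reduces to one line of geometry: a block rarely travels beyond the line dipping at the minimum shadow angle from the cliff. The sketch below is a generic illustration with an assumed default angle, not a value calibrated for the Solà d'Andorra site.

```python
import math

def max_runout(height_of_fall_m, min_shadow_angle_deg=27.5):
    """Farthest horizontal travel distance under the shadow-angle model:
    tan(angle) = height / runout, so runout = height / tan(angle).
    The 27.5-degree default is illustrative only."""
    return height_of_fall_m / math.tan(math.radians(min_shadow_angle_deg))
```

For instance, with a 45-degree angle the predicted runout equals the height of fall; shallower angles stretch the shadow and lengthen the predicted travel distance.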

Relevance: 30.00%

Publisher:

Abstract:

Daily precipitation is recorded as the total amount of water collected by a rain-gauge in 24 h. Events are modelled as a Poisson process and the 24 h precipitation by a Generalized Pareto Distribution (GPD) of excesses. Hazard assessment is complete when estimates of the Poisson rate and the distribution parameters, together with a measure of their uncertainty, are obtained. The shape parameter of the GPD determines the support of the variable: the Weibull domain of attraction (DA) corresponds to finite-support variables, as should be the case for natural phenomena. However, the Fréchet DA has been reported for daily precipitation, which implies an infinite support and a heavy-tailed distribution. We use the fact that a log scale is better suited to the type of variable analyzed to overcome this inconsistency, thus showing that using the appropriate natural scale can be extremely important for proper hazard assessment. The approach is illustrated with precipitation data from the eastern coast of the Iberian Peninsula affected by severe convective precipitation. The estimation is carried out using Bayesian techniques.
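The peaks-over-threshold step described above can be sketched as follows, with simple method-of-moments estimates standing in for the paper's Bayesian scheme, and an optional log transform of the data as the abstract advocates. A sketch under those assumptions, valid when the shape parameter is below 1/2:

```python
import math
import statistics

def gpd_moments(excesses):
    """Method-of-moments estimates (shape xi, scale sigma) for a GPD
    fitted to threshold excesses: xi < 0 means finite support (Weibull
    domain of attraction), xi > 0 a heavy Frechet-type tail."""
    m = statistics.fmean(excesses)
    v = statistics.variance(excesses)
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (m * m / v + 1.0)
    return xi, sigma

def log_scale_excesses(daily_mm, threshold_mm):
    """Excesses of log-precipitation over the log-threshold, following
    the point that a log scale suits this kind of variable."""
    return [math.log(x) - math.log(threshold_mm)
            for x in daily_mm if x > threshold_mm]
```

Fitting the GPD to `log_scale_excesses(...)` rather than raw excesses is the scale change the abstract argues can move an apparent Fréchet tail back into the physically sensible Weibull domain.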

Relevance: 30.00%

Publisher:

Abstract:

Subsurface fluid flow plays a significant role in many geologic processes and is increasingly being studied at the scale of sedimentary basins and from a geologic time perspective. Many economic resources such as petroleum and mineral deposits are products of basin-scale fluid flow operating over large periods of time. Such ancient flow systems can be studied through analysis of diagenetic alterations and fluid inclusions to constrain the physical and chemical conditions of fluids and rocks during their paleohydrogeologic evolution. Basin simulation models are useful to complement the paleohydrogeologic record preserved in the rocks and to derive conceptual models of hydraulic basin evolution and the generation of economic resources. Different types of fluid flow regimes may evolve during basin evolution. The most important with respect to flow rates and capacity for transport of solutes and thermal energy is gravitational fluid flow driven by the topographic configuration of a basin. Such flow systems require the basin to be elevated above sea level. Consolidational fluid flow is the principal fluid migration process in basins below sea level, caused by loading of compressible rocks. Flow rates of such systems are several orders of magnitude below topography-driven flow. However, consolidation may create significant fluid overpressure. Episodic dewatering of overpressured compartments may cause sudden fluid release with elevated flow velocities and may cause a transient local thermal and chemical disequilibrium between fluid and rock. This paper gives an overview of subsurface fluid flow processes at basin scale and presents examples related to the Penedès basin in the central Catalan continental margin, including the offshore Barcelona half-graben, and the compressive South-Pyrenean basin.

Relevance: 30.00%

Publisher:

Abstract:

Many educators and educational institutions have yet to integrate web-based practices into their classrooms and curricula. As a result, it can be difficult to prototype and evaluate approaches to transforming classrooms from static endpoints to dynamic, content-creating nodes in the online information ecosystem. But many scholastic journalism programs have already embraced the capabilities of the Internet for virtual collaboration, dissemination, and reader participation. Because of this, scholastic journalism can act as a test-bed for integrating web-based sharing and collaboration practices into classrooms. Student Journalism 2.0 was a research project to integrate open copyright licenses into two scholastic journalism programs, to document outcomes, and to identify recommendations and remaining challenges for similar integrations. Video and audio recordings of two participating high school journalism programs informed the research. In describing the steps of our integration process, we note some important legal, technical, and social challenges. Legal worries such as uncertainty over copyright ownership could lead districts and administrators to disallow open licensing of student work. Publication platforms among journalism classrooms are far from standardized, making any integration of new technologies and practices difficult to achieve at scale. And teachers and students face challenges re-conceptualizing the role their class work can play online.

Relevance: 30.00%

Publisher:

Abstract:

We examine the scale invariants in the preparation of highly concentrated w/o emulsions at different scales and in varying conditions. The emulsions are characterized using rheological parameters, owing to their highly elastic behavior. We first construct and validate empirical models to describe the rheological properties. These models yield a reasonable prediction of experimental data. We then build an empirical scale-up model to predict the preparation and composition conditions that have to be kept constant at each scale to prepare the same emulsion. For this purpose, three preparation scales with geometric similarity are used. The parameter N·D^α, as a function of the stirring rate N, the scale (D, impeller diameter) and the exponent α (calculated empirically from the regression of all the experiments at the three scales), is defined as the scale invariant that needs to be optimized, once the dispersed phase of the emulsion, the surfactant concentration, and the dispersed-phase addition time are set. As far as we know, no other study has obtained a scale-invariant factor N·D^α for the preparation of highly concentrated emulsions prepared at three different scales, covering all three scales, different addition times and surfactant concentrations. The power-law exponent obtained seems to indicate that the scale-up criterion for this system is the power input per unit volume (P/V).
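The exponent in the invariant N·D^α can be estimated by a log-log regression across scales: if N·D^α is constant, then log N = c − α·log D, so α is minus the slope. A minimal sketch with a hypothetical data set, not the paper's fitted value:

```python
import math

def scale_up_exponent(runs):
    """Least-squares slope of log N versus log D over (N, D) pairs that
    produced the same emulsion; alpha is minus that slope, so that
    N * D**alpha stays (approximately) constant across scales."""
    xs = [math.log(d) for _, d in runs]
    ys = [math.log(n) for n, _ in runs]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope
```

As a rough cross-check on the abstract's P/V conclusion: in the turbulent regime P/V scales as N³D², so a constant power input per unit volume corresponds to α ≈ 2/3.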