181 results for Parametric Models


Relevance:

20.00%

Publisher:

Abstract:

We present a non-equilibrium theory for a system with heat and radiative fluxes. The resulting expression for the entropy production is applied to a simple one-dimensional climate model based on the first law of thermodynamics. In the model, the dissipative fluxes are assumed to be independent variables, following the criteria of Extended Irreversible Thermodynamics (EIT), which enlarges, relative to the classical expression, the applicability of a macroscopic thermodynamic theory to systems far from equilibrium. We analyze the second differential of the classical and of the generalized entropy as a criterion for the stability of the steady states. Finally, the extreme state is obtained using variational techniques, and we observe that the system is close to the maximum dissipation rate.
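As a minimal sketch of the framework the abstract invokes (standard EIT notation from the textbook literature, not the paper's own derivation): classical irreversible thermodynamics gives the local entropy production associated with a heat flux $\mathbf{J}_q$ as

$$\sigma_{\mathrm{CIT}} = \mathbf{J}_q \cdot \nabla\!\left(\frac{1}{T}\right) \;\geq\; 0,$$

while EIT promotes $\mathbf{J}_q$ to an independent variable and generalizes the entropy to

$$s(u,\mathbf{J}_q) = s_{\mathrm{eq}}(u) \;-\; \frac{\tau}{2\lambda T^{2}}\,\mathbf{J}_q\cdot\mathbf{J}_q,$$

with $\tau$ the relaxation (response) time and $\lambda$ the thermal conductivity. The sign of the second differential of this generalized entropy is what a stability analysis of the steady states examines.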

Relevance:

20.00%

Publisher:

Abstract:

Classic climatic models use constitutive laws without any response time. A more realistic approach to the natural processes governing climate dynamics must introduce response times for the heat and radiation fluxes. Extended irreversible thermodynamics (EIT) is a suitable thermodynamic framework for introducing such nonclassical constitutive laws. In the present study, EIT has been used to analyze a Budyko–Sellers one-dimensional energy-balance model developed by G. R. North. The results show self-sustained periodic oscillations when the response time is greater than a critical value. The high-frequency (a few kiloyears) damped and nondamped oscillations obtained can be related to abrupt climatic changes without any variation in the external forcing of the system.
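The mechanism can be illustrated with a toy zero-dimensional sketch (hypothetical parameters, not North's actual model): coupling a linear energy balance to a Cattaneo-type relaxation law for the heat flux turns the usual first-order decay into a damped oscillator once the response time `tau` exceeds a critical value (here `C/(4*k)`).

```python
import numpy as np

def simulate(tau, C=1.0, k=1.0, Q=1.0, dt=1e-3, t_end=50.0):
    """Forward-Euler integration of the toy system
        C dT/dt   = Q - F          (energy balance)
        tau dF/dt = k*T - F        (Cattaneo-type flux relaxation)
    All parameters are illustrative, not calibrated to any climate."""
    n = int(t_end / dt)
    T, F = 0.0, 0.0
    Ts = np.empty(n)
    for i in range(n):
        dT = (Q - F) / C
        dF = (k * T - F) / tau
        T += dt * dT
        F += dt * dF
        Ts[i] = T
    return Ts

T_eq = 1.0                    # equilibrium: F = Q  =>  T = Q/k
slow = simulate(tau=0.01)     # tau below C/(4k): monotone relaxation
fast = simulate(tau=5.0)      # tau above C/(4k): damped oscillation
print(slow.max() <= T_eq + 1e-6, fast.max() > T_eq)  # overshoot only for large tau
```

Eliminating `F` gives `tau*C*T'' + C*T' + k*T = Q`, a damped harmonic oscillator, which is why a sufficiently large response time produces the oscillatory behaviour described above (self-sustained rather than damped oscillations require the model's full nonlinearity).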

Relevance:

20.00%

Publisher:

Abstract:

Here I develop a model of a radiative-convective atmosphere in which both the radiative and the convective schemes are highly simplified. The atmospheric absorption of radiation at selective wavelengths is treated with constant mass absorption coefficients in finite-width spectral bands. The convective regime is introduced through a prescribed lapse rate in the troposphere. The main novelty of the radiative-convective model developed here is that it is solved without any angular approximation for the radiation field. The solution obtained in the purely radiative mode (i.e. with convection ignored) leads to multiple equilibria of stable states, very similar to some results recently found in simple models of planetary atmospheres. However, the introduction of convective processes removes these multiple equilibria. This shows the importance of taking convective processes into account even in qualitative analyses of planetary atmospheres.

Relevance:

20.00%

Publisher:

Abstract:

The second differential of the entropy is used to analyse the stability of a thermodynamic climatic model. A delay time for the heat flux is introduced, whereby the flux becomes an independent variable. Two different expressions for the second differential of the entropy are used: the first follows classical irreversible thermodynamics; the second, which accounts for the response time, comes from extended irreversible thermodynamics. The second differential of the classical entropy leads to unstable solutions for high values of the delay time, whereas the extended expression always implies stable states for an ice-free Earth. When the ice-albedo feedback is included, a discontinuous distribution of stable states is found for high response times. Following the thermodynamic analysis of the model, the maximum rates of entropy production at the steady state are obtained. A latitudinally isothermal Earth produces the extremum in global entropy production, while the material contribution to entropy production (that is, the production of entropy by material transport of heat) is a maximum when the latitudinal distribution of temperatures becomes less homogeneous than its present values.

Relevance:

20.00%

Publisher:

Abstract:

Within the European project Hydroptimet, part of the INTERREG IIIB-MEDOCC programme, an intercomparison of limited-area models (LAMs) is performed for intense events that caused severe damage to people and territory. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors for a good forecast of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event also known as the "Montserrat-2000" event. The study uses forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For the statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, the contingency-table elements made it possible to identify, for each model run, the regions of Catalonia affected by misses and false alarms. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method quantifies the spatial shift of the forecast error and identifies the error sources affecting each model's forecasts. High-resolution modelling and domain size seem to play a key role in providing a skilful forecast. Further work, including verification with a wider observational data set, is needed to support this statement.
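Contingency-table skill scores of the kind used for QPF verification can be sketched as follows (a generic illustration with made-up counts, not the Hydroptimet verification code):

```python
def skill_scores(hits, misses, false_alarms, correct_negatives):
    """Common non-parametric skill scores for a 2x2 contingency table of
    yes/no forecasts of precipitation above some threshold."""
    pod = hits / (hits + misses)                   # probability of detection
    far = false_alarms / (hits + false_alarms)     # false alarm ratio
    csi = hits / (hits + misses + false_alarms)    # critical success index
    n = hits + misses + false_alarms + correct_negatives
    # Equitable threat score: remove the hits expected by random chance
    hits_random = (hits + misses) * (hits + false_alarms) / n
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    return {"POD": pod, "FAR": far, "CSI": csi, "ETS": ets}

scores = skill_scores(hits=30, misses=10, false_alarms=20, correct_negatives=140)
print(scores)  # POD = 0.75, FAR = 0.4, CSI = 0.5, ETS = 0.4
```

In practice one table is built per model run and precipitation threshold, so the scores can be compared across models and thresholds.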

Relevance:

20.00%

Publisher:

Abstract:

This study examines publication models and the means of accessing scientific literature in the current environment of digital communication and the web. The text introduces the concept of the journal article as a well-defined and stable unit within the publishing world, and as the nucleus on which professional and scholarly communication has been based since its beginnings in the 17th century. The transformation of scientific communication enabled by the digital world is analysed. Descriptions are provided of some of the practices undertaken by authors, research organisations, publishers and library-related institutions in response to the new possibilities opening up for articles, both as products and in their creation and distribution processes. These transformations affect the very nature of the article as the minimal unit (both unique and stable) of scientific communication. The article concludes by noting that, under varying documentary forms of publisher aggregation and bibliographic control (sometimes simultaneous and even apparently contradictory), a more pluralistic type of scientific communication is flourishing. This pluralism offers more possibilities for communication among authors; fewer levels of intermediaries, such as agents that intervene in and add value to the products; greater availability for users, both economically and in terms of access; and greater interaction and richness of content, thanks to the new hypertext and multimedia possibilities.

Relevance:

20.00%

Publisher:

Abstract:

First, we examine the context of creation of special collection units in libraries, and the reasons why libraries compile archive materials and collections. Second, we focus on the techniques used in library environments to describe archive materials and collections and to guarantee their accessibility. We examine the models used in the United States and the United Kingdom to describe and access these materials, and the cooperative projects launched in these two countries in the past few years. Finally, we offer a preliminary analysis of how these types of materials are currently dealt with in Catalan libraries, and issue some recommendations to improve their archiving and access.

Relevance:

20.00%

Publisher:

Abstract:

This research study, carried out with specialist music teachers at the primary level, presents several models for the performance of songs, preceded by an account of the various elements that shape a song's character.

Relevance:

20.00%

Publisher:

Abstract:

The paper addresses the concept of multicointegration in a panel data framework. The proposal builds upon the panel data cointegration procedures developed in Pedroni (2004), for which we compute the moments of the parametric statistics. When individuals are either cross-section independent, or cross-section dependence can be removed by cross-section demeaning, our approach can be applied to the wider framework of mixed I(2) and I(1) stochastic processes. The paper also deals with the issue of cross-section dependence using approximate common factor models. Finite-sample performance is investigated through Monte Carlo simulations. Finally, we illustrate the use of the procedure by investigating the relationship between inventories, sales and production for a panel of US industries.

Relevance:

20.00%

Publisher:

Abstract:

Whereas numerical modeling using finite-element methods (FEM) can provide the transient temperature distribution in a component with sufficient accuracy, the development of compact dynamic thermal models that can be used for electrothermal simulation is of the utmost importance. While in most cases single power sources are considered, here we focus on the simultaneous presence of multiple sources. The thermal model takes the form of a thermal impedance matrix containing the thermal impedance transfer functions between two arbitrary ports. Each individual transfer function element is obtained from the analysis of the temperature transient at one node after a power step at another node. Different options for multiexponential transient analysis are detailed and compared. Among the options explored, small thermal models can be obtained by constrained nonlinear least squares (NLSQ) methods if the order is selected properly using validation signals. The methods are applied to the extraction of dynamic compact thermal models for a new ultrathin chip stack technology (UTCS).
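The multiexponential identification step can be sketched on synthetic data (the time constants and amplitudes below are made up, and a simplified linear-in-amplitudes fit on a fixed grid of time constants stands in for the paper's constrained NLSQ extraction):

```python
import numpy as np

# Synthetic thermal step response: two exponential modes (hypothetical values)
taus_true = np.array([0.1, 2.0])
amps_true = np.array([5.0, 15.0])
t = np.linspace(0.01, 10, 400)
z = sum(a * (1 - np.exp(-t / tau)) for a, tau in zip(amps_true, taus_true))

# Multiexponential fit: fix a logarithmic grid of candidate time constants
# and solve an ordinary linear least-squares problem for the amplitudes
taus_grid = np.logspace(-2, 1, 10)
basis = 1 - np.exp(-t[:, None] / taus_grid[None, :])
amps, *_ = np.linalg.lstsq(basis, z, rcond=None)
z_fit = basis @ amps

rms = np.sqrt(np.mean((z_fit - z) ** 2))
print(rms)  # small residual: the grid captures the true modes
```

A full NLSQ approach would also optimize the time constants and constrain the amplitudes; the model-order selection against validation signals mentioned in the abstract then decides how many exponential terms to keep.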

Relevance:

20.00%

Publisher:

Abstract:

Gas sensing systems based on low-cost chemical sensor arrays are gaining interest for the analysis of multicomponent gas mixtures. These sensors exhibit several problems, e.g., nonlinearities and a slow time response, which can be partially overcome by digital signal processing. Our approach is based on building a nonlinear inverse dynamic system. Results for different identification techniques, including artificial neural networks and Wiener series, are compared in terms of measurement accuracy.
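A minimal sketch of the inverse-dynamic-system idea (using a hypothetical first-order sensor with a static nonlinearity, not one of the paper's identified models): the inverse undoes the static nonlinearity first, then compensates the lag by differentiation.

```python
import numpy as np

# Toy sensor model (hypothetical): first-order lag with time constant TAU
# followed by a static square-root nonlinearity
TAU, DT = 0.5, 0.01
t = np.arange(0, 10, DT)
u = 1.0 + 0.5 * np.sin(1.5 * t)          # true gas concentration (arbitrary units)

# Forward (sensor) model: slow, nonlinear response
y = np.empty_like(u)
y[0] = u[0]
for i in range(1, len(t)):
    y[i] = y[i - 1] + DT * (u[i - 1] - y[i - 1]) / TAU
s = np.sqrt(y)                           # measured signal

# Inverse dynamic system: static inverse, then lag compensation u = y + TAU*dy/dt
y_hat = s ** 2
u_hat = y_hat + TAU * np.gradient(y_hat, DT)

err = np.sqrt(np.mean((u_hat[50:] - u[50:]) ** 2))   # skip edge effects
raw = np.sqrt(np.mean((y[50:] - u[50:]) ** 2))       # error of the raw lagged signal
print(round(err, 3), round(raw, 3))
```

In the paper's setting the inverse system is not known analytically and must be identified from data (e.g., with neural networks or Wiener series), but the goal is the same: a fast, linearized estimate of the input from the slow, nonlinear sensor output.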

Relevance:

20.00%

Publisher:

Abstract:

In this paper we highlight the importance of operational costs in explaining economic growth and analyze how the industrial structure affects the growth rate of the economy. If there is monopolistic competition only in an intermediate-goods sector, then production growth coincides with consumption growth. Moreover, the pattern of growth depends on the particular form of the operational cost. If the monopolistically competitive sector is the final-goods sector, then per capita production is constant but per capita effective consumption, or welfare, grows. Finally, we modify the industrial structure of the economy once more and show an economy with two different growth speeds, one for production and another for effective consumption. Thus, both the operational cost and the particular structure of the sector that produces the final goods ultimately determine the pattern of growth.

Relevance:

20.00%

Publisher:

Abstract:

This paper provides, from a theoretical and quantitative point of view, an explanation of why taxes on capital returns are high (around 35%) by analyzing the optimal fiscal policy in an economy with intergenerational redistribution. For this purpose, the government is modeled explicitly and can choose (and commit to) an optimal tax policy in order to maximize society's welfare. In an infinitely lived economy with heterogeneous agents, the long-run optimal capital tax is zero. If heterogeneity is due to the existence of overlapping generations, this result in general no longer holds. I provide sufficient conditions for zero capital and labor taxes, and show that a general class of preferences, commonly used in the macro and public finance literature, violates these conditions. For a version of the model calibrated to the US economy, the main results are as follows. First, if the government is restricted to a set of instruments, the observed fiscal policy cannot be disregarded as suboptimal, and capital taxes are positive and quantitatively relevant. Second, if the government can use age-specific taxes for each generation, then the age profile of the capital tax implies subsidizing the asset returns of the younger generations and taxing at higher rates the asset returns of the older ones.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we deal with the identification of dependencies between time series of equity returns. Marginal distribution functions are assumed to be known, and a bivariate chi-square goodness-of-fit test is applied in a fully parametric copula approach. Several families of copulas are fitted and compared using Spanish stock market data. The results show that the t-copula generally outperforms the other dependence structures, and highlight the difficulty of adjusting a significant number of bivariate data series.
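A tiny sketch of the copula-fitting idea (a simulated Gaussian copula and a rank-based parameter estimate; the paper itself fits several families, including the t-copula, to real data and tests them with a bivariate chi-square statistic):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "equity returns" with a Gaussian copula of correlation RHO
RHO, N = 0.6, 5000
cov = np.array([[1.0, RHO], [RHO, 1.0]])
x = rng.multivariate_normal([0.0, 0.0], cov, size=N)

# Probability-integral transform via empirical ranks -> pseudo-observations in (0, 1)
u = (np.argsort(np.argsort(x, axis=0), axis=0) + 0.5) / N

# Rank-based estimate of the Gaussian-copula parameter: for this family,
# Spearman's rho_S satisfies rho = 2*sin(pi*rho_S/6)
rho_s = np.corrcoef(u[:, 0], u[:, 1])[0, 1]
rho_hat = 2.0 * np.sin(np.pi * rho_s / 6.0)
print(round(rho_hat, 2))  # close to the true RHO = 0.6
```

Because the estimate works on ranks, it does not depend on the marginal distributions, which is what makes the copula approach attractive for equity returns with heavy-tailed margins.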