998 results for Waves -- Mathematical models -- Sau Reservoir (Catalonia)


Relevance:

100.00%

Publisher:

Abstract:

Hydrological modelling of catchments requires terrain data for a correct parameterization of the model. The input scale (degree of generalization) of the data influences the surface-runoff calculations performed in the simulation. HEC-1 is a widely used empirical model for hydrological simulation of catchments. It is currently available together with the WMS (Watershed Modeling System) software, which provides several tools that facilitate the processing of terrain data. The HEC-1 model was applied to the 66 km2 Canalda catchment (El Solsonés, Lleida) to assess the influence of generalizing the input parameters (land use, soil types, and subdivision into sub-catchments) on the computed surface runoff. Applying the model required a survey and study of the soils of the catchment, with special emphasis on their physical properties, distribution, and extent, yielding a soil map of the catchment at 1:50,000 scale. The input data were generalized from the 1:50,000 base scale, with additional simulations at the 1:100,000 and 1:200,000 scales. The simulations show that the degree of generalization of the input data affects the outflow hydrograph of the catchment: generalizing the data to the scales mentioned produces a delay in the hydrograph and a reduction in the total yield and the peak discharge.
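HEC-1 computes rainfall losses with, among other options, the SCS curve-number method, which is where generalized soil and land-use maps enter the simulation. A minimal sketch of that method (the curve numbers and the 50 mm storm below are illustrative assumptions, not values from the study):

```python
def scs_runoff(precip_mm, cn):
    """Direct runoff depth (mm) from the SCS curve-number method."""
    s = 25400.0 / cn - 254.0   # potential maximum retention (mm)
    ia = 0.2 * s               # initial abstraction (standard 0.2 * S)
    if precip_mm <= ia:
        return 0.0
    return (precip_mm - ia) ** 2 / (precip_mm - ia + s)

# Generalizing the soil/land-use maps changes the effective curve number:
for cn in (70, 75, 80):        # hypothetical curve numbers
    print(cn, round(scs_runoff(50.0, cn), 1))
```

A coarser map that lumps soil classes together shifts CN, and the runoff depth responds nonlinearly, which is consistent with the sensitivity the abstract reports.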

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study is to define a new statistic, PVL, based on the relative distance between the likelihood associated with the simulation replications and the likelihood of the conceptual model. Results from several simulation experiments of a clinical trial show that the range of the PVL statistic can be a good measure of stability for establishing when a computational model verifies the underlying conceptual model. PVL also improves the analysis of simulation replications because a single statistic is associated with all the replications. The study further presents several verification scenarios, obtained by altering the simulation model, which show the usefulness of PVL. Additional simulation experiments suggest that a 0 to 20% range may define adequate limits for the verification problem when it is considered from the viewpoint of an equivalence test.
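The abstract does not reproduce the formal definition of PVL; as a purely illustrative sketch, assume it is the percentage distance between the pooled log-likelihood of the replications and the log-likelihood of the conceptual model (the function name and the log-likelihood values below are hypothetical, not from the paper):

```python
def pvl(loglik_model, loglik_reps):
    """Percentage distance between the pooled replication log-likelihood
    and the conceptual model's log-likelihood (one statistic for all reps)."""
    pooled = sum(loglik_reps) / len(loglik_reps)
    return 100.0 * abs(pooled - loglik_model) / abs(loglik_model)

# Hypothetical model log-likelihood and five replication log-likelihoods
print(round(pvl(-120.0, [-118.5, -123.0, -121.2, -119.8, -122.1]), 3))
```

Under the 0-20% guideline mentioned above, a value this small would fall comfortably inside the verification band.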

Relevance:

100.00%

Publisher:

Abstract:

Improving educational quality is an important public policy goal. However, its success requires identifying the factors associated with student achievement. At the core of these proposals lies the principle that increased public school quality can make the school system more efficient, resulting in correspondingly stronger student performance. Nevertheless, the public educational system is not devoid of competition, which arises, among other factors, from the efficiency of management and the geographical location of schools. Moreover, families in Spain appear to choose a school on the grounds of location. In this setting, the objective of this paper is to analyze whether geographical space affects the relationship between the technical quality of public schools (measured by an efficiency score) and the school demand index. To do this, an empirical application is performed on a sample of 1,695 public schools in the region of Catalonia (Spain). The application shows the effects of spatial autocorrelation on the estimation of the parameters and how these problems are addressed through spatial econometric models. The results confirm that space has a moderating effect on the relationship between efficiency and school demand, although only in urban municipalities.
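The spatial autocorrelation that complicates the estimation can be quantified with the standard Moran's I statistic; a minimal sketch (the four-unit data and the contiguity matrix are toy assumptions, not the Catalan sample):

```python
def morans_i(values, weights):
    """Moran's I spatial autocorrelation: values[i] per spatial unit,
    weights[i][j] spatial weight between units i and j (0 on the diagonal)."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(weights[i][j] for i in range(n) for j in range(n))
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

# Toy example: four units on a line with binary contiguity weights
vals = [1.0, 2.0, 3.0, 4.0]
W = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
print(morans_i(vals, W))
```

A value well above its expectation of -1/(n-1) signals the positive spatial clustering that ordinary least squares would ignore, which is why spatial econometric models are needed.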

Relevance:

100.00%

Publisher:

Abstract:

Some bilingual societies exhibit a distribution of language skills that cannot be explained by economic theories that portray languages as pure communication devices. Such distributions of skills are typically the result of public policies that promote bilingualism among members of both speech communities (reciprocal bilingualism). In this paper I argue that these policies are likely to increase social welfare by diminishing economic and social segmentation between the two communities. However, these gains tend to be unequally distributed across the two communities. As a result, in a large range of circumstances these policies might not draw sufficient support. The model is built upon the communicative value of languages, but also emphasizes the role of linguistic preferences in the behavior of bilingual individuals.

Relevance:

100.00%

Publisher:

Abstract:

We analyze a unidimensional model of two-candidate electoral competition in which voters have imperfect information about the candidates' policy proposals; that is, voters cannot observe the exact policy proposals of the candidates but only which candidate offers the more leftist/rightist platform. We assume that candidates are purely office-motivated and that one candidate enjoys a valence advantage over the other. We characterize the unique sequential equilibrium in very-weakly undominated strategies of the game. In this equilibrium the behavior of the two candidates tends to maximum extremism, due to the voters' lack of information, but it may converge or diverge depending on the size of the advantage. For small values of the advantage, candidates converge to the extreme policy most preferred by the median voter; for large values, the candidates' strategies diverge: each candidate specializes in a different extreme policy. These results are robust to the introduction of a proportion of well-informed voters; in this case the degree of extremism decreases as the voters become more informed.

Relevance:

100.00%

Publisher:

Abstract:

A version of Matheron's discrete Gaussian model is applied to cell composition data. The examples are for map patterns of felsic metavolcanics in two different areas. Q-Q plots of the model for cell values representing the proportion of a 10 km x 10 km cell area underlain by this rock type are approximately linear, and the line of best fit can be used to estimate the parameters of the model. It is also shown that the felsic metavolcanics in the Abitibi area of the Canadian Shield can be modeled as a fractal.
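Estimating parameters from the line of best fit on a normal Q-Q plot can be sketched as follows (the data values are hypothetical cell proportions, not the map data of the study; the intercept and slope of the fitted line approximate the Gaussian location and scale):

```python
from statistics import NormalDist

def qq_line_fit(data):
    """Fit a least-squares line to the normal Q-Q plot of `data`:
    returns (intercept ~ mu, slope ~ sigma) of the fitted Gaussian."""
    xs = sorted(data)
    n = len(xs)
    # theoretical standard-normal quantiles at plotting positions (i + 0.5)/n
    qs = [NormalDist().inv_cdf((i + 0.5) / n) for i in range(n)]
    mq, mx = sum(qs) / n, sum(xs) / n
    slope = (sum((q - mq) * (x - mx) for q, x in zip(qs, xs))
             / sum((q - mq) ** 2 for q in qs))
    return mx - slope * mq, slope

# Hypothetical cell values (proportions of cell area underlain by the unit)
mu, sigma = qq_line_fit([0.12, 0.18, 0.25, 0.31, 0.40, 0.47, 0.55, 0.61])
print(round(mu, 4), round(sigma, 4))
```

An approximately linear Q-Q plot, as reported in the abstract, is exactly the condition under which this line fit gives meaningful parameter estimates.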

Relevance:

100.00%

Publisher:

Abstract:

There are two principal chemical concepts that are important for studying the natural environment. The first is thermodynamics, which describes whether a system is at equilibrium or can change spontaneously through chemical reactions. The second is how fast chemical reactions take place once they start (kinetics, or the rate of chemical change). In this work we examine a natural system in which both thermodynamic and kinetic factors are important in determining the abundance of NH4+, NO2- and NO3- in surface waters. Samples were collected in the Arno Basin (Tuscany, Italy), a system in which both natural and anthropogenic effects contribute to strongly modifying the chemical composition of the water. Thermodynamic modelling based on the reduction-oxidation reactions involved in the sequence NH4+ -> NO2- -> NO3- under equilibrium conditions has made it possible to determine the Eh redox potential values that characterize the state of each sample and, consequently, of the fluid environment from which it was drawn. Just as pH expresses the concentration of H+ in solution, the redox potential expresses the tendency of an environment to receive or supply electrons. In this context, oxic environments, such as those of river systems, are said to have a high redox potential because O2 is available as an electron acceptor. Principles of thermodynamics and chemical kinetics yield a model that often does not completely describe the reality of natural systems. Chemical reactions may fail to achieve equilibrium because the products escape from the site of the reaction, or because the reactions involved in the transformation are very slow, so that non-equilibrium conditions persist for long periods. Moreover, reaction rates can be sensitive to poorly understood catalytic or surface effects, while variables such as concentration (a large number of chemical species can coexist and interact concurrently), temperature and pressure can have large gradients in natural systems.
Taking this into account, data from 91 water samples have been modelled using statistical methodologies for compositional data. The application of log-contrast analysis has yielded statistical parameters that can be correlated with the calculated Eh values. In this way, natural conditions under which chemical equilibrium is hypothesised, as well as underlying fast reactions, are compared with those described by a stochastic approach.
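At equilibrium, the Eh values discussed above follow from the Nernst equation; a minimal sketch for a generic redox couple (the standard potential, electron count and activity ratio are illustrative placeholders, not Arno Basin values):

```python
import math

R, F, T = 8.314, 96485.0, 298.15   # J/(mol K), C/mol, K

def nernst_eh(e0_volts, n_electrons, ratio_ox_red):
    """Eh = E0 + (RT/nF) * ln([oxidised]/[reduced]) for a redox couple."""
    return e0_volts + (R * T / (n_electrons * F)) * math.log(ratio_ox_red)

# Hypothetical couple: E0 = 0.88 V, 2 electrons transferred, activity ratio 10
print(round(nernst_eh(0.88, 2, 10.0), 4))
```

A high computed Eh corresponds to the oxic, electron-accepting river environments described in the abstract; a low one to reducing conditions where NH4+ persists.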

Relevance:

100.00%

Publisher:

Abstract:

The composition of the labour force is an important economic factor for a country, and the changes in the proportions of the different groups are often of interest. In this paper we study a monthly compositional time series from the Swedish Labour Force Survey from 1994 to 2005. Three models are studied: the ILR-transformed series, the ILR-transformation of the compositionally differenced series of order 1, and the ILR-transformation of the compositionally differenced series of order 12. For each of the three models a VAR model is fitted to the 1994-2003 data. We predict the time series 15 steps ahead and calculate 95% prediction regions. The predictions of the three models are compared with the actual values using MAD and MSE, and the prediction regions are compared graphically in a ternary time-series plot. We conclude that the first, and simplest, model possesses the best predictive power of the three.
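The ILR transformation used by all three models maps a D-part composition to D-1 unconstrained real coordinates, on which a standard VAR can be fitted. A minimal sketch of one standard choice of balances (the labour-force shares below are hypothetical):

```python
import math

def ilr(composition):
    """Isometric log-ratio (ILR) transform of a D-part composition,
    using one standard sequence of balances (D parts -> D-1 coordinates)."""
    d = len(composition)
    coords = []
    for i in range(1, d):
        gm = math.exp(sum(math.log(x) for x in composition[:i]) / i)
        coords.append(math.sqrt(i / (i + 1)) * math.log(gm / composition[i]))
    return coords

# Hypothetical monthly shares: employed, unemployed, not in the labour force
print([round(c, 4) for c in ilr([0.70, 0.05, 0.25])])
```

Because the coordinates are real-valued and isometric, differencing and VAR modelling can proceed as for any multivariate series, and predictions can be back-transformed to the simplex for the ternary plot.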

Relevance:

100.00%

Publisher:

Abstract:

The Dirichlet family owes its privileged status within simplex distributions to its ease of interpretation and good mathematical properties. In particular, we recall properties fundamental to the analysis of compositional data, such as closure under amalgamation and subcomposition. From a probabilistic point of view, it is uniquely characterised by a variety of independence relationships, which makes it indisputably the reference model for expressing the non-trivial idea of substantial independence for compositions. Indeed, its well-known inadequacy as a general model for compositional data stems from this independence structure together with the poverty of its parametrisation. In this paper a new class of distributions (called the Flexible Dirichlet), capable of handling various dependence structures and containing the Dirichlet as a special case, is presented. The new model exhibits a considerably richer parametrisation which, for example, allows the means and (part of) the variance-covariance matrix to be modelled separately. Moreover, the model preserves some of the good mathematical properties of the Dirichlet, i.e. closure under amalgamation and subcomposition, with the new parameters simply related to those of the parent composition. Furthermore, the joint and conditional distributions of subcompositions and relative totals can be expressed as simple mixtures of two Flexible Dirichlet distributions. The basis generating the Flexible Dirichlet, while retaining compositional invariance, shows a dependence structure which allows various forms of partitional dependence to be contemplated by the model (e.g. non-neutrality, subcompositional dependence and subcompositional non-invariance), with the independence cases identified by suitable parameter configurations. In particular, within this model substantial independence among subsets of components of the composition occurs naturally when the subsets have a Dirichlet distribution.
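The closure under amalgamation recalled above can be checked numerically for the ordinary Dirichlet: summing a subset of parts yields a Dirichlet (here Beta) marginal with summed parameters. A minimal sketch via the standard gamma construction (the parameter values are illustrative):

```python
import random

def dirichlet_sample(alphas, rng):
    """Sample from Dirichlet(alphas) by normalising Gamma variates."""
    gs = [rng.gammavariate(a, 1.0) for a in alphas]
    total = sum(gs)
    return [g / total for g in gs]

rng = random.Random(42)
alphas = [2.0, 3.0, 5.0]
samples = [dirichlet_sample(alphas, rng) for _ in range(20000)]
# Amalgamating parts 1 and 2 gives (x1 + x2, x3) ~ Beta(2 + 3, 5), mean 0.5
amalg_mean = sum(s[0] + s[1] for s in samples) / len(samples)
print(round(amalg_mean, 3))
```

The Flexible Dirichlet is stated above to preserve exactly this kind of closure, with its new parameters simply related to those of the parent composition.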

Relevance:

100.00%

Publisher:

Abstract:

The theory of compositional data analysis is often focused on the composition alone. In practical applications, however, we often treat a composition together with covariables on some other scale. This contribution systematically gathers and develops statistical tools for this situation. For instance, for the graphical display of the dependence of a composition on a categorical variable, a coloured set of ternary diagrams might be a good idea for a first look at the data, but it will quickly hide important aspects if the composition has many parts or takes extreme values. On the other hand, coloured scatterplots of ilr components may not be very instructive for the analyst if the conventional, black-box ilr is used. Thinking in terms of the Euclidean structure of the simplex, we suggest setting up appropriate projections which, on the one hand, show the compositional geometry and, on the other, remain comprehensible to a non-expert analyst and readable for all locations and scales of the data. This is done, for example, by defining special balance displays with carefully selected axes. Following this idea, we need to ask systematically how to display, explore, describe, and test the relation of a composition to complementary or explanatory data on categorical, real, ratio or again compositional scales. This contribution shows that a few basic concepts and very few advanced tools from multivariate statistics (principal covariances, multivariate linear models, trellis or parallel plots, etc.) suffice to build appropriate procedures for all these combinations of scales. This has some fundamental implications for their software implementation, and for how they might be taught to analysts who are not already experts in multivariate analysis.
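A balance display rests on coordinates that contrast two interpretable groups of parts rather than the black-box ilr sequence. A minimal sketch of one such balance coordinate (the composition and the grouping are hypothetical):

```python
import math

def balance(parts_num, parts_den):
    """Balance coordinate contrasting two groups of compositional parts:
    sqrt(r*s/(r+s)) * ln( g(numerator parts) / g(denominator parts) ),
    where g() is the geometric mean and r, s are the group sizes."""
    r, s = len(parts_num), len(parts_den)
    g_num = math.exp(sum(math.log(x) for x in parts_num) / r)
    g_den = math.exp(sum(math.log(x) for x in parts_den) / s)
    return math.sqrt(r * s / (r + s)) * math.log(g_num / g_den)

# Hypothetical 4-part composition split into two analyst-chosen groups
comp = [0.40, 0.30, 0.20, 0.10]
print(round(balance(comp[:2], comp[2:]), 4))
```

Because the grouping is chosen by the analyst, the resulting axis has a direct interpretation ("first pair versus second pair"), which is what makes such displays readable for non-experts.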

Relevance:

100.00%

Publisher:

Abstract:

It can be assumed that the composition of Mercury's thin gas envelope (exosphere) is related to the composition of the planet's crustal materials. If this relationship holds, then inferences regarding the bulk chemistry of the planet might be made from a thorough exospheric study. The most vexing of the unsolved problems is the uncertainty in the source of each component. Historically, it has been believed that H and He come primarily from the solar wind, while Na and K originate from volatilized materials partitioned between Mercury's crust and meteoritic impactors. The processes that eject atoms and molecules into the exosphere of Mercury are generally considered to be thermal vaporization, photon-stimulated desorption (PSD), impact vaporization, and ion sputtering. Each of these processes has its own temporal and spatial dependence. The exosphere is strongly influenced by Mercury's highly elliptical orbit and rapid orbital speed; as a consequence, the surface undergoes large fluctuations in temperature and experiences differences of insolation with longitude. We discuss these processes, but focus on the expected surface composition and on solar-wind particle sputtering, which releases material such as Ca and other elements from the surface minerals, and we discuss the relevance of composition modelling.

Relevance:

100.00%

Publisher:

Abstract:

The basis-set-superposition-error-free second-order Møller-Plesset perturbation theory of intermolecular interactions was studied. The difficulties of the counterpoise (CP) correction in open-shell systems were also discussed. The calculations were performed with a program used for testing the new variants of the theory. It was shown that the CP correction for the diabatic surfaces should be preferred to that for the adiabatic ones.
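The counterpoise correction evaluates each monomer in the full dimer basis; a minimal arithmetic sketch of the standard Boys-Bernardi scheme (the MP2 energies in hartree below are illustrative placeholders, not computed values):

```python
def cp_interaction_energy(e_dimer, e_a_dimer_basis, e_b_dimer_basis):
    """Boys-Bernardi counterpoise-corrected interaction energy:
    E_int = E_AB(AB basis) - E_A(AB basis) - E_B(AB basis)."""
    return e_dimer - e_a_dimer_basis - e_b_dimer_basis

# Hypothetical MP2 total energies (hartree)
e_ab, e_a, e_b = -152.4310, -76.2150, -76.2120
print(round(cp_interaction_energy(e_ab, e_a, e_b) * 627.509, 2))  # kcal/mol
```

Evaluating the monomers in the dimer basis removes the artificial stabilisation that each fragment gains by "borrowing" the partner's basis functions, which is the superposition error the abstract refers to.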

Relevance:

100.00%

Publisher:

Abstract:

Hydrodynamic processes determine, to a large degree, water quality in reservoirs; nevertheless, these processes have traditionally been neglected in reservoir management. This thesis presents evidence of the main hydrodynamic processes occurring in a Mediterranean reservoir at basin scale, obtained through field campaigns and numerical modelling, and of their influence on phytoplankton population dynamics. These processes are chiefly the generation of internal waves (seiches) and the river intrusion. Periodic wind forcing generates forced seiches, amplifying the modes close to the wind period, so that high vertical modes, considered rare in nature, tend to dominate the system.

Relevance:

100.00%

Publisher:

Abstract:

This work designs three DEA models for a production system whose components are arranged in series and vertically integrated forward. The first model seeks to optimize the profits of the aggregate system, as well as their improvement in each of the subsystems. The second model adds, besides the previous objective, transfer constraints on the specific resources associated with each subsystem, and the third model estimates the variation interval for the transfer prices of the intermediate inputs between the two subsystems. The models were programmed and simulated in the GAMS software using data generated by a Cobb-Douglas production function for the intermediate inputs and the final outputs.
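The data-generation step can be sketched as follows; a minimal Cobb-Douglas generator (the elasticities, input ranges and seed are illustrative assumptions, not those used with GAMS):

```python
import random

def cobb_douglas(inputs, elasticities, tfp=1.0):
    """Cobb-Douglas output: A * prod(x_i ** beta_i)."""
    out = tfp
    for x, b in zip(inputs, elasticities):
        out *= x ** b
    return out

rng = random.Random(7)
# Hypothetical two-input technology with constant returns (0.6 + 0.4 = 1)
for _ in range(3):
    x = [rng.uniform(1.0, 10.0), rng.uniform(1.0, 10.0)]
    print([round(v, 2) for v in x], round(cobb_douglas(x, [0.6, 0.4]), 2))
```

Draws like these would serve as the intermediate inputs and final outputs over which the three DEA programs are then solved.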

Relevance:

100.00%

Publisher:

Abstract:

This research relates the solvency margin that life insurers must accredit by regulation to the equity capital that any firm must hold in order to carry out its activity. We analyse the effect that the remuneration of that capital has on the marketing of insurance itself, showing its inverse relationship with the interest rate guaranteed in the contracts. We also analyse the effect that possible changes in interest rates may have on the remuneration that can be offered on that capital, and we propose an equation that incorporates all these aspects, verifying that the relationships currently offered in the academic literature are particular cases of the general equation, cases that we show to embed rather restrictive implicit assumptions.