49 results for Model-Data Integration and Data Assimilation
Abstract:
We introduce two coupled map lattice models with nonconservative interactions and a continuous nonlinear driving. Depending on both the degree of conservation and the convexity of the driving, we find different behaviors, ranging from self-organized criticality, in the sense that the distribution of events (avalanches) obeys a power law, to a macroscopic synchronization of the population of oscillators, with avalanches of the size of the system.
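The avalanche statistics described in this abstract can be illustrated with a minimal sketch. This is not the authors' coupled-map model; it assumes an Olami-Feder-Christensen-style driven lattice, where each toppling site passes a nonconservative fraction `alpha` of its force to its four neighbors (conservation would correspond to `alpha = 0.25`). All names and parameter values are illustrative.

```python
import numpy as np

def ofc_avalanches(L=20, alpha=0.2, n_events=2000, seed=0):
    """Nonconservative driven lattice (OFC-style) recording avalanche sizes.

    The lattice is driven uniformly until the most loaded site reaches the
    threshold 1.0; that site topples, transferring alpha * F to each of its
    four neighbors (force crossing an open boundary is lost), possibly
    triggering further topplings. One avalanche size is recorded per drive.
    """
    rng = np.random.default_rng(seed)
    F = rng.uniform(0.0, 1.0, size=(L, L))  # local force of each oscillator
    sizes = []
    for _ in range(n_events):
        F += 1.0 - F.max()          # uniform drive up to threshold
        size = 0
        unstable = list(zip(*np.where(F >= 1.0)))
        while unstable:
            i, j = unstable.pop()
            if F[i, j] < 1.0:       # may have been relaxed already
                continue
            f = F[i, j]
            F[i, j] = 0.0
            size += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < L and 0 <= nj < L:
                    F[ni, nj] += alpha * f
                    if F[ni, nj] >= 1.0:
                        unstable.append((ni, nj))
        sizes.append(size)
    return sizes
```

Plotting a histogram of `sizes` on log-log axes would show the heavy-tailed event distribution characteristic of the self-organized-critical regime.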
Abstract:
We consider a Potts model diluted by fully frustrated Ising spins. The model corresponds to a fully frustrated Potts model with variables having an integer absolute value and a sign. This model presents precursor phenomena of a glass transition in the high-temperature region. We show that the onset of these phenomena can be related to a thermodynamic transition. Furthermore, this transition can be mapped onto a percolation transition. We numerically study the phase diagram in two dimensions (2D) for this model with frustration and without disorder and we compare it to the phase diagram of (i) the model with frustration and disorder and (ii) the ferromagnetic model. Introducing a parameter that connects the three models, we generalize the exact expression of the ferromagnetic Potts transition temperature in 2D to the other cases. Finally, we estimate the dynamic critical exponents related to the Potts order parameter and to the energy.
Abstract:
The aim of this paper is to analyse how economic integration in Europe has affected industrial geographical concentration in Spain and to explain the driving forces behind industry location. Firstly, we construct regional specialisation and geographical concentration indices for 50 Spanish provinces and 30 industrial sectors in 1979, 1986 and 1992. Secondly, we carry out an econometric analysis of the determinants of the geographical concentration of industries. Our main conclusion is that there is no evidence of increasing specialisation in Spain between 1979 and 1992 and that the most important determinant of Spain's economic geography is scale economies. Furthermore, traditional trade theory has little power in explaining the pattern of industrial concentration.
Abstract:
This paper presents an eclectic model that systematizes the dynamics of self-fulfilling crises, drawing on the main features of the three types of third-generation currency crisis models, in order to describe the events that precipitate the abandonment of a fixed parity. The most notable contributions are the implications for economic policy, as well as the loss of the exchange rate's role as an instrument of macroeconomic adjustment when balance-sheet effects are a real possibility.
Abstract:
With the beginning of the European Monetary Union (EMU), euro-area sovereign securities' adjusted spreads over Germany (corrected for foreign exchange risk) experienced an increase that caused a lower than expected decline in borrowing costs. The objective of this paper is to study what explains that rise; in particular, whether there was a change in the price assigned by markets to domestic risk factors (credit risk and/or market liquidity) or to international ones. The empirical evidence supports the idea that a change in the market value of liquidity occurred with the EMU, while international and default risk play a smaller role.
Abstract:
We present a new phenomenological approach to nucleation, based on the combination of the extended modified liquid drop model and dynamical nucleation theory. The new model proposes a new cluster definition, which properly includes the effect of fluctuations and is consistent both thermodynamically and kinetically. The model successfully predicts the free energy of formation of the critical nucleus, using only macroscopic thermodynamic properties. It also accounts for the spinodal and provides excellent agreement with the results of recent simulations.
Abstract:
Since the mid-1990s researchers have sought to understand why some firms embark on e-commerce operations on the Internet while others prefer to wait and see how events unfold. We still have to determine which variables contribute to explaining the extent to which firms use e-commerce, given that to date the literature has not yet offered conclusive evidence on this question. The current work aims to provide an integrated vision of the set of factors influencing the e-commerce adoption process. We use a sample of 2,038 firms of all types that trade their products either with other organizations or with end-consumers.
Abstract:
In this article we present a qualitative study conducted with six indigenous and six mestizo students from the Intercultural University of Chiapas. The aim of the study is to examine the mutual perception between different ethno-linguistic groups, as well as the possible change occurring after admission to the University; that is, opinions about the other group before and after entering the University. We conclude that an intercultural model of higher education can promote mutual understanding and relationships between indigenous and mestizo students and thus combat prejudices and stereotypes.
Abstract:
In this work we discuss the use of the standard model for the calculation of the solvency capital requirement (SCR) when the company aims to use model parameters specific to the experience of its own portfolio. In particular, this analysis focuses on the formula presented in the latest quantitative impact study (CEIOPS, 2010) for non-life underwriting premium and reserve risk. One of the keys of the standard model for premium and reserve risk is the correlation matrix between lines of business. We present how this correlation matrix could be estimated from a quantitative perspective, as well as the possibility of using a credibility model that merges the qualitative and quantitative perspectives for its estimation.
Abstract:
Experimental and theoretical investigations of the growth of silicon nanoparticles (4 to 14 nm) in a radio-frequency discharge were carried out. Growth processes were performed with gas mixtures of SiH4 and Ar in a plasma chemical reactor at low pressure. A distinctive feature of the presented kinetic model of nanoparticle generation and growth (compared to our earlier model) is its ability to investigate small "critical" cluster dimensions, which determine the rate of particle production, and to take into account the influence of SiH2 and Si2Hm dimer radicals. The experiments in the present study were extended to high pressure (≥20 Pa) and discharge power (≥40 W). Model calculations were compared to experimental measurements, investigating the dimension of the silicon nanoparticles as a function of time, discharge power, gas mixture, total pressure, and gas flow.
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying fixed or dynamic sets of rules to determine trading orders. It has grown to account for up to 70% of the trading volume in some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signal they generate passes the test of being a Markov time; that is, we can tell whether the signal has occurred or not by examining the information up to the current time, or, more technically, whether the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting SDE and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor yet lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration of the strategies' parameters to adapt to the given market conditions. We find that the parameters of the technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at a high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
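The mean-reversion framework this abstract refers to can be sketched in a few lines. The following is a minimal illustration (in Python, not the MATLAB implementation of the thesis): an Euler-Maruyama simulation of the Ornstein-Uhlenbeck SDE dX_t = theta (mu - X_t) dt + sigma dW_t, plus a toy threshold rule whose position at time t depends only on information up to t, i.e. an F_t-measurable signal. All parameter values are illustrative.

```python
import numpy as np

def simulate_ou(theta=2.0, mu=0.0, sigma=0.3, x0=0.5, dt=1/252, n=252, seed=0):
    """Euler-Maruyama discretization of the Ornstein-Uhlenbeck SDE,
    a standard model for a mean-reverting pairs-trading spread."""
    rng = np.random.default_rng(seed)
    x = np.empty(n + 1)
    x[0] = x0
    for t in range(n):
        dw = np.sqrt(dt) * rng.standard_normal()
        x[t + 1] = x[t] + theta * (mu - x[t]) * dt + sigma * dw
    return x

def pairs_signals(spread, mu=0.0, band=0.25):
    """Toy market-neutral rule: short the spread above mu + band, long
    below mu - band, flat otherwise. The position at each time uses only
    the spread observed at that time, so the rule is F_t-measurable."""
    pos = np.zeros_like(spread)
    pos[spread > mu + band] = -1.0   # spread rich: sell it
    pos[spread < mu - band] = 1.0    # spread cheap: buy it
    return pos
```

A backtest along the lines of the thesis would then multiply lagged positions by spread changes to obtain a P&L series, recalibrating `theta`, `mu`, `sigma` and `band` on a rolling window.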
Abstract:
Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of the demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability of the models. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds an additional layer of indeterminacy. To counter this, we first propose an alternate finite-population model that avoids this problem of fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating the market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial-logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of the population size and the model parameters.
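The classical difficulty the abstract points to, fitting Binomial(N, p) with both the population size N and the success probability p unknown, can be seen in a small sketch. This is the textbook method-of-moments estimator, not the heuristic proposed in the paper: matching the sample mean m = N p and variance v = N p (1 - p) gives p_hat = 1 - v/m and N_hat = m / p_hat.

```python
import numpy as np

def binomial_nm_estimates(samples):
    """Method-of-moments estimates (N_hat, p_hat) for Binomial(N, p)
    with both parameters unknown. Returns None when the sample variance
    is at least the sample mean, in which case no binomial fit exists.
    The estimator degrades badly as p shrinks (v/m -> 1), illustrating
    why joint estimation of market size and purchase probability is hard."""
    x = np.asarray(samples, dtype=float)
    m, v = x.mean(), x.var()
    if v >= m:
        return None
    p_hat = 1.0 - v / m
    return m / p_hat, p_hat
```

With ample data and a moderate p the estimator recovers (N, p) reasonably well; with small p or few samples, small fluctuations in v/m translate into wild swings in N_hat, which is the instability the abstract alludes to.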
Abstract:
The aim of this study is to analyze how European integration and, especially, changes in ownership have affected banking efficiency in the Central and Eastern European countries which have recently experienced this process most intensely. Using a stochastic frontier approach (SFA) applied to panel data, we estimate bank efficiency levels for a sample of 189 banks from 12 countries over the period 2000 to 2008 and analyze the influence of certain bank characteristics on these efficiency levels. The results show that European integration has significantly improved the cost efficiency of banks in these countries, but profit efficiency has significantly decreased. We find very small differences between ownership types and only a very small impact of foreign ownership on cost efficiency, showing that the entry of foreign ownership is not enough to explain the significant variations in banking efficiency after accession.
Abstract:
Expectations are central to behaviour. Despite the existence of subjective expectations data, the standard approach is to ignore these, to posit a model of behaviour and to infer expectations from realisations. In the context of income models, we reveal the informational gain obtained from using both a canonical model and subjective expectations data. We propose a test for this informational gain and illustrate our approach with an application to the problem of measuring income risk.