69 results for Lambda calculus
Abstract:
We show that the Heston volatility, or equivalently the Cox-Ingersoll-Ross process, is Malliavin differentiable and give an explicit expression for the derivative. This result guarantees the applicability of Malliavin calculus in the framework of the Heston stochastic volatility model and the Cox-Ingersoll-Ross model for interest rates.
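For reference, the process in question is usually written as follows; the parametrization below is the standard one and is assumed here, since the abstract itself fixes no notation:

\[ dv_t = \kappa(\theta - v_t)\,dt + \sigma\sqrt{v_t}\,dW_t, \qquad v_0 > 0, \]

which is the CIR dynamics; the Heston model couples it to an asset price via

\[ dS_t = \mu S_t\,dt + \sqrt{v_t}\,S_t\,dB_t, \qquad d\langle W, B \rangle_t = \rho\,dt. \]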
Abstract:
We present a rule-based, Huet-style anti-unification algorithm for simply-typed lambda-terms in η-long β-normal form, which computes a least general higher-order pattern generalization. For a pair of arbitrary terms of the same type, such a generalization always exists and is unique modulo α-equivalence and variable renaming. The algorithm computes it in cubic time within linear space. It has been implemented and the code is freely available.
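To illustrate the underlying idea, here is a minimal first-order anti-unification sketch in Python; it computes Plotkin-style least general generalizations of ground terms and makes no attempt at the simply-typed, higher-order pattern case the algorithm above actually handles:

    from itertools import count

    def anti_unify(s, t, subst=None, fresh=None):
        """Least general generalization of two first-order terms.

        Terms are nested tuples ('f', arg1, ..., argn) or atom strings.
        Each distinct disagreement pair gets one fresh variable; repeated
        pairs reuse the same variable, keeping the result least general.
        """
        if subst is None:
            subst, fresh = {}, count()
        # Same head symbol and arity: generalize argument by argument.
        if (isinstance(s, tuple) and isinstance(t, tuple)
                and s[0] == t[0] and len(s) == len(t)):
            return (s[0],) + tuple(anti_unify(a, b, subst, fresh)
                                   for a, b in zip(s[1:], t[1:]))
        if s == t:
            return s
        # Disagreement: map the pair (s, t) to a single fresh variable.
        if (s, t) not in subst:
            subst[(s, t)] = f"X{next(fresh)}"
        return subst[(s, t)]

    # f(g(a), a) vs f(g(b), b) generalizes to f(g(X0), X0).
    print(anti_unify(('f', ('g', 'a'), 'a'), ('f', ('g', 'b'), 'b')))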
Abstract:
Floods are the natural hazard that produces the highest number of casualties and the most material damage in the Western Mediterranean. An improvement in flood risk assessment and a study of a possible increase in flooding occurrence are therefore needed. To carry out these tasks it is important to have extensive knowledge of historical floods at our disposal and to find an efficient way to manage this geographical data. In this paper we present a complete flood database spanning the 20th century for the whole of Catalonia (NE Spain), which includes documentary information (affected areas and damage) and instrumental information (meteorological and hydrological records). This geodatabase, named Inungama, has been implemented on a GIS (Geographical Information System) in order to display all the information within a given geographical scenario, as well as to analyze it using queries, overlays and calculations. Following a description of the type and amount of information stored in the database and the structure of the information system, the first applications of Inungama are presented. The geographical distribution of floods shows the localities that are most likely to be flooded, confirming that the most affected municipalities are the most densely populated ones in coastal areas. Regarding a possible increase in flooding occurrence, a temporal analysis has been carried out, showing a steady increase over the last 30 years.
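The kind of query such a geodatabase supports can be sketched as below; the table and column names are hypothetical stand-ins, not Inungama's actual schema:

    import sqlite3

    # Hypothetical schema and sample rows, for illustration only.
    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE flood_event
                   (municipality TEXT, year INTEGER, damage_class TEXT)""")
    con.executemany("INSERT INTO flood_event VALUES (?, ?, ?)", [
        ("Town A", 1962, "catastrophic"),
        ("Town B", 1940, "extraordinary"),
        ("Town A", 1995, "ordinary"),
    ])
    # Count recorded events per municipality across the century.
    for row in con.execute("""SELECT municipality, COUNT(*) AS n
                              FROM flood_event
                              GROUP BY municipality ORDER BY n DESC"""):
        print(row)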
Abstract:
We study spacetime diffeomorphisms in the Hamiltonian and Lagrangian formalisms of generally covariant systems. We show that the gauge group for such a system is characterized by having generators which are projectable under the Legendre map. The gauge group is found to be much larger than the original group of spacetime diffeomorphisms, since its generators must depend on the lapse function and shift vector of the spacetime metric in a given coordinate patch. Our results generalize earlier results by Salisbury and Sundermeyer. They arise in a natural way from the requirement of equivalence between the Lagrangian and Hamiltonian formulations of the system, and they are new in that the symmetries are realized on the full set of phase space variables. The generators are displayed explicitly and are applied to the relativistic string and to general relativity.
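In standard ADM notation the dependence on lapse and shift can be made explicit; the decomposition below is the conventional one from this line of work and is only a sketch of notation the abstract leaves implicit:

\[ n^\mu = \left( N^{-1},\, -N^a N^{-1} \right) \]

is the unit normal to the constant-time hypersurfaces, and projectable generators arise from diffeomorphism descriptors decomposed along normal and tangential directions,

\[ \epsilon^\mu = n^\mu \xi^0 + \delta^\mu_a \xi^a, \]

so that the metric-independent parameters are \( \xi^0, \xi^a \) while \( \epsilon^\mu \) itself necessarily involves \( N \) and \( N^a \).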
Abstract:
Several NdFeB compositionally modulated thin films are studied using both conversion electron Mössbauer spectra and SQUID (superconducting quantum-interference-device) magnetometry. Neither the hyperfine fields nor the easy-magnetization magnitude is correlated with the characteristic modulation length (lambda), while the magnetization perpendicular to the thin-film plane decreases as lambda increases. The spectra were recorded at room temperature with the gamma rays perpendicular to the substrate plane. The magnetization measurements were made with a SHE SQUID magnetometer in applied magnetic fields up to 5.5 T and in the temperature range between 1.8 and 30 K.
Abstract:
We derive the chaotic expansion of the product of nth- and first-order multiple stochastic integrals with respect to certain normal martingales. This is done by application of the classical and quantum product formulae for multiple stochastic integrals. Our approach extends existing results on chaotic calculus for normal martingales and exhibits properties, relative to multiple stochastic integrals, polynomials and Wick products, that characterize the Wiener and Poisson processes.
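In the classical Wiener case the product formula invoked here reduces to a familiar identity; the version below is the standard Wiener-Itô one (for general normal martingales extra terms appear, which is precisely what distinguishes the Wiener and Poisson cases):

\[ I_n(f)\, I_1(g) = I_{n+1}(f \,\tilde{\otimes}\, g) + n\, I_{n-1}(f \otimes_1 g), \]

where \( f \) is symmetric in \( n \) variables, \( \tilde{\otimes} \) denotes the symmetrized tensor product, and \( f \otimes_1 g \) contracts one argument of \( f \) against \( g \).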
Abstract:
Effect size indices are indispensable for carrying out meta-analyses and can also be seen as an alternative for making decisions about the effectiveness of a treatment in an individual applied study. The desirable features of procedures for quantifying the magnitude of an intervention effect include educational/clinical meaningfulness, ease of calculation, insensitivity to autocorrelation, and low false-alarm and miss rates. Three effect size indices related to visual analysis are compared according to these criteria. The comparison is made by means of data sets with known parameters: degree of serial dependence, presence or absence of general trend, and changes in level and/or in slope. The percent of nonoverlapping data showed the highest discrimination between data sets with and without an intervention effect. When autocorrelation or trend is present, the percentage of data points exceeding the median may be a better option for quantifying the effectiveness of a psychological treatment.
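Both of the indices named above are simple to compute. A minimal sketch, assuming the intervention is expected to increase the behavior (for expected decreases the comparisons flip):

    import statistics

    def pnd(baseline, treatment):
        """Percent of nonoverlapping data: share of treatment-phase points
        exceeding the highest baseline point (expected-increase case)."""
        ceiling = max(baseline)
        return 100.0 * sum(y > ceiling for y in treatment) / len(treatment)

    def pem(baseline, treatment):
        """Percentage of treatment-phase points exceeding the baseline median."""
        med = statistics.median(baseline)
        return 100.0 * sum(y > med for y in treatment) / len(treatment)

    baseline = [3, 4, 3, 5, 4]
    treatment = [6, 7, 5, 8, 7, 9]
    print(pnd(baseline, treatment))  # 83.33...: one treatment point overlaps
    print(pem(baseline, treatment))  # 100.0: all points exceed the median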
Abstract:
The present study proposes a modification of one of the most frequently applied effect size procedures in single-case data analysis, the percent of nonoverlapping data. In contrast to other techniques, the calculation and interpretation of this procedure are straightforward, and it can easily be complemented by visual inspection of the graphed data. Although the percent of nonoverlapping data has been found to perform reasonably well on N = 1 data, the magnitude-of-effect estimates it yields can be distorted by trend and autocorrelation. The data correction procedure therefore focuses on removing the baseline trend from the data prior to estimating the change in behavior produced by the intervention. A simulation study is carried out in order to compare the original and modified procedures under several experimental conditions. The results suggest that the new proposal is unaffected by trend and autocorrelation and can be used in cases of unstable baselines and sequentially related measurements.
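The general shape of such a correction can be sketched as follows: fit an ordinary least-squares trend to the baseline phase, subtract its extrapolation from both phases, and compute nonoverlap on the residuals. This is only the idea in outline, not the authors' exact estimator:

    import numpy as np

    def detrended_pnd(baseline, treatment):
        """Remove the baseline's linear trend from both phases, then
        compute the percent of nonoverlapping data on the residuals."""
        baseline = np.asarray(baseline, dtype=float)
        treatment = np.asarray(treatment, dtype=float)
        t0 = np.arange(len(baseline))
        slope, intercept = np.polyfit(t0, baseline, 1)
        t1 = np.arange(len(baseline), len(baseline) + len(treatment))
        resid_base = baseline - (slope * t0 + intercept)
        resid_treat = treatment - (slope * t1 + intercept)
        return 100.0 * np.mean(resid_treat > resid_base.max())

    # A rising baseline that simply continues into the treatment phase now
    # scores 0 instead of spuriously high.
    print(detrended_pnd([1, 2, 3, 4], [5, 6, 7, 8]))  # 0.0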
Abstract:
We evaluate the performance of different optimization techniques developed in the context of optical flow computation with different variational models. In particular, building on truncated Newton methods (TN), which have been an effective approach for large-scale unconstrained optimization, we develop the use of efficient multilevel schemes for computing the optical flow. More precisely, we compare the performance of a standard unidirectional multilevel algorithm, called multiresolution optimization (MR/OPT), with a bidirectional multilevel algorithm, called full multigrid optimization (FMG/OPT). The FMG/OPT algorithm treats the coarse grid correction as an optimization search direction and scales it using a line search. Experimental results on different image sequences using four models of optical flow computation show that the FMG/OPT algorithm outperforms both the TN and MR/OPT algorithms in terms of computational work and the quality of the optical flow estimation.
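The key FMG/OPT ingredient, a prolonged coarse-grid correction used as a search direction and scaled by a line search, can be sketched on a toy one-dimensional least-squares-plus-smoothness energy; this is a schematic of the mechanism under simplified assumptions, not the paper's optical-flow implementation:

    import numpy as np

    def objective(u, d):
        """Toy data-fit plus smoothness energy (stand-in for a variational model)."""
        return 0.5 * np.sum((u - d) ** 2) + 0.5 * np.sum(np.diff(u) ** 2)

    def gradient(u, d):
        g = u - d
        g[:-1] -= np.diff(u)   # -(u[i+1] - u[i]) from the smoothness term
        g[1:] += np.diff(u)    # +(u[i] - u[i-1]) from the smoothness term
        return g

    def restrict(v):
        return 0.5 * (v[0::2] + v[1::2])   # average adjacent pairs

    def prolong(v):
        return np.repeat(v, 2)             # piecewise-constant interpolation

    def line_search(u, d, p, step=1.0):
        """Backtracking: halve the step until the objective decreases."""
        f0 = objective(u, d)
        while objective(u + step * p, d) >= f0 and step > 1e-8:
            step *= 0.5
        return step

    def fmg_opt_step(u, d, inner_iters=20, lr=0.1):
        # Approximate solve of the restricted (coarse-grid) problem.
        uc, dc = restrict(u), restrict(d)
        for _ in range(inner_iters):
            uc = uc - lr * gradient(uc, dc)
        # Prolonged coarse correction as a search direction, scaled by a
        # line search -- the FMG/OPT mechanism described above.
        p = prolong(uc - restrict(u))
        return u + line_search(u, d, p) * p

    rng = np.random.default_rng(0)
    d = np.sin(np.linspace(0.0, 3.0, 64)) + 0.1 * rng.standard_normal(64)
    u = np.zeros(64)
    for _ in range(5):
        u = fmg_opt_step(u, d)
    print(round(objective(u, d), 3))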
Abstract:
Let $E_{\lambda}(z) = \lambda \exp(z)$, $\lambda \in \mathbb{C}$, be the complex exponential family. Every function in the family has a unique asymptotic value, 0 (and no critical values). For a fixed $\lambda$, the set of points in $\mathbb{C}$ whose orbit tends to infinity is called the escaping set. We prove that the escaping set of $E_{\lambda}$ with $\lambda$ Misiurewicz (that is, a parameter for which the orbit of the singular value is strictly preperiodic) is a connected set.
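Numerically probing the escaping set is straightforward; the sketch below iterates $E_{\lambda}$ and uses a large positive real part as a proxy for an orbit tending to infinity. The particular $\lambda$ is an arbitrary test value, not claimed to be a Misiurewicz parameter:

    import cmath

    def escapes(z, lam, iters=60, bound=50.0):
        """Iterate E_lambda(z) = lam * exp(z); a large positive real part
        (where exp would overflow) is used as a proxy for escape."""
        for _ in range(iters):
            if z.real > bound:
                return True
            z = lam * cmath.exp(z)
        return False

    lam = 2.0   # arbitrary test parameter, not a Misiurewicz one
    samples = [complex(x, 0.1) for x in range(-3, 4)]
    print([escapes(z, lam) for z in samples])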
Abstract:
We study the marginal worth vectors and their convex hull, the so-called Weber set, for the original coalitional game and for the transformed one, whose Weber set we call the Weber set of level k. We prove that the core of the original game is included in each Weber set of level k, for any k, and that the Weber sets of consecutive levels form a chain if and only if the original game is 0-monotone. Even if the game is not 0-monotone, the intersection of the Weber sets of consecutive levels is always nonempty, which is not the case for non-consecutive ones.
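As a reminder of the basic object: a marginal worth vector assigns each player their marginal contribution along one ordering of the players, and the Weber set is the convex hull of all such vectors. A small self-contained sketch:

    from itertools import permutations

    def marginal_worth_vectors(players, v):
        """One marginal worth vector per ordering of the players; the
        Weber set is the convex hull of the vectors returned here."""
        vectors = []
        for order in permutations(players):
            coalition, payoff = frozenset(), {}
            for p in order:
                payoff[p] = v(coalition | {p}) - v(coalition)
                coalition = coalition | {p}
            vectors.append(tuple(payoff[p] for p in players))
        return vectors

    # Three-player example: a coalition is worth 1 once it has 2 members.
    players = (1, 2, 3)
    v = lambda S: 1.0 if len(S) >= 2 else 0.0
    for vec in sorted(set(marginal_worth_vectors(players, v))):
        print(vec)   # (0,0,1), (0,1,0), (1,0,0)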
Abstract:
The final-year project came to us as an opportunity to get involved in a topic that had proved attractive during our economics degree: statistics and its application to the analysis of economic data, i.e. econometrics. Moreover, the combination of econometrics and computer science is a very hot topic nowadays, given the Information Technologies boom of recent decades and the consequent exponential increase in the amount of data collected and stored day by day. Data analysts able to deal with Big Data and extract useful results from it are in great demand these days and, in our understanding, the work they do, although sometimes controversial in terms of ethics, is a clear source of added value for both private corporations and the public sector. For these reasons, the essence of this project is the study of a statistical instrument, valid for the analysis of large datasets, that is directly related to computer science: Partial Correlation Networks.

The structure of the project has been determined by our objectives as it developed. First, the characteristics of the instrument under study are explained, from the basic ideas up to the features of the model behind it, with the final goal of presenting the SPACE model as a tool for estimating interconnections between elements in large data sets. Afterwards, an illustrated simulation is performed in order to show the power and efficiency of the model presented. Finally, the model is put into practice by analyzing a relatively large set of real-world data, with the objective of assessing whether the proposed statistical instrument is valid and useful when applied to a real multivariate time series. In short, our main goals are to present the model and to evaluate whether Partial Correlation Network Analysis is an effective, useful instrument that allows valuable results to be found in Big Data.

The findings throughout this project suggest that the Partial Correlation Estimation by Joint Sparse Regression Models approach presented by Peng et al. (2009) works well under the assumption of sparsity of the data. Moreover, partial correlation networks are shown to be a valid tool for representing cross-sectional interconnections between elements in large data sets.

The scope of this project is nevertheless limited, as there are some sections where deeper analysis would have been appropriate. Considering intertemporal connections between elements, the choice of the tuning parameter lambda, and a deeper analysis of the results in the real-data application are examples of aspects in which this project could be extended.

To sum up, the statistical tool analyzed has proved to be a very useful instrument for finding relationships that connect the elements of a large data set. Partial correlation networks allow the owner of such a set to observe and analyze linkages that would otherwise have been overlooked.
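The estimation step can be sketched compactly with the graphical lasso, a close relative of the SPACE joint-sparse-regression estimator used in the project (not the same method); partial correlations are read off the estimated precision matrix Omega via rho_ij = -Omega_ij / sqrt(Omega_ii * Omega_jj). The data below are simulated stand-ins:

    import numpy as np
    from sklearn.covariance import GraphicalLassoCV

    # Simulated data standing in for the project's real multivariate series.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 8))
    X[:, 1] += 0.8 * X[:, 0]   # plant one genuine linkage
    X[:, 4] += 0.6 * X[:, 3]   # and another

    model = GraphicalLassoCV().fit(X)   # cross-validates the sparsity penalty
    omega = model.precision_

    # Partial correlations from the precision matrix.
    d = np.sqrt(np.diag(omega))
    pcorr = -omega / np.outer(d, d)
    np.fill_diagonal(pcorr, 1.0)

    # Report the network's edges: pairs with non-negligible partial correlation.
    edges = [(i, j, round(pcorr[i, j], 2))
             for i in range(8) for j in range(i + 1, 8)
             if abs(pcorr[i, j]) > 0.05]
    print(edges)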