925 results for Symbolic computation and algebraic computation


Relevance: 40.00%

Abstract:

This paper analyzes the measure of systemic importance ΔCoVaR proposed by Adrian and Brunnermeier (2009, 2010) within the context of a similar class of risk measures used in the risk management literature. In addition, we develop a series of testing procedures, based on ΔCoVaR, to identify and rank the systemically important institutions. We stress the importance of statistical testing in interpreting the measure of systemic importance. An empirical application illustrates the testing procedures, using equity data for three European banks.
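
As an illustration of how such a ΔCoVaR estimate is typically obtained, the following is a minimal sketch using quantile regression on synthetic return series; the variable names, the 5% quantile level and the data are illustrative assumptions, not the paper's actual procedure or data.

```python
# Minimal sketch of a quantile-regression DeltaCoVaR estimate on synthetic data.
# The 5% level, the data-generating process and all names are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
x_inst = rng.standard_normal(n)                   # institution's return
x_sys = 0.4 * x_inst + rng.standard_normal(n)     # system return

q = 0.05
X = sm.add_constant(x_inst)
alpha, beta = sm.QuantReg(x_sys, X).fit(q=q).params   # system q-quantile as a function of x_inst

var_q = np.quantile(x_inst, q)        # institution's VaR at level q
median = np.quantile(x_inst, 0.5)     # institution's benchmark (median) state

# CoVaR: conditional system quantile when the institution is at its VaR vs. at its median.
delta_covar = (alpha + beta * var_q) - (alpha + beta * median)
print(f"DeltaCoVaR estimate: {delta_covar:.3f}")
```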

Relevance: 40.00%

Abstract:

We examine the empirical validity of Schelling's models of racial residential segregation in the case of Chicago. Most of the empirical literature has focused exclusively on the single-neighborhood model, also known as the tipping-point model, and has neglected a multi-neighborhood or unified approach. The multi-neighborhood approach introduces spatial interaction across neighborhoods; in particular, we consider interaction across neighborhoods that share a border. An initial exploration of the data indicates that spatial contiguity may be relevant to properly analysing the so-called tipping phenomenon, in which predominantly non-Hispanic white neighborhoods become predominantly minority neighborhoods within a decade. We introduce an econometric model that combines a threshold-effects approach to estimating the tipping point with a spatial autoregressive model; see the sketch below. The estimation results dispute the existence of a tipping point, that is, a discontinuous change in the growth rate of the non-Hispanic white population caused by a small increase in the minority share of the neighborhood. In addition, we find that the racial distance between a neighborhood and its surrounding neighborhoods has an important effect on the dynamics of racial segregation in Chicago.
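
To make the threshold-effects part concrete, here is a minimal sketch of a grid search for a candidate tipping point on synthetic neighbourhood data; the variable names and the data-generating process are illustrative, and the spatial autoregressive term used in the paper's full specification is omitted.

```python
# Minimal sketch of a threshold ("tipping point") search on synthetic neighbourhood data.
# The full model in the paper also includes a spatial autoregressive term over bordering
# neighbourhoods, which is omitted here; all values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 500
minority_share = rng.uniform(0, 1, n)              # minority share at the start of the decade
true_tau = 0.35
# Synthetic decade growth of the non-Hispanic white share, with a discontinuity at true_tau.
white_growth = 0.05 - 0.25 * (minority_share > true_tau) + 0.1 * rng.standard_normal(n)

def ssr(tau):
    """Sum of squared residuals of an OLS fit with separate intercepts below/above tau."""
    X = np.column_stack([minority_share <= tau, minority_share > tau]).astype(float)
    _, resid, *_ = np.linalg.lstsq(X, white_growth, rcond=None)
    return resid[0] if resid.size else np.inf

candidates = np.linspace(0.05, 0.95, 181)
tau_hat = min(candidates, key=ssr)                  # threshold minimising the SSR
print(f"estimated tipping point: {tau_hat:.2f}")
```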

Relevance: 40.00%

Abstract:

The general expansion of operators as a linear combination of projectors is defined, and its generalized application to the calculation of molecular integrals is presented. As a numerical example, the method is applied to the calculation of electron-repulsion integrals between four s-type functions centred at different points; both the results of the calculation and the definition of a scaling with respect to a reference value are shown, which will facilitate the optimization of the expansion for arbitrary parameters. Results fitted to the exact value are given.
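
Since the expansion is scaled against an exact reference value, a natural reference for this numerical example is the standard closed-form electron-repulsion integral over four s-type Gaussian primitives. The sketch below computes that reference; the exponents and centres are arbitrary illustrative values, not those used in the article.

```python
# Sketch of the standard closed-form electron-repulsion integral (ab|cd) over four
# unnormalized s-type Gaussian primitives exp(-a|r-A|^2), usable as the exact reference
# value against which an approximate expansion can be scaled. Exponents and centres
# below are arbitrary illustrative values.
import numpy as np
from scipy.special import erf

def boys_f0(t):
    """Boys function F0(t) = 0.5*sqrt(pi/t)*erf(sqrt(t)), with the t -> 0 limit F0(0) = 1."""
    if t < 1e-12:
        return 1.0 - t / 3.0
    return 0.5 * np.sqrt(np.pi / t) * erf(np.sqrt(t))

def eri_ssss(a, A, b, B, c, C, d, D):
    """(ab|cd) for four s-type primitives with exponents a..d centred at A..D."""
    A, B, C, D = (np.asarray(v, dtype=float) for v in (A, B, C, D))
    p, q = a + b, c + d
    P, Q = (a * A + b * B) / p, (c * C + d * D) / q
    K = np.exp(-a * b / p * np.sum((A - B) ** 2) - c * d / q * np.sum((C - D) ** 2))
    t = p * q / (p + q) * np.sum((P - Q) ** 2)
    return 2 * np.pi ** 2.5 / (p * q * np.sqrt(p + q)) * K * boys_f0(t)

val = eri_ssss(0.5, [0, 0, 0], 0.8, [0, 0, 1.4], 1.0, [1.0, 0, 0], 1.2, [1.0, 0, 1.4])
print(f"reference (ss|ss) integral: {val:.6e}")
```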

Relevance: 40.00%

Abstract:

This thesis proposes a methodology for the probabilistic simulation of matrix failure in carbon-fibre-reinforced composite materials, based on the analysis of the random distribution of the fibres. The first chapters review the state of the art on the mathematical modelling of random materials, the calculation of effective properties and transverse failure criteria for composite materials. The first step of the proposed methodology is the determination of the minimum size of a Statistical Representative Volume Element (SRVE). This determination is carried out by analysing the fibre volume fraction, the effective elastic properties, the Hill condition, the statistics of the stress and strain components, the probability density function and the statistical inter-fibre distance functions of microstructure models of different sizes. Once this minimum size has been determined, a periodic model and a random model are compared in order to assess the magnitude of the differences observed between them. A methodology is also defined for the statistical analysis of the fibre distribution in the composite from digital images of the transverse cross-section; this analysis is applied to four different materials. Finally, a two-scale computational method is proposed for simulating the transverse failure of unidirectional plies, which yields probability density functions for the mechanical variables. Some applications and possibilities of this method are described, and the simulation results are compared with experimental values.
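
As a rough illustration of the SRVE-size study, the sketch below generates random non-overlapping fibre sections by sequential placement and tracks how the scatter of the fibre volume fraction decreases with the cell edge length. The radius, target fraction and cell sizes are illustrative; realistic fibre fractions require more elaborate generation algorithms, and the thesis examines several additional statistical descriptors (effective elastic properties, the Hill condition, stress and strain statistics, inter-fibre distances).

```python
# Minimal sketch of one SRVE-size check: place non-overlapping circular fibre sections
# at random (sequential placement) and watch the scatter of the achieved fibre volume
# fraction shrink as the cell edge grows. All values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
r = 1.0              # fibre radius (arbitrary units)
target_vf = 0.35     # deliberately low target volume (area) fraction for plain sequential placement

def random_cell(edge, max_tries=50000):
    """Fill an edge x edge cell with non-overlapping discs; return the achieved fraction."""
    centres, placed = [], 0.0
    for _ in range(max_tries):
        if placed >= target_vf * edge ** 2:
            break
        c = rng.uniform(r, edge - r, size=2)
        if all(np.sum((c - o) ** 2) >= (2 * r) ** 2 for o in centres):
            centres.append(c)
            placed += np.pi * r ** 2
    return placed / edge ** 2

for edge in (8 * r, 16 * r, 32 * r):
    vf = [random_cell(edge) for _ in range(5)]
    print(f"edge = {edge:>4.0f}  mean Vf = {np.mean(vf):.3f}  std = {np.std(vf):.4f}")
```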

Relevance: 40.00%

Abstract:

The characteristics of service independence and flexibility of ATM networks make the control problems of such networks very critical. One of the main challenges in ATM networks is to design traffic control mechanisms that enable both economically efficient use of the network resources and the desired quality of service for higher-layer applications. Window flow control mechanisms of traditional packet-switched networks are not well suited to real-time services at the speeds envisaged for future networks. In this work, the utilisation of the Probability of Congestion (PC) as a bandwidth decision parameter is presented. The validity of using PC is compared with QoS parameters in buffer-less environments, where only the cell loss ratio (CLR) parameter is relevant. The convolution algorithm is a good solution for CAC in ATM networks with small buffers. If the source characteristics are known, the actual CLR can be estimated very well. Furthermore, this estimation is always conservative, allowing the network performance guarantees to be retained. Several experiments have been carried out and investigated to explain the deviation between the proposed method and simulation. Time parameters for burst length and different buffer sizes have been considered. Experiments to confine the limits of the burst length with respect to the buffer size conclude that a minimum buffer size is necessary to handle cell contention adequately. Note that propagation delay cannot be dismissed for long-distance and interactive communications, so small buffers must be used in order to minimise delay. Under these premises, the convolution approach is the most accurate method used in bandwidth allocation. This method gives sufficient accuracy in both homogeneous and heterogeneous networks. However, the convolution approach has a considerable computational cost and a high number of accumulated calculations. To overcome these drawbacks, a new method of evaluation is analysed: the Enhanced Convolution Approach (ECA). In ECA, traffic is grouped into classes of identical parameters. By using the multinomial distribution function instead of the formula-based convolution, a partial state corresponding to each class of traffic is obtained. Finally, the global state probabilities are evaluated by multi-convolution of the partial results. This method avoids accumulated calculations and saves storage requirements, especially in complex scenarios. Sorting is the dominant cost factor for the formula-based convolution, whereas evaluation is the dominant cost factor for the enhanced convolution. A set of cut-off mechanisms is introduced to reduce the complexity of the ECA evaluation. The ECA also computes the CLR for each j-class of traffic (CLRj); an expression for evaluating CLRj is also presented. We can conclude that, by combining the ECA method with cut-off mechanisms, the use of ECA in real-time CAC environments as a single-level scheme is always possible.
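
The class-grouping idea behind the enhanced convolution can be illustrated with a small bufferless example: each class of identical on/off sources contributes a binomial distribution over its aggregate rate, the per-class distributions are convolved into the global rate distribution, and the CLR is estimated as the expected traffic in excess of the link rate divided by the offered traffic. The traffic classes and link rate below are illustrative assumptions, and the sketch omits the cut-off mechanisms.

```python
# Sketch of a bufferless CLR estimate by convolving per-class rate distributions.
# Each class groups identical on/off sources, so the number of active sources is
# binomial; per-class rate distributions are convolved into the aggregate distribution
# and the CLR is the expected excess over the link rate divided by the offered traffic.
import numpy as np
from math import comb

link_rate = 60.0     # Mbit/s
step = 0.5           # rate grid granularity (Mbit/s); peak rates must be multiples of it
classes = [(40, 2.0, 0.3), (10, 10.0, 0.2)]   # (sources, peak rate, activity probability)

def class_pmf(n, peak, p):
    """PMF of one class's aggregate rate on the grid (binomial number of active sources)."""
    pmf = np.zeros(int(round(n * peak / step)) + 1)
    for k in range(n + 1):
        pmf[int(round(k * peak / step))] += comb(n, k) * p ** k * (1 - p) ** (n - k)
    return pmf

agg = np.array([1.0])
for n, peak, p in classes:
    agg = np.convolve(agg, class_pmf(n, peak, p))   # multi-convolution of partial results

rates = np.arange(agg.size) * step
clr = np.sum(agg * np.maximum(rates - link_rate, 0.0)) / np.sum(agg * rates)
print(f"estimated cell loss ratio: {clr:.3e}")
```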

Relevance: 40.00%

Abstract:

This thesis focuses on Computer Vision and, more specifically, on image segmentation, one of the basic stages of image analysis, which consists of dividing the image into a set of visually distinct and uniform regions according to their intensity, colour or texture. A strategy is proposed based on the complementary use of region and boundary information during the segmentation process, an integration that alleviates some of the basic problems of traditional segmentation. The boundary information is first used to identify the number of regions present in the image and to place a seed inside each of them, in order to model the characteristics of the regions statistically and thereby define the region information. This information, together with the boundary information, is used to define an energy function that expresses the properties required of the desired segmentation: uniformity inside the regions and contrast with neighbouring regions at the boundaries. A set of active regions then start to grow, competing for the pixels of the image, with the aim of optimizing the energy function or, in other words, of finding the segmentation that best fits the requirements expressed in that function. Finally, this whole process is embedded in a pyramidal structure, which allows the segmentation result to be refined progressively and improves its computational cost. The strategy has been extended to the problem of texture segmentation, which entails some basic considerations such as modelling the regions from a set of texture features and extracting the boundary information when texture is present in the image. Finally, the approach has been extended to image segmentation taking colour and texture properties into account. In this respect, the joint use of non-parametric density-estimation techniques to describe colour and of texture features based on the co-occurrence matrix has been proposed in order to model the image regions adequately and completely. The proposal has been evaluated objectively and compared with various integration techniques using synthetic images; experiments with real images have also been included, with very positive results.
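
The competitive growth of active regions can be sketched as follows for a grayscale image: each seed keeps a running mean as its region model, unassigned neighbouring pixels enter a priority queue with a cost that combines a region term (deviation from the model) and a boundary term (local gradient magnitude), and regions grow by repeatedly claiming the cheapest pixel. This is only a rough sketch under those assumptions; the boundary-based seed placement, the exact energy function and the pyramidal refinement of the thesis are not reproduced.

```python
# Minimal sketch of competitive seeded region growing on a grayscale image.
# Each seed keeps a running mean (its region model); unassigned neighbours enter a
# priority queue with a cost mixing the region term (deviation from the mean) and a
# boundary term (local gradient), and regions grow by claiming the cheapest pixel.
import heapq
import numpy as np

def grow_regions(image, seeds, lam=0.5):
    """Competitive region growing: seeds = list of (row, col); returns a label map."""
    img = image.astype(float)
    h, w = img.shape
    labels = -np.ones((h, w), dtype=int)
    gy, gx = np.gradient(img)
    grad = np.hypot(gx, gy)                       # boundary term: local gradient magnitude

    mean = [img[r, c] for r, c in seeds]          # region model: running mean intensity
    size = [1] * len(seeds)
    heap = []

    def push_neighbours(r, c, k):
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and labels[nr, nc] < 0:
                cost = abs(img[nr, nc] - mean[k]) + lam * grad[nr, nc]
                heapq.heappush(heap, (cost, nr, nc, k))

    for k, (r, c) in enumerate(seeds):
        labels[r, c] = k
        push_neighbours(r, c, k)

    while heap:
        cost, r, c, k = heapq.heappop(heap)
        if labels[r, c] >= 0:
            continue                              # already claimed by a competing region
        labels[r, c] = k
        mean[k] = (mean[k] * size[k] + img[r, c]) / (size[k] + 1)   # update region model
        size[k] += 1
        push_neighbours(r, c, k)
    return labels

# Two flat regions separated by a step edge; one seed placed inside each region.
demo = np.hstack([np.full((32, 16), 50.0), np.full((32, 16), 200.0)])
print(np.bincount(grow_regions(demo, [(16, 4), (16, 28)]).ravel()))
```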

Relevance: 40.00%

Abstract:

Europe's widely distributed climate modelling expertise, now organized in the European Network for Earth System Modelling (ENES), is both a strength and a challenge. Recognizing this, the European Union's Program for Integrated Earth System Modelling (PRISM) infrastructure project aims at designing a flexible and user-friendly environment to assemble, run and post-process Earth System models. PRISM was started in December 2001 with a duration of three years. This paper presents the major stages of PRISM, including: (1) the definition and promotion of scientific and technical standards to increase component modularity; (2) the development of an end-to-end software environment (graphical user interface, coupling and I/O system, diagnostics, visualization) to launch, monitor and analyse complex Earth system models built around state-of-the-art community component models (atmosphere, ocean, atmospheric chemistry, ocean bio-chemistry, sea-ice, land-surface); and (3) testing and quality standards to ensure high performance on a variety of computing platforms. PRISM is emerging as a core strategic software infrastructure for building the European research area in Earth system sciences. Copyright (c) 2005 John Wiley & Sons, Ltd.

Relevance: 40.00%

Abstract:

Background: Molecular tools may help to uncover closely related and still diverging species from a wide variety of taxa and provide insight into the mechanisms, pace and geography of marine speciation. There is some controversy over the phylogeography and speciation modes of species groups with an Eastern Atlantic-Western Indian Ocean distribution, with previous studies suggesting that older events (Miocene) and/or more recent (Pleistocene) oceanographic processes could have influenced the phylogeny of marine taxa. The spiny lobster genus Palinurus allows for testing among speciation hypotheses, since it has a particular distribution with two groups of three species each in the Northeastern Atlantic (P. elephas, P. mauritanicus and P. charlestoni) and the Southeastern Atlantic and Southwestern Indian Oceans (P. gilchristi, P. delagoae and P. barbarae). In the present study, we obtain a more complete understanding of the phylogenetic relationships among these species through a combined dataset with both nuclear and mitochondrial markers, testing alternative hypotheses on both the mutation rate and the tree topology under the recently developed approximate Bayesian computation (ABC) methods. Results: Our analyses support a North-to-South speciation pattern in Palinurus, with all the South African species forming a monophyletic clade nested within the Northern Hemisphere species. Coalescent-based ABC methods allowed us to reject the previously proposed hypothesis of a Middle Miocene speciation event related to the closure of the Tethyan Seaway. Instead, divergence times obtained for Palinurus species using the combined mtDNA-microsatellite dataset and standard mutation rates for mtDNA agree with known glaciation-related processes occurring during the last 2 million years. Conclusion: The Palinurus speciation pattern is a typical example of a series of rapid speciation events occurring within a group, with very short branches separating different species. Our results support the hypothesis that recent climate-change-related oceanographic processes have influenced the phylogeny of marine taxa, with most Palinurus species originating during the last two million years. The present study highlights the value of new coalescent-based statistical methods such as ABC for testing different speciation hypotheses using molecular data.
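
A minimal sketch of rejection-sampling ABC for choosing between an old (Miocene-style) and a recent (Pleistocene-style) divergence is given below, using mean pairwise mtDNA divergence as the only summary statistic; the divergence-time priors, the mutation rate and the "observed" value are invented for illustration and are not taken from the study.

```python
# Minimal sketch of rejection-sampling ABC model choice between an "old" (Miocene)
# and a "recent" (Pleistocene) divergence hypothesis. The single summary statistic is
# mean pairwise mtDNA divergence; priors, mutation rate and the "observed" value are
# invented for illustration only.
import numpy as np

rng = np.random.default_rng(3)
mu = 1.2e-8            # substitutions per site per year (assumed mtDNA rate)
obs_divergence = 0.03  # "observed" mean pairwise divergence (illustrative)

def simulate(t_split):
    """Simulate mean pairwise divergence for a split time t_split (years), with noise."""
    expected = 2.0 * mu * t_split
    return rng.normal(expected, 0.1 * expected)

n_sim, eps = 50000, 0.002
accepted = []
for _ in range(n_sim):
    model = rng.integers(2)                        # 0: Miocene split, 1: Pleistocene split
    t = rng.uniform(10e6, 20e6) if model == 0 else rng.uniform(0.1e6, 2e6)
    if abs(simulate(t) - obs_divergence) < eps:    # keep draws whose summary matches the data
        accepted.append(model)

accepted = np.array(accepted)
for m, name in ((0, "Miocene"), (1, "Pleistocene")):
    frac = np.mean(accepted == m) if accepted.size else float("nan")
    print(f"posterior probability of {name} split: {frac:.3f}")
```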