62 results for Exponential Sum


Relevance: 10.00%

Abstract:

The recent strides of democracy in Latin America have been associated with conflicting outcomes. The expectation that democracy would bring about peace and prosperity has been only partly satisfied. While political violence has been by and large eradicated from the subcontinent, poverty and social injustice still prevail. Our study argues that democracy matters for inequality through the growing strength of center-left and left parties and by making political leaders in general more responsive to the underprivileged. Furthermore, although the pension reforms recently enacted in the region generated overall regressive outcomes for income distribution, democratic countries still benefit from their political past: where democratic tradition was stronger, such outcomes have been milder. Democratic tradition and the specific ideological connotations of the parties in power, on the other hand, did not play an equally crucial role in securing lower levels of political violence: during the last wave of democratizations in Latin America, domestic peace was rather an outcome of political and social concessions to those in distress. In sum, together with other factors, especially economic ones, the reason why recent democratizations have provided domestic peace in most cases but have so far been unable to solve the problem of poverty and inequality is that democratic traditions in the subcontinent have been relatively weak and, more specifically, that this weakness has undermined the growth of left and progressive parties, acting as an obstacle to redistribution. Such weakness, on the other hand, has not prevented the drastic reduction of domestic political violence, since what mattered in this case was a combination of symbolic or material concessions and political agreements among powerful élites and counter-élites.

Relevance: 10.00%

Abstract:

We prove a formula for the multiplicities of the index of an equivariant transversally elliptic operator on a G-manifold. The formula is a sum of integrals over blowups of the strata of the group action and also involves eta invariants of associated elliptic operators. Among the applications, we obtain an index formula for basic Dirac operators on Riemannian foliations, a problem that was open for many years.

Relevance: 10.00%

Abstract:

This paper investigates the intergenerational mobility of education in several European countries and its changes across birth cohorts (1940-1980), using a new mobility index that considers the total degree of mobility as the weighted sum of mobility with respect to both parents. Moreover, this mobility index enables the analysis of the role of family characteristics as mediating factors in the statistical association between individual and parental education. We find that Nordic countries display lower levels of educational persistence, but that the degree of mobility increases over time only in those countries with low initial levels. Moreover, the results suggest that the degree of mobility with respect to fathers and mothers converges to the same level, and that family characteristics account for an important part of the statistical association between parental education and children's schooling; a particular finding is that the most important elements of family characteristics are the family's socio-economic status and the educational assortative mating of the parents.
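The index described above combines mobility with respect to each parent into a single weighted sum. A minimal sketch of that idea follows; the weights and the "mobility = 1 − persistence coefficient" convention are illustrative assumptions, not the paper's exact estimator.

```python
# Hypothetical sketch: total intergenerational mobility as a weighted
# sum of mobility with respect to each parent. beta_f and beta_m are
# persistence coefficients (child schooling regressed on father's and
# mother's schooling); mobility is taken here as 1 - beta. Weights and
# the 1 - beta convention are illustrative, not the paper's estimator.

def total_mobility(beta_f, beta_m, w_f=0.5, w_m=0.5):
    assert abs(w_f + w_m - 1.0) < 1e-9, "weights should sum to one"
    return w_f * (1.0 - beta_f) + w_m * (1.0 - beta_m)

print(total_mobility(0.4, 0.3))  # 0.65: moderate overall mobility
```

Convergence of mobility with respect to fathers and mothers, as reported in the paper, would show up here as beta_f and beta_m approaching each other across cohorts.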

Relevance: 10.00%

Abstract:

The aim of this paper is to uncover the origins of utility regulation in Spain and to analyse, from a microeconomic perspective, its characteristics and the impact of regulation on consumers and utilities. Madrid and the Madrilenian utilities are taken as a case study. The electric industry in the period studied was a natural monopoly. Each of the three phases of production (generation, transmission and distribution) had natural monopoly characteristics. Therefore, the most efficient way to generate, transmit and distribute electricity was the monopoly, because one firm can produce a given quantity at a lower cost than the sum of the costs incurred by two or more firms. A problem arises because, when a firm is the single provider, it can charge prices above marginal cost, that is, monopoly prices. When a monopolist reduces the quantity produced, the price increases, causing consumers to demand less than the economically efficient level and incurring a loss of consumer surplus. This loss of consumer surplus is not completely captured by the monopolist, so there is a loss of social surplus: a deadweight loss. The main objective of regulation is to reduce this deadweight loss to a minimum. Regulation is also needed because, when the monopolist sets output where marginal revenue equals marginal cost and prices above it, the resulting profits give firms an incentive to enter the market, creating inefficiency. The Madrilenian industry has been chosen because of the availability of statistical information on costs and production. The complex industry structure and the atomised demand add interest to the analysis. This study will also shed some light on the tariff regulation of the period, which has been poorly studied, and will complement the literature on US electric utility regulation, where a different type of regulation was implemented.
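The monopoly-versus-competition reasoning above can be made concrete with a textbook linear-demand example. The numbers below are hypothetical and purely illustrative, not data from the paper.

```python
# Illustrative only: deadweight loss of monopoly pricing with linear
# demand P(q) = a - b*q and constant marginal cost c (hypothetical
# parameters, not from the paper's Madrid data).

def monopoly_vs_competition(a, b, c):
    # Competitive outcome: price equals marginal cost.
    q_comp = (a - c) / b
    # Monopoly outcome: marginal revenue a - 2*b*q equals marginal cost.
    q_mono = (a - c) / (2 * b)
    p_mono = a - b * q_mono
    # Deadweight loss: triangle between demand and marginal cost over
    # the output the monopolist withholds.
    dwl = 0.5 * (p_mono - c) * (q_comp - q_mono)
    return q_comp, q_mono, p_mono, dwl

q_comp, q_mono, p_mono, dwl = monopoly_vs_competition(a=100, b=1, c=20)
print(q_comp, q_mono, p_mono, dwl)  # 80.0 40.0 60.0 800.0
```

The deadweight loss term is exactly the quantity the paper says regulation tries to minimise: surplus lost by consumers that the monopolist does not capture.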

Relevance: 10.00%

Abstract:

In this paper we prove a formula for the analytic index of a basic Dirac-type operator on a Riemannian foliation, solving a problem that has been open for many years. We also consider more general indices given by twisting the basic Dirac operator by a representation of the orthogonal group. The formula is a sum of integrals over blowups of the strata of the foliation and also involves eta invariants of associated elliptic operators. As a special case, a Gauss-Bonnet formula for the basic Euler characteristic is obtained using two independent proofs.

Relevance: 10.00%

Abstract:

We present sharpened lower bounds on the size of cut-free proofs for first-order logic. Prior lower bounds for eliminating cuts from a proof established superexponential growth as a stack of exponentials, with the height of the stack proportional to the maximum depth d of the formulas in the original proof. Our new lower bounds remove the constant of proportionality, giving an exponential stack of height d − O(1). The proof method is based on more efficiently expressing the Gentzen-Solovay cut formulas as low-depth formulas.
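To make the "stack of exponentials" growth rate concrete, here is a small illustrative tower function (base 2 is an arbitrary choice; the abstract's bounds concern proof size, not this exact function).

```python
# Illustrative: a "stack of exponentials" (tower) of height h, i.e.
# 2^(2^(...^2)), the growth rate appearing in superexponential lower
# bounds for cut elimination. Base 2 is an illustrative choice.

def tower(h, base=2):
    value = 1
    for _ in range(h):
        value = base ** value
    return value

print([tower(h) for h in range(5)])  # [1, 2, 4, 16, 65536]
```

Shaving the constant of proportionality, from a tower of height c·d to one of height d − O(1), is an enormous quantitative strengthening at this growth rate.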

Relevance: 10.00%

Abstract:

Hypergraph width measures are a class of hypergraph invariants important in studying the complexity of constraint satisfaction problems (CSPs). We present a general exact exponential algorithm for a large variety of these measures. A connection between these and tree decompositions is established. This enables us to almost seamlessly adapt the combinatorial and algorithmic results known for tree decompositions of graphs to the case of hypergraphs and obtain fast exact algorithms. As a consequence, we provide algorithms which, given a hypergraph H on n vertices and m hyperedges, compute the generalized hypertree-width of H in time O*(2^n) and compute the fractional hypertree-width of H in time O(1.734601^n · m).
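Algorithms in the O*(2^n) regime typically run a dynamic program over all vertex subsets. As a hedged illustration (not the paper's algorithm), the sketch below computes, for every vertex set S, the minimum number of hyperedges whose union is exactly S; covering bags by few hyperedges is the core quantity behind generalized hypertree-width.

```python
# Hedged sketch (not the paper's algorithm): hyperedges are bitmasks
# over n vertices. rho[S] = minimum number of hyperedges whose union
# is exactly S (inf if no such selection exists), computed by dynamic
# programming over all 2^n subsets in O(2^n * m) time.

def min_edge_covers(n, edges):
    INF = float("inf")
    rho = [INF] * (1 << n)
    rho[0] = 0
    for S in range(1 << n):       # subsets in increasing order
        if rho[S] == INF:
            continue
        for e in edges:
            T = S | e             # add one more hyperedge
            if rho[S] + 1 < rho[T]:
                rho[T] = rho[S] + 1
    return rho

# Hypergraph on vertices {0,1,2} with hyperedges {0,1} and {1,2}:
rho = min_edge_covers(3, [0b011, 0b110])
print(rho[0b111])  # 2: both hyperedges are needed to cover all vertices
```

The minimum cover of an arbitrary bag S (supersets allowed) is then the minimum of rho[T] over T ⊇ S; the real width algorithms combine such quantities with a subset DP over tree decompositions.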

Relevance: 10.00%

Abstract:

Measuring the contribution of individual transactions to the total risk of a credit portfolio is a major issue in financial institutions. VaR contributions (VaRC) and Expected Shortfall contributions (ESC) have become two popular ways of quantifying these risks. However, the usual Monte Carlo (MC) approach is known to be a very time-consuming method for computing these risk contributions. In this paper we consider the Wavelet Approximation (WA) method for Value at Risk (VaR) computation presented in [Mas10] in order to calculate the Expected Shortfall (ES) and the risk contributions under the Vasicek one-factor model framework. We decompose the VaR and the ES as a sum of sensitivities representing the marginal impact on the total portfolio risk. Moreover, we present technical improvements to the WA method that considerably reduce the computational effort of the approximation while, at the same time, increasing its accuracy.
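As a baseline illustration of the slow Monte Carlo approach the paper improves on, here is a minimal sketch of portfolio losses under the Vasicek one-factor model; all portfolio parameters (exposures, default thresholds, correlation) are hypothetical.

```python
# Baseline Monte Carlo sketch of the Vasicek one-factor model (all
# portfolio parameters hypothetical). Obligor i defaults when
# rho*Z + sqrt(1-rho^2)*eps_i < Phi^{-1}(PD_i); portfolio loss is the
# sum of exposures of defaulted obligors.
import random
import statistics
from math import sqrt

def portfolio_losses(exposures, thresholds, rho, n_sims, seed=0):
    rng = random.Random(seed)
    losses = []
    for _ in range(n_sims):
        z = rng.gauss(0.0, 1.0)  # common systematic factor
        loss = 0.0
        for ead, thr in zip(exposures, thresholds):
            x = rho * z + sqrt(1 - rho * rho) * rng.gauss(0.0, 1.0)
            if x < thr:          # default event
                loss += ead
        losses.append(loss)
    return losses

losses = portfolio_losses(exposures=[1.0] * 50,
                          thresholds=[-2.0] * 50,  # roughly PD = 2.3%
                          rho=0.3, n_sims=2000)
losses.sort()
var_99 = losses[int(0.99 * len(losses))]                   # empirical 99% VaR
es_99 = statistics.mean(l for l in losses if l >= var_99)  # tail average
print(var_99, es_99)
```

From the same scenarios, per-obligor ES contributions can be estimated as each obligor's average loss over the scenarios with portfolio loss beyond the VaR; the paper's WA method replaces this costly simulation step entirely.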

Relevance: 10.00%

Abstract:

Report for the scientific sojourn carried out at Dartmouth College, from August 2007 until February 2008. It has been very successful from different viewpoints: scientific, philosophical, human. During the past six months we have definitely advanced towards the comprehension of the behaviour of the fluctuations of the quantum vacuum in the presence of boundaries, moving and non-moving, and also in situations where the topology of space-time changes: the dynamical Casimir effect, regularization problems, particle creation statistics according to different boundary conditions, etc. We have solved some longstanding problems and obtained quite remarkable results in this subject (as we explain in more detail below). We also pursued a general approach towards a viable modified f(R) gravity in both the Jordan and the Einstein frames (which are known to be mathematically, but not physically, equivalent). A class of exponential, realistic modified gravities has been introduced by us and investigated with care. Special focus was placed on step-class models, which are most promising from the phenomenological viewpoint and provide a natural way to classify all viable modified gravities. One- and two-step models were considered, but the analysis is extensible to N-step models. Both inflation in the early universe and the onset of the recent accelerated expansion arise in these models in a natural, unified way, which makes them very promising. Moreover, our work demonstrates that models in this category easily pass all local tests, including stability of the spherical body solution, non-violation of Newton's law, and generation of a very heavy positive mass for the additional scalar degree of freedom.
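The exponential class of models mentioned above has a standard representative form in the f(R) literature; as an illustration (the report's exact models may differ), a one-step exponential gravity can be written as:

```latex
% Representative one-step exponential f(R) model (illustrative form
% from the literature; the report's exact models may differ):
f(R) = R - 2\Lambda\bigl(1 - e^{-R/R_0}\bigr)
```

For curvatures R much larger than R_0 this tends to R − 2Λ, reproducing a cosmological constant and late-time acceleration, while for R → 0 it reduces to general relativity, which is how such models pass local tests; a second step at a high curvature scale can trigger inflation as well, in line with the unified description above.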

Relevance: 10.00%

Abstract:

Since the start of the Human Genome Project and its success in 2001, the genomes of a multitude of species have been sequenced. Improvements in sequencing technologies have generated data volumes with exponential growth. The project "Análisis bioinformáticos sobre la tecnología Hadoop" (Bioinformatic analyses on the Hadoop technology) covers the parallel computation of biological data such as DNA sequences. The study has been guided by the nature of the problem to be solved: the alignment of genetic sequences with the MapReduce paradigm.
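The MapReduce pattern applied to sequence data can be sketched in a few lines (this is a hedged illustration in plain Python, not the project's actual Hadoop code): the map phase emits (k-mer, source) pairs, the shuffle groups by k-mer, and the reduce phase reports k-mers shared by a read and the reference, which is the seeding step of seed-and-extend alignment.

```python
# Hedged MapReduce-style sketch for DNA seed finding (not the
# project's Hadoop code). Map emits (k-mer, (label, position)) pairs;
# shuffle groups by k-mer; reduce keeps k-mers shared between the
# reference and at least one read.
from collections import defaultdict

K = 4  # k-mer length (illustrative choice)

def map_phase(label, sequence):
    for i in range(len(sequence) - K + 1):
        yield sequence[i:i + K], (label, i)

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    hits = {}
    for kmer, sources in groups.items():
        labels = {label for label, _ in sources}
        if "ref" in labels and len(labels) > 1:
            hits[kmer] = sources
    return hits

pairs = list(map_phase("ref", "ACGTACGTGG")) + list(map_phase("read1", "TACGTG"))
hits = reduce_phase(shuffle(pairs))
print(sorted(hits))  # ['ACGT', 'CGTG', 'TACG']
```

On Hadoop, map_phase and reduce_phase become the mapper and reducer, and the framework performs the shuffle across the cluster, which is what makes the approach scale to the data volumes mentioned above.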

Relevance: 10.00%

Abstract:

Tropical cyclones are affected by a large number of climatic factors, which translates into complex patterns of occurrence. The variability of annual metrics of tropical-cyclone activity has been intensively studied, in particular since the sudden activation of the North Atlantic in the mid-1990s. We first provide a swift overview of previous work by diverse authors on these annual metrics for the North Atlantic basin, where the natural variability of the phenomenon, the existence of trends, the drawbacks of the records, and the influence of global warming have been the subject of interesting debates. Next, we present an alternative approach that does not focus on seasonal features but on the characteristics of single events [Corral et al., Nature Phys. 6, 693 (2010)]. It is argued that the individual-storm power dissipation index (PDI) constitutes a natural way to describe each event and, further, that the PDI statistics yield a robust law for the occurrence of tropical cyclones in the form of a power law. In this context, methods of fitting these distributions are discussed. As an important extension to this work we introduce a distribution function that models the whole range of the PDI density (excluding incompleteness effects at the smallest values): the gamma distribution, consisting of a power law with an exponential decay at the tail. The characteristic scale of this decay, represented by the cutoff parameter, provides very valuable information on the finite size of the basin, via the largest PDI values that the basin can sustain. We use the gamma fit to evaluate the influence of sea surface temperature (SST) on the occurrence of extreme PDI values, for which we find an increase of around 50% in the values of these basin-wide events for a 0.49 °C average SST difference. Similar findings are observed for the effects of the positive phase of the Atlantic multidecadal oscillation and of the number of hurricanes in a season on the PDI distribution.
In the case of the El Niño Southern Oscillation (ENSO), positive and negative values of the multivariate ENSO index do not have a significant effect on the PDI distribution; however, when only extreme values of the index are used, it is found that the presence of El Niño decreases the PDI of the most extreme hurricanes.
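The gamma form described above, a power law x^(a−1) with an exponential cutoff e^(−x/θ), can be fitted in the simplest way by the method of moments. This is a hedged sketch on synthetic data; the authors use more careful fitting procedures for real PDI records.

```python
# Hedged sketch: method-of-moments fit of a gamma distribution
# (power law with exponential cutoff) to PDI-like values. Synthetic
# data; the paper's actual fitting procedures are more careful.
import random

def gamma_moment_fit(values):
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    shape = mean * mean / var  # power-law exponent part: x^(shape-1)
    scale = var / mean         # exponential cutoff scale (the "basin size")
    return shape, scale

rng = random.Random(42)
true_shape, true_scale = 0.5, 2.0
sample = [rng.gammavariate(true_shape, true_scale) for _ in range(20000)]
shape, scale = gamma_moment_fit(sample)
print(round(shape, 2), round(scale, 2))  # close to 0.5 and 2.0
```

The fitted scale parameter plays the role of the cutoff discussed in the text: shifts in it between warm and cold SST subsamples are what quantify the roughly 50% change in extreme basin-wide PDI values.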

Relevance: 10.00%

Abstract:

The main objective of this article is the selection and comparison of two static analysis tools for Java. This task requires first studying the state of the art of these analyzers, identifying which characteristics are desirable in this kind of tool, and finally comparing them in execution on the two chosen free-software projects, argoUML and openProj. We compare FindBugs with PMD, two analyzers that can be used with version 1.6 of the JDK. The results of the comparison allow us to deduce that the analyzers complement each other in terms of detected bugs, with little overlap. As a conclusion, we can say that bug hunting requires more than one static analysis tool.

Relevance: 10.00%

Abstract:

MELIBEA is a directory and validator of policies in favour of open access to scientific and academic output. As a directory, it describes the existing institutional policies related to open access (OA) to scientific and academic output. As a validator, it subjects them to a qualitative and quantitative analysis based on compliance with a set of indicators that reflect the foundations of an institutional policy. The validator gives a score and a compliance percentage for each of the analysed policies. This is computed from the values assigned to certain indicators and their weighting according to their relative importance. The sum of the weighted values of the indicators is adjusted to a percentage scale and leads to what we have called the "validated open-access percentage", whose calculation is set out in the Methodology section. The types of institutions analysed are universities, research centres, funding agencies and governmental organizations.
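The weighted-sum-to-percentage calculation described above can be sketched as follows. The indicator names, scores and weights are hypothetical; MELIBEA defines its own indicator set and weighting in its Methodology section.

```python
# Illustrative computation of a "validated open-access percentage":
# a weighted sum of indicator scores rescaled to 0-100. Indicator
# names, values and weights are hypothetical, not MELIBEA's real set.

def validated_oa_percentage(indicators):
    # indicators: list of (score, max_score, weight) triples
    achieved = sum(score * weight for score, _, weight in indicators)
    maximum = sum(max_score * weight for _, max_score, weight in indicators)
    return 100.0 * achieved / maximum

policy = [
    (1, 1, 3.0),  # deposit is mandatory        (hypothetical indicator)
    (0, 1, 2.0),  # deposit linked to evaluation (hypothetical indicator)
    (1, 1, 1.0),  # embargo within allowed limits (hypothetical indicator)
]
print(round(validated_oa_percentage(policy), 1))  # 66.7
```

Rescaling by the weighted maximum is what makes policies with different numbers of applicable indicators comparable on a single percentage scale.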

Relevance: 10.00%

Abstract:

Emergent molecular measurement methods, such as DNA microarray, qRT-PCR, and many others, offer tremendous promise for the personalized treatment of cancer. These technologies measure the amount of specific proteins, RNA, DNA or other molecular targets from tumor specimens with the goal of "fingerprinting" individual cancers. Tumor specimens are heterogeneous; an individual specimen typically contains unknown amounts of multiple tissue types. Thus, the measured molecular concentrations result from an unknown mixture of tissue types, and must be normalized to account for the composition of the mixture.

For example, a breast tumor biopsy may contain normal, dysplastic and cancerous epithelial cells, as well as stromal components (fatty and connective tissue) and blood and lymphatic vessels. Our diagnostic interest focuses solely on the dysplastic and cancerous epithelial cells. The remaining tissue components serve to "contaminate" the signal of interest. The proportion of each of the tissue components changes as a function of patient characteristics (e.g., age), and varies spatially across the tumor region. Because each of the tissue components produces a different molecular signature, and the amount of each tissue type is specimen dependent, we must estimate the tissue composition of the specimen, and adjust the molecular signal for this composition.

Using the idea of a chemical mass balance, we consider the total measured concentrations to be a weighted sum of the individual tissue signatures, where the weights are determined by the relative amounts of the different tissue types. We develop a compositional source apportionment model to estimate the relative amounts of tissue components in a tumor specimen. We then use these estimates to infer the tissue-specific concentrations of key molecular targets for sub-typing individual tumors. We anticipate these specific measurements will greatly improve our ability to discriminate between different classes of tumors, and allow more precise matching of each patient to the appropriate treatment.
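The chemical-mass-balance idea can be illustrated in the simplest two-tissue case, where the measured profile is m ≈ p·s1 + (1−p)·s2 and the mixing proportion p has a closed-form least-squares solution. The signatures below are hypothetical; the paper's compositional source apportionment model handles many tissue types and measurement noise.

```python
# Hedged two-tissue sketch of the mass-balance deconvolution: measured
# concentrations m are a weighted sum p*s1 + (1-p)*s2 of tissue
# signatures. p is recovered by least squares (closed form). The
# signatures are hypothetical illustration values.

def mixing_proportion(measured, sig1, sig2):
    # Minimize sum((m - (p*s1 + (1-p)*s2))^2) over p, giving
    # p = sum((m - s2)*(s1 - s2)) / sum((s1 - s2)^2).
    num = sum((m - b) * (a - b) for m, a, b in zip(measured, sig1, sig2))
    den = sum((a - b) ** 2 for a, b in zip(sig1, sig2))
    p = num / den
    return max(0.0, min(1.0, p))  # proportions stay in [0, 1]

tumor_sig = [5.0, 1.0, 8.0]   # hypothetical epithelial signature
stroma_sig = [1.0, 4.0, 2.0]  # hypothetical stromal signature
measured = [0.7 * a + 0.3 * b for a, b in zip(tumor_sig, stroma_sig)]
print(mixing_proportion(measured, tumor_sig, stroma_sig))  # ≈ 0.7
```

With more than two tissue types the same balance becomes a constrained (non-negative, sum-to-one) least-squares problem, and the estimated proportions are then used to adjust each molecular target for the specimen's composition.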