554 results for Distribució (Teoria de la probabilitat)


Relevance: 100.00%

Abstract:

Western societies can reduce avoidable mortality and morbidity by better understanding the relationship between obesity and chronic disease. This paper examines the joint determinants of obesity and of heart disease, diabetes, hypertension, and elevated cholesterol. It analyzes a broadly representative Spanish dataset, the 1999 Survey on Disabilities, Impairments and Health Status, using a health production theoretical framework together with a seemingly unrelated probit model that controls for unobserved heterogeneity and endogeneity. Its findings provide suggestive evidence of a positive and significant, although specification-dependent, association between obesity and the prevalence of chronic illness.
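As a rough illustration of the kind of estimator involved, here is a minimal sketch of a seemingly unrelated (bivariate) probit likelihood on simulated data; all variable names and the data-generating process are illustrative assumptions, not taken from the paper or its survey data.

```python
# Bivariate probit: two binary outcomes with correlated latent errors.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                          # a shared covariate (illustrative)
e = rng.multivariate_normal([0, 0], [[1, 0.4], [0.4, 1]], size=n)
y1 = (0.5 * x + e[:, 0] > 0).astype(int)        # e.g. obesity indicator
y2 = (0.3 * x + e[:, 1] > 0).astype(int)        # e.g. chronic-illness indicator

def negll(theta):
    """Negative log-likelihood: P(y1, y2 | x) = Phi2(q1*b1*x, q2*b2*x; q1*q2*rho)."""
    b1, b2, a = theta
    rho = np.tanh(a)                            # keeps the correlation in (-1, 1)
    ll = 0.0
    for q1 in (-1, 1):                          # sign trick: q = 2y - 1
        for q2 in (-1, 1):
            m = ((2 * y1 - 1) == q1) & ((2 * y2 - 1) == q2)
            pts = np.column_stack((q1 * b1 * x[m], q2 * b2 * x[m]))
            cov = [[1.0, q1 * q2 * rho], [q1 * q2 * rho, 1.0]]
            ll += np.log(multivariate_normal.cdf(pts, cov=cov)).sum()
    return -ll

fit = minimize(negll, x0=[0.1, 0.1, 0.0], method="Nelder-Mead")
print("slopes:", fit.x[:2], "error correlation:", np.tanh(fit.x[2]))
```

A nonzero estimated error correlation is what signals the unobserved heterogeneity linking the two outcomes.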

Relevance: 100.00%

Abstract:

This paper investigates the contribution of public investment to the reduction of regional inequalities, with a specific application to Mexico. We use quantile regressions to examine the impact of public investment on regional disparities according to the position of each region in the conditional distribution of regional income. The results confirm the hypothesis that regional inequalities can indeed be attributed to the regional distribution of public investment; the observed pattern shows that public investment mainly helped to reduce inequalities among the richest regions.
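A minimal sketch of quantile regression in the spirit described above, on synthetic data; the names public_inv and income are illustrative, not the paper's Mexican regional data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({"public_inv": rng.uniform(0, 10, 500)})
# Heteroskedastic noise -> the slope varies across conditional quantiles.
noise = rng.normal(size=500) * (0.5 + 0.2 * df["public_inv"])
df["income"] = 2.0 + 0.8 * df["public_inv"] + noise

for q in (0.10, 0.50, 0.90):        # lower tail, median, upper tail of income
    res = smf.quantreg("income ~ public_inv", df).fit(q=q)
    print(f"q={q:.2f}: slope = {res.params['public_inv']:.3f}")
```

Comparing the slope across q is exactly how one reads off whether investment matters more for poor or rich regions in the conditional distribution.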

Relevance: 100.00%

Abstract:

In the quest to completely describe entanglement in the general case of a finite number of parties sharing a physical system with a finite-dimensional Hilbert space, an entanglement magnitude is introduced for its pure and mixed states: robustness. It corresponds to the minimal amount of mixing with locally prepared states that washes out all entanglement, and it quantifies, in a sense, the endurance of entanglement against noise and jamming. Its properties are studied comprehensively. Analytical expressions for the robustness are given for pure states of two-party systems, and analytical bounds for mixed states of two-party systems. Specific results are obtained mainly for the qubit-qubit system (qubit denotes quantum bit). As by-products, local pseudomixtures are generalized, a lower bound for the relative volume of separable states is deduced, and arguments for considering convexity a necessary condition of any entanglement measure are put forward.
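Stated compactly (a restatement of the verbal definition above, with S the set of separable, i.e. locally preparable, states):

```latex
R(\rho) \;=\; \min_{\sigma \in \mathcal{S}}\,
\min\left\{\, s \ge 0 \;:\; \frac{\rho + s\,\sigma}{1+s} \in \mathcal{S} \right\}
```

So R(ρ) = 0 exactly when ρ is already separable, and larger values mean more local noise is needed to wash the entanglement out.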

Relevance: 100.00%

Abstract:

Positive-operator-valued measurements on a finite number N of identically prepared systems of arbitrary spin J are discussed. Pure states are characterized in terms of Bloch-like vectors restricted by an SU(2J+1)-covariant constraint. This representation allows for a simple description of the equations to be fulfilled by optimal measurements. We explicitly find the minimal positive-operator-valued measurement for the N=2 case, give a rigorous bound for N=3, and set up the analysis for arbitrary N.
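As a sketch of the parametrization mentioned (the normalization convention here is an assumption, not taken from the paper): with d = 2J+1 and λ the generators of SU(d) normalized to tr(λ_i λ_j) = 2δ_ij, a state can be written

```latex
\rho \;=\; \frac{1}{d}\left(\mathbb{I}
+ \sqrt{\frac{d(d-1)}{2}}\;\vec{n}\cdot\vec{\lambda}\right),
\qquad d = 2J+1,\qquad |\vec{n}| \le 1
```

Purity (ρ² = ρ) then forces |n| = 1 together with a further SU(d)-covariant condition on n, which is why, for d > 2, not every unit Bloch-like vector corresponds to a pure state.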

Relevance: 100.00%

Abstract:

We present optimal measuring strategies for estimating the entanglement of unknown two-qubit pure states and the degree of mixing of unknown single-qubit mixed states, of which N identical copies are available. The most general measuring strategies are considered in both situations, leading in the first case to the conclusion that a local, although collective, measurement suffices to optimally estimate entanglement, a nonlocal property.

Relevance: 100.00%

Abstract:

The decay of orthopositronium into three photons provides a physical realization of a pure state with three-party entanglement. Its quantum correlations are analyzed using recent results from quantum information theory, looking for the final state that has the maximal amount of Greenberger-Horne-Zeilinger (GHZ)-like correlations. This state allows for a statistical dismissal of local realism stronger than the one obtained using any entangled state of two spin one-half particles.
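For reference (a standard definition, not specific to this paper), the benchmark state behind "GHZ-like correlations" is the three-qubit Greenberger-Horne-Zeilinger state:

```latex
|\mathrm{GHZ}\rangle \;=\; \tfrac{1}{\sqrt{2}}\big(\,|000\rangle + |111\rangle\,\big)
```

The analysis then asks which final state obtainable from the three-photon decay carries the largest amount of correlations of this type.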

Relevance: 100.00%

Abstract:

J/psi photoproduction is studied in the framework of analytic S-matrix theory. The differential and integrated elastic cross sections for J/psi photoproduction are calculated from a dual amplitude with Mandelstam analyticity. It is argued that, at low energies, the background, which is the low-energy equivalent of high-energy diffraction, replaces Pomeron exchange. The onset of high-energy Pomeron dominance is estimated from fits to the data.

Relevance: 100.00%

Abstract:

Recently a new Bell inequality was introduced by Collins et al. [Phys. Rev. Lett. 88, 040404 (2002)] that is strongly resistant to noise for maximally entangled states of two d-dimensional quantum systems. We prove that a larger violation, or equivalently a stronger resistance to noise, is found for a nonmaximally entangled state. It is shown that resistance to noise is not a good measure of nonlocality, and we introduce other possible measures. The nonmaximally entangled state turns out to be more robust for these alternative measures as well. From these results it follows that two von Neumann measurements per party may not be optimal for detecting nonlocality. For d=3 and 4, we point out some connections between this inequality and distillability: indeed, we demonstrate that any state violating it, with the optimal von Neumann settings, is distillable.
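For concreteness in the d = 3 case, the optimal nonmaximally entangled state reported in the literature for this inequality has, up to local unitaries, the form

```latex
|\psi\rangle \;=\; \frac{1}{\sqrt{2+\gamma^{2}}}\,
\big(\,|00\rangle + \gamma\,|11\rangle + |22\rangle\,\big),
\qquad \gamma = \frac{\sqrt{11}-\sqrt{3}}{2} \approx 0.7923
```

which violates the inequality more strongly than the maximally entangled state (γ = 1).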

Relevance: 100.00%

Abstract:

We apply majorization theory to the quantum algorithms known so far and find that there is a majorization principle underlying the way they operate. Grover's algorithm is a neat instance of this principle, where majorization works step by step until the optimal target state is found. Extensions of this situation are also found in algorithms based on quantum adiabatic evolution and in the family of quantum phase-estimation algorithms, including Shor's algorithm. We state that in quantum algorithms the arrow of time is a majorization arrow.
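The order invoked here can be made precise (a standard definition, included for reference): writing x↓ for the components of a probability vector sorted in decreasing order, x is majorized by y, denoted x ≺ y, when

```latex
\sum_{i=1}^{k} x_i^{\downarrow} \;\le\; \sum_{i=1}^{k} y_i^{\downarrow}
\quad (k = 1, \dots, d-1),
\qquad
\sum_{i=1}^{d} x_i^{\downarrow} \;=\; \sum_{i=1}^{d} y_i^{\downarrow} \;=\; 1
```

The "majorization arrow" claim is then that the vector of outcome probabilities at each step of these algorithms is majorized by the vector at the next step, i.e. probability concentrates monotonically toward the sought state.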

Relevance: 100.00%

Abstract:

During the nineteenth century, the Spanish economy passed through the first stages of industrialization. This process ran in parallel with the integration of the domestic market for goods and factors, at a time when liberal reforms and the construction of the railway network, among other developments, brought a sharp fall in transport costs. While the Spanish domestic market was progressively integrating, significant changes took place in the pattern of industrial location. On the one hand, the spatial concentration of industry increased considerably from the mid-nineteenth century until the Civil War; on the other, regional specialization intensified. What forces drove these changes? From a theoretical standpoint, the Heckscher-Ohlin model suggests that the spatial distribution of economic activity is determined by the comparative advantage of territories according to their relative factor endowments. In turn, New Economic Geography (NEG) models point to a bell-shaped relationship between the process of economic integration and the degree of geographic concentration of industrial activity. This article empirically examines the determinants of industrial location in Spain between 1856 and 1929 by estimating a model that combines Heckscher-Ohlin elements with the factors highlighted by the NEG, in order to test the relative strength of these two interpretations in shaping the location of industry in Spain. The results show that both factor endowments and NEG-type mechanisms were decisive in explaining the geographic distribution of industry from the nineteenth century onward, although their relative strength varied over time.

Relevance: 100.00%

Abstract:

We present a heuristic method for learning error correcting output codes matrices based on a hierarchical partition of the class space that maximizes a discriminative criterion. To achieve this goal, the optimal codeword separation is sacrificed in favor of a maximum class discrimination in the partitions. The creation of the hierarchical partition set is performed using a binary tree. As a result, a compact matrix with high discrimination power is obtained. Our method is validated using the UCI database and applied to a real problem, the classification of traffic sign images.
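For orientation, here is the generic randomized ECOC baseline in scikit-learn, which the tree-based discriminant coding above is designed to improve on; this is not the authors' method, and the dataset and base classifier are illustrative.

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OutputCodeClassifier

X, y = load_digits(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# code_size * n_classes binary dichotomizers, each a logistic regression;
# classification picks the class whose codeword is nearest the predicted bits.
ecoc = OutputCodeClassifier(LogisticRegression(max_iter=1000),
                            code_size=1.5, random_state=0)
print("accuracy:", ecoc.fit(Xtr, ytr).score(Xte, yte))
```

The paper's contribution is in how the codeword matrix is built (discriminative tree partitions rather than random codes), not in the decoding step sketched here.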

Relevance: 100.00%

Abstract:

Given a compact pseudo-metric space, we associate to it upper and lower dimensions, depending only on the pseudo-metric. Then we construct a doubling measure for which the measure of a dilated ball is closely related to these dimensions.
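For reference, a measure μ on a (pseudo-)metric space is doubling when there is a constant C, independent of the center x and the radius r > 0, such that

```latex
\mu\big(B(x, 2r)\big) \;\le\; C\,\mu\big(B(x, r)\big)
```

Here the point is sharper: the growth of μ(B(x, tr)) relative to μ(B(x, r)) under dilation is governed by the upper and lower dimensions attached to the pseudo-metric.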

Relevance: 100.00%

Abstract:

This paper presents an empirical application of the Hull-White (2000) model to the Spanish fixed-income market. The model provides an expression for the payments made by the buyer of a credit default swap (CDS), under the assumption that there is no counterparty risk. It is further assumed that the zero-coupon curve, a constant recovery rate, and the time of the credit event are independent. Bonds of Banco Santander Central Hispano are used to measure the risk-neutral probability of default and, under a no-arbitrage hypothesis, CDS premiums are computed for an underlying bond with the same credit rating as the reference entity. The premiums are found to fit well the credit spreads observed in the market, which are commonly used as an alternative to them.
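As a hedged illustration of the mechanics, a reduced-form special case rather than the paper's Hull-White implementation: with a constant risk-free rate r, constant default intensity h, recovery rate R, and a premium paid continuously at rate s until default or maturity T, equating the present values of the premium and protection legs gives

```latex
s \int_0^T e^{-(r+h)t}\,dt \;=\; (1-R)\,h \int_0^T e^{-(r+h)t}\,dt
\quad\Longrightarrow\quad s \;=\; (1-R)\,h
```

This is the so-called credit triangle linking the CDS premium to the default intensity and the loss given default; the Hull-White approach replaces the constant h with a risk-neutral default-time density extracted from bond prices, as done here with the Santander bonds.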

Relevance: 100.00%

Abstract:

In this work, the valuation methodology for a compound option written on a down-and-out call option, developed by Ericsson and Reneby (2003), is applied to derive a credit risk model. The firm is assumed to have a debt structure with two maturity dates, and the credit event takes place when the value of the firm's assets falls below a given level, called the barrier. An empirical application of the model to 105 firms of the Spanish continuous market is carried out. For each firm, its value at the analysis date, its volatility, and the critical value are obtained, and from these the short-term and long-term default probabilities, together with the probability implicit in these two, are deduced. The results are compared with those obtained from the Geske (1977) model.
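For context, a standard building block of such first-passage barrier models (not Ericsson and Reneby's specific formula): if the firm's asset value V follows a geometric Brownian motion with drift μ and volatility σ, the probability of falling below a barrier B < V_0 by time T is

```latex
P(\tau \le T) \;=\;
\Phi\!\left(\frac{\ln(B/V_0) - \nu T}{\sigma\sqrt{T}}\right)
+ \left(\frac{B}{V_0}\right)^{2\nu/\sigma^{2}}
\Phi\!\left(\frac{\ln(B/V_0) + \nu T}{\sigma\sqrt{T}}\right),
\qquad \nu = \mu - \tfrac{\sigma^{2}}{2}
```

with B playing the role of the critical value below which the credit event is triggered.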
