58 results for COMPLEMENTARY EXPONENTIAL GEOMETRIC DISTRIBUTION


Relevance:

20.00%

Publisher:

Abstract:

The dynamics of homogeneously heated granular gases which fragment due to particle collisions are analyzed. We introduce a kinetic model which accounts for correlations induced at the grain collisions and analyze both the kinetics and the relevant distribution functions these systems develop. The work combines analytical and numerical studies based on direct simulation Monte Carlo calculations. A broad family of fragmentation probabilities is considered, and its implications for the system kinetics are discussed. We show that generically these driven materials evolve asymptotically into a dynamical scaling regime. If the fragmentation probability tends to a constant, the grain number diverges at a finite time, leading to a shattering singularity. If the fragmentation probability vanishes, then the number of grains grows monotonically as a power law. We consider different homogeneous thermostats and show that the kinetics of these systems depends weakly on both the grain inelasticity and the driving. We observe that fragmentation plays a relevant role in the shape of the velocity distribution of the particles. When the fragmentation is driven by local stochastic events, the large-velocity tail is essentially exponential, independently of the heating frequency and the breaking rule. However, for a Lowe-Andersen thermostat, numerical evidence strongly supports the conjecture that the scaled velocity distribution follows a generalized exponential behavior f(c) ~ exp(−cⁿ), with n ≈ 1.2, regardless of the fragmentation mechanism.
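
A quick way to see what the conjectured tail form f(c) ~ exp(−cⁿ) implies: on a plot of log(−log f) against log c, a pure stretched exponential is a straight line with slope n. The sketch below uses synthetic tail data standing in for the paper's DSMC output (an assumption, not their results) and recovers the exponent with a least-squares fit.

```python
import numpy as np

# Synthetic, noisy tail of a scaled velocity distribution f(c) ~ exp(-c**n);
# stand-in data only, not the paper's DSMC simulations.
rng = np.random.default_rng(0)
n_true = 1.2
c = np.linspace(1.0, 5.0, 40)                                       # tail region of scaled speeds
f = np.exp(-c**n_true) * (1 + 0.02 * rng.standard_normal(c.size))   # noisy tail values

# For a stretched exponential, log(-log f) = n*log(c), so the slope of a
# straight-line fit in these coordinates estimates the exponent n.
slope, _ = np.polyfit(np.log(c), np.log(-np.log(f)), 1)
print(f"estimated tail exponent n ≈ {slope:.2f}")   # should come out close to 1.2
```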

Relevance:

20.00%

Publisher:

Abstract:

Throughout the history of electrical engineering education, vector and phasor diagrams have been used as a fundamental learning tool. At present, computational power has replaced them with long data lists, the result of solving systems of equations by means of numerical methods. Diagrams have thus been pushed into the academic background and, although explained in theory, they are not used in a practical way within specific examples. This may work against students' understanding of the complex behavior of electrical power systems. This article proposes a modification of the classical Perrine-Baum diagram construction that allows both a more practical representation and a better understanding of the behavior of a high-voltage electric line under different levels of load. This modification also allows the forecasting of the obsolescence of this behavior and of the line's loading capacity. Additionally, we evaluate the impact of this tool on the learning process, presenting comparative undergraduate results over three academic years.
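
For readers unfamiliar with the underlying relation, a Perrine-Baum-type diagram is built from the two-port (ABCD) equation of the line, Vs = A·Vr + B·Ir, evaluated at different load levels. The sketch below is a minimal illustration of that relation with made-up line constants and loads; it is not the modified construction proposed in the article.

```python
import numpy as np

# Illustrative ABCD constants and receiving-end conditions (assumed values).
A = 0.98 * np.exp(1j * np.deg2rad(0.3))    # dimensionless
B = 40.0 * np.exp(1j * np.deg2rad(80.0))   # ohms
Vr = 220e3 / np.sqrt(3)                    # receiving-end phase voltage (V)
pf = 0.9                                   # lagging power factor of the load

for P_mw in (50, 100, 150):                # total three-phase load levels (MW)
    P = P_mw * 1e6 / 3                     # per-phase active power (W)
    Ir = P / (Vr * pf) * np.exp(-1j * np.arccos(pf))   # lagging load-current phasor
    Vs = A * Vr + B * Ir                   # sending-end phase voltage phasor
    print(f"{P_mw:4d} MW: |Vs| = {abs(Vs)/1e3:6.1f} kV, "
          f"angle = {np.degrees(np.angle(Vs)):5.2f} deg")
```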

Relevance:

20.00%

Publisher:

Abstract:

In the present paper we discuss and compare two different energy decomposition schemes: Mayer's Hartree-Fock energy decomposition into diatomic and monoatomic contributions [Chem. Phys. Lett. 382, 265 (2003)], and the Ziegler-Rauk dissociation energy decomposition [Inorg. Chem. 18, 1558 (1979)]. The Ziegler-Rauk scheme is based on a separation of a molecule into fragments, while Mayer's scheme can be used in cases where a fragmentation of the system into clearly separable parts is not possible. In the Mayer scheme, the density of a free atom is deformed to give the one-atom Mulliken density that subsequently interacts to give rise to the diatomic interaction energy. We give a detailed analysis of the diatomic energy contributions in the Mayer scheme and a close look at the one-atom Mulliken densities. The Mulliken density ρA has a single large maximum around the nuclear position of atom A, but exhibits slightly negative values in the vicinity of neighboring atoms. The main connecting point between both analysis schemes is the electrostatic energy. Both decomposition schemes utilize the same electrostatic energy expression, but differ in how the fragment densities are defined. In the Mayer scheme, the electrostatic component originates from the interaction of the Mulliken densities, while in the Ziegler-Rauk scheme, the undisturbed fragment densities interact. The values of the electrostatic energy resulting from the two schemes differ significantly but typically have the same order of magnitude. Both methods are useful and complementary, since Mayer's decomposition focuses on the energy of the finally formed molecule, whereas the Ziegler-Rauk scheme describes the bond formation starting from undeformed fragment densities.
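
As an illustration of the shared electrostatic term, the sketch below evaluates the classical Coulomb interaction E_AB = ∫∫ ρ_A(r) ρ_B(r′)/|r − r′| dr dr′ between two model fragment densities on a coarse grid. The Gaussian densities and grid parameters are assumptions for illustration only; neither scheme's actual densities are reproduced here.

```python
import numpy as np

# Coulomb interaction of two rigid model densities on a coarse 3D grid,
#   E_AB ≈ sum_{r != r'} rho_A(r) * rho_B(r') / |r - r'| * dV**2   (atomic units).
n, L = 12, 8.0                                  # grid points per axis, box size (bohr)
x = np.linspace(-L / 2, L / 2, n)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
dV = (x[1] - x[0]) ** 3

def model_density(center, q=1.0, width=0.7):
    # Normalized Gaussian "fragment" density carrying total charge q (assumed form).
    r2 = (X - center[0])**2 + (Y - center[1])**2 + (Z - center[2])**2
    rho = np.exp(-r2 / (2 * width**2))
    return q * rho / (rho.sum() * dV)

rho_A = model_density((-1.2, 0.0, 0.0)).ravel()
rho_B = model_density((+1.2, 0.0, 0.0)).ravel()

pts = np.stack([X, Y, Z], axis=-1).reshape(-1, 3)
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(dist, np.inf)                  # drop the singular same-point term
E_AB = (rho_A[:, None] * rho_B[None, :] / dist).sum() * dV**2
print(f"E_AB ≈ {E_AB:.4f} hartree")
```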

Relevance:

20.00%

Publisher:

Abstract:

Several airline consolidation events have recently been completed both in Europe and in the United States. The model we develop considers two airlines operating hub-and-spoke networks, using different hubs to connect the same spoke airports. We assume the airlines to be vertically differentiated, which allows us to distinguish between primary and secondary hubs. We conclude that this differentiation in air services becomes more accentuated after consolidation, with an increased number of flights being channeled through the primary hub. However, congestion can act as a brake on the concentration of flight frequency in the primary hub following consolidation. Our empirical application involves an analysis of Delta's network following its merger with Northwest. We find evidence consistent with an increase in the importance of Delta's primary hubs at the expense of its secondary airports. We also find some evidence suggesting that the carrier chooses to divert traffic away from those hub airports that were more prone to delays prior to the merger, in particular New York's JFK airport.
Keywords: primary hub; secondary hub; airport congestion; airline consolidation; airline networks.
JEL Classification Numbers: D43; L13; L40; L93; R4.

Relevance:

20.00%

Publisher:

Abstract:

Background: Asparagine N-Glycosylation is one of the most important forms of protein post-translational modification in eukaryotes. This metabolic pathway can be subdivided into two parts: an upstream sub-pathway required for achieving proper folding for most of the proteins synthesized in the secretory pathway, and a downstream sub-pathway required to give variability to trans-membrane proteins and involved in adaptation to the environment and innate immunity. Here we analyze the nucleotide variability of the genes of this pathway in human populations, identifying which genes show greater population differentiation and which genes show signatures of recent positive selection. We also compare how these signals are distributed between the upstream and the downstream parts of the pathway, with the aim of exploring how forces of population differentiation and positive selection vary among genes involved in the same metabolic pathway but subject to different functional constraints. Results: Our results show that genes in the downstream part of the pathway are more likely to show a signature of population differentiation, while events of positive selection are equally distributed among the two parts of the pathway. Moreover, events of positive selection are frequent in genes that are known to be at bifurcation points and that are identified as being in key positions by a network-level analysis, such as MGAT3 and GCS1. Conclusions: These findings indicate that the upstream part of the Asparagine N-Glycosylation pathway has lower diversity among populations, while the downstream part is freer to tolerate diversity among populations. Moreover, the distribution of signatures of population differentiation and positive selection can change between parts of a pathway, especially between parts that are exposed to different functional constraints. Our results support the hypothesis that genes involved in constitutive processes can be expected to show lower population differentiation, while genes involved in traits related to the environment should show higher variability. Taken together, this work broadens our knowledge of how events of population differentiation and of positive selection are distributed among different parts of a metabolic pathway.
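
The abstract does not spell out which statistic measures population differentiation; a common choice is Wright's FST computed from allele frequencies. Purely as an illustrative sketch (with hypothetical SNP frequencies, not data from the study), one biallelic site across three populations can be summarized as follows.

```python
import numpy as np

# Hypothetical alternate-allele frequencies of one SNP in three populations
# (equal weights, no sample-size correction); not data from the study.
p = np.array([0.10, 0.45, 0.70])

H_S = np.mean(2 * p * (1 - p))      # mean within-population heterozygosity
p_bar = p.mean()                    # pooled allele frequency
H_T = 2 * p_bar * (1 - p_bar)       # total (pooled) heterozygosity
F_ST = (H_T - H_S) / H_T            # Wright's fixation index
print(f"F_ST ≈ {F_ST:.3f}")         # larger values = stronger differentiation
```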

Relevance:

20.00%

Publisher:

Abstract:

In the past, sensor networks in cities have been limited to fixed sensors, embedded in particular locations, under centralised control. Today, new applications can leverage wireless devices and use them as sensors to create aggregated information. In this paper, we show that the emerging patterns unveiled through the analysis of large sets of aggregated digital footprints can provide novel insights into how people experience the city and into some of the drivers behind these emerging patterns. We particularly explore the capacity to quantify the evolution of the attractiveness of urban space with a case study in the area of the New York City Waterfalls, a public art project of four man-made waterfalls rising from the New York Harbor. Methods to study the impact of an event of this nature are traditionally based on the collection of static information such as surveys and ticket-based people counts, which allow estimates to be generated about visitors' presence in specific areas over time. In contrast, our contribution makes use of the dynamic data that visitors generate, such as the density and distribution of aggregate phone calls and photos taken in different areas of interest and over time. Our analysis provides novel ways to quantify the impact of a public event on the distribution of visitors and on the evolution of the attractiveness of the points of interest in proximity. This information has potential uses for local authorities and researchers, as well as for service providers such as mobile network operators.
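
A minimal sketch of the kind of aggregation described: counting geolocated activity records (calls or photos) per area of interest and per day, then normalizing by each area's own baseline to track how its relative attractiveness evolves. The field names, CSV-like layout, and numbers are assumptions for illustration, not the study's dataset.

```python
import pandas as pd

# Assumed input: one row per anonymized, aggregated activity record with a
# timestamp and the area of interest it falls in (columns are hypothetical).
records = pd.DataFrame({
    "timestamp": pd.to_datetime(
        ["2008-06-26 10:00", "2008-06-26 12:30", "2008-06-27 11:15",
         "2008-06-27 18:40", "2008-06-28 09:05"]),
    "area": ["Waterfall_A", "Waterfall_A", "Waterfall_B",
             "Waterfall_A", "Waterfall_B"],
})

# Daily counts per area of interest.
daily = (records
         .assign(day=records["timestamp"].dt.date)
         .groupby(["area", "day"]).size()
         .rename("events").reset_index())

# Relative attractiveness: each day's count divided by the area's own mean,
# so values above 1 flag days when the area drew more activity than usual.
daily["relative"] = daily.groupby("area")["events"].transform(lambda s: s / s.mean())
print(daily)
```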

Relevance:

20.00%

Publisher:

Abstract:

In this work we describe the usage of bilinear statistical models as a means of factoring the shape variability into two components attributed to inter-subject variation and to the intrinsic dynamics of the human heart. We show that it is feasible to reconstruct the shape of the heart at discrete points in the cardiac cycle. Provided we are given a small number of shape instances representing the same heart atdifferent points in the same cycle, we can use the bilinearmodel to establish this. Using a temporal and a spatial alignment step in the preprocessing of the shapes, around half of the reconstruction errors were on the order of the axial image resolution of 2 mm, and over 90% was within 3.5 mm. From this, weconclude that the dynamics were indeed separated from theinter-subject variability in our dataset.
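
A minimal sketch of one common way to fit and use such a two-factor model (an asymmetric bilinear model estimated by truncated SVD); this is an illustration under assumptions, not the authors' exact pipeline, and the data are synthetic. Shape vectors are stacked into a (phases × dimensions) by subjects matrix, the SVD yields a phase-dependent basis, and a new subject's full cycle is reconstructed from a few observed phases.

```python
import numpy as np

rng = np.random.default_rng(1)
S, C, d, J = 20, 10, 30, 5               # subjects, cardiac phases, shape dims, rank

Y = rng.standard_normal((C * d, S))      # training matrix (stand-in for aligned shapes)
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
A = U[:, :J] * s[:J]                     # phase-dependent basis, shape (C*d, J)

# New subject: shapes observed at only a few phases of the cycle.
observed_phases = [0, 3, 7]
b_true = rng.standard_normal(J)
y_full = A @ b_true                      # ground-truth full-cycle shape vector
rows = np.concatenate([np.arange(p * d, (p + 1) * d) for p in observed_phases])

# Least-squares estimate of the subject coefficients from the observed phases,
# then reconstruction of the entire cardiac cycle.
b_hat, *_ = np.linalg.lstsq(A[rows], y_full[rows], rcond=None)
y_rec = A @ b_hat
print("reconstruction error:", np.linalg.norm(y_rec - y_full))
```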

Relevance:

20.00%

Publisher:

Abstract:

Introduction and objectives. It has been reported that, in hypertrophic cardiomyopathy (HCM), regional fiber disarray results in segments in which deformation is absent or severely reduced, and that these segments are non-uniformly distributed across the left ventricle (LV). This contrasts with what is observed in other types of hypertrophy, such as the athlete's heart or hypertensive left ventricular hypertrophy (LVH), in which cardiac deformation may be abnormal but is never so reduced that an absence of deformation is observed. We therefore propose using the distribution of strain values to study deformation in HCM. Methods. Using tagged magnetic resonance imaging, we reconstructed the systolic deformation of the LV of 12 control subjects, 10 athletes, 12 patients with HCM and 10 patients with hypertensive LVH. Deformation was quantified with a non-rigid registration algorithm, determining the peak systolic radial and circumferential strain values in 16 LV segments. Results. Patients with HCM showed significantly lower mean strain values than the other groups. However, whereas the deformation observed in healthy individuals and in patients with hypertensive LVH was concentrated around the mean value, in HCM segments with normal contraction coexisted with segments with absent or significantly reduced deformation, producing greater heterogeneity of the strain values. Some segments without deformation were also observed even in the absence of fibrosis or hypertrophy. Conclusions. The strain distribution characterizes the specific patterns of myocardial deformation in patients with different aetiologies of LVH. Patients with HCM showed a significantly lower mean strain value and greater strain heterogeneity (compared with controls, athletes and patients with hypertensive LVH), and had regions without deformation.
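
A small sketch of the kind of per-subject summary described: given 16 peak systolic strain values, compute the mean, a heterogeneity measure (here the standard deviation across segments), and the number of essentially non-deforming segments. The numbers and the 5% threshold are illustrative assumptions, not the paper's data or definitions.

```python
import numpy as np

# Hypothetical peak systolic circumferential strain (%) for the 16 LV segments
# of one subject; negative values indicate shortening. Not the paper's data.
strain = np.array([-18, -17, -2, -20, -1, -16, -19, -3,
                   -15, -18, -17, 0, -19, -2, -16, -18], dtype=float)

mean_strain = strain.mean()
heterogeneity = strain.std(ddof=1)                 # spread across segments
non_deforming = int((np.abs(strain) < 5).sum())    # segments below an assumed 5% threshold

print(f"mean strain          : {mean_strain:6.1f} %")
print(f"heterogeneity (SD)   : {heterogeneity:6.1f} %")
print(f"non-deforming segments: {non_deforming}/16")
```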

Relevance:

20.00%

Publisher:

Abstract:

In a distributed key distribution scheme, a set of servers helps a set of users in a group to securely obtain a common key. Security means that an adversary who corrupts some servers and some users has no information about the key of a non-corrupted group. In this work, we formalize the security analysis of one such scheme, which was not considered in the original proposal. We prove that the scheme is secure in the random oracle model, assuming that the Decisional Diffie-Hellman (DDH) problem is hard to solve. We also detail a possible modification of that scheme, and of a related one, which allows us to prove the security of the schemes without assuming that a specific hash function behaves as a random oracle. As usual, this improvement in the security of the schemes comes at the cost of an efficiency loss.
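
For context, the DDH assumption states that in a suitable cyclic group the triple (g^a, g^b, g^ab) is computationally indistinguishable from (g^a, g^b, g^c) for random exponents. The toy sketch below only shows how the two triples are formed; the tiny prime offers no security, and the scheme analyzed in the paper is not reproduced here.

```python
import secrets

# Toy subgroup parameters (far too small for real use): p = 2q + 1 with q prime,
# and g = 4 generating the order-q subgroup of quadratic residues mod p.
p, q, g = 227, 113, 4

a = secrets.randbelow(q)
b = secrets.randbelow(q)
c = secrets.randbelow(q)

ddh_tuple    = (pow(g, a, p), pow(g, b, p), pow(g, a * b % q, p))  # "real" triple
random_tuple = (pow(g, a, p), pow(g, b, p), pow(g, c, p))          # random triple

# The DDH assumption: no efficient algorithm can tell these apart in groups of
# cryptographic size; the shared value g^(ab) is what a derived key would use.
print("DDH triple   :", ddh_tuple)
print("random triple:", random_tuple)
```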

Relevance:

20.00%

Publisher:

Abstract:

Automatic classification of makams from symbolic data is a rarely studied topic. In this paper, we first present a review of an n-gram based approach using various representations of the symbolic data. While a high degree of precision can be obtained, confusion arises mainly for makams that use (almost) the same scale and pitch hierarchy but differ in their overall melodic progression, the seyir. To further improve the system, n-gram based classification is first tested on various sections of the piece, to take into account the seyir feature that the melodic progression starts in a certain region of the scale. In a second test, a hierarchical classification structure is designed which uses n-grams and seyir features at different levels to further improve the system.
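
A minimal sketch of n-gram classification on symbolic pitch sequences: per-makam bigram counts are turned into smoothed log-probabilities, and an unseen sequence is assigned to the class with the highest log-likelihood. The toy sequences and pitch symbols are invented for illustration; the paper's representations and seyir features are not reproduced.

```python
from collections import Counter
from math import log

def ngrams(seq, n=2):
    return [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]

def train(sequences, n=2):
    counts = Counter()
    for seq in sequences:
        counts.update(ngrams(seq, n))
    return counts

def log_likelihood(seq, counts, n=2, alpha=1.0):
    # Add-one smoothed log-probability of each n-gram in the query sequence.
    total = sum(counts.values())
    vocab = max(len(counts), 1)
    return sum(log((counts[g] + alpha) / (total + alpha * vocab))
               for g in ngrams(seq, n))

# Toy training data: pitch-symbol sequences per makam (invented symbols).
models = {
    "makam_A": train([["A", "B", "C", "B", "A"], ["A", "C", "B", "A"]]),
    "makam_B": train([["D", "E", "F", "E", "D"], ["D", "F", "E", "D"]]),
}

query = ["A", "B", "C", "B"]
best = max(models, key=lambda m: log_likelihood(query, models[m]))
print("predicted class:", best)
```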

Relevance:

20.00%

Publisher:

Abstract:

Individuals' life chances in the future will very much depend on how we invest in our children now. An optimal human capital model would combine a high mean with minimal variance of skills. It is well-established that early childhood learning is key to adult success. The impact of social origins on child outcomes remains strong, and the new role of women poses additional challenges to our conventional nurturing approach to child development. This paper focuses on skill development in the early years, examining how we might best combine family inputs and public policy to invest optimally in our future human capital. I emphasize three issues: one, the uneven capacity of parents to invest in children; two, the impact of mothers' employment on child outcomes; and three, the potential benefits of early pre-school programmes. I conclude that mothers' intra-family bargaining power is decisive for family investments and that universal child care is key if our goal is to arrive at a strong mean with minimal variance.

Relevance:

20.00%

Publisher:

Abstract:

The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P) and related to their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, quite elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, the combination of statistical information, such as Bayesian updating or the combination of likelihood and robust M-estimation functions, amounts to simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with the mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimating functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
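
On a finite sample space the claim that Bayesian updating is a perturbation (an addition in the Aitchison geometry) can be checked directly: the closed posterior equals the closure of prior × likelihood, and in centered log-ratio coordinates this is simply clr(prior) + clr(likelihood). A small numeric check, with a made-up prior and likelihood:

```python
import numpy as np

def closure(x):
    # Rescale a positive vector so its components sum to one.
    return x / x.sum()

def clr(x):
    # Centered log-ratio transform of a positive composition.
    lx = np.log(x)
    return lx - lx.mean()

# Made-up prior and likelihood over a three-point parameter space.
prior = np.array([0.5, 0.3, 0.2])
likelihood = np.array([0.1, 0.6, 0.3])

posterior = closure(prior * likelihood)   # Bayes' rule, then closure

# Bayesian updating as perturbation: clr coordinates simply add.
lhs = clr(posterior)
rhs = clr(prior) + clr(likelihood)
print(np.allclose(lhs, rhs))              # True
```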

Relevance:

20.00%

Publisher:

Abstract:

We estimate the world distribution of income by integrating individual income distributions for 125 countries between 1970 and 1998. We estimate poverty rates and headcounts by integrating the density function below the $1/day and $2/day poverty lines. We find that poverty rates declined substantially over the last twenty years. We compute poverty headcounts and find that the number of one-dollar poor declined by 235 million between 1976 and 1998. The number of $2/day poor declined by 450 million over the same period. We analyze poverty across different regions and countries. Asia is a great success, especially after 1980. Latin America reduced poverty substantially in the 1970s, but progress stopped in the 1980s and 1990s. The worst performer was Africa, where poverty rates increased substantially over the last thirty years: the number of $1/day poor in Africa increased by 175 million between 1970 and 1998, and the number of $2/day poor increased by 227 million. Africa hosted 11% of the world's poor in 1960. It hosted 66% of them in 1998. We estimate nine indexes of income inequality implied by our world distribution of income. All of them show substantial reductions in global income inequality during the 1980s and 1990s.
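
A toy version of the headcount calculation: assume one country's income distribution is lognormal with a given mean income and dispersion; the poverty rate is the distribution's CDF evaluated at the poverty line (here $1/day ≈ $365/year), and the headcount is that rate times the population. The parameters are invented for illustration, not the paper's estimates, and the paper integrates kernel densities rather than a parametric form.

```python
from math import erf, log, sqrt

def lognormal_cdf(x, mu, sigma):
    # CDF of a lognormal distribution with log-scale parameters mu, sigma.
    return 0.5 * (1.0 + erf((log(x) - mu) / (sigma * sqrt(2.0))))

# Invented country parameters: mean annual income, dispersion, population.
mean_income = 1500.0      # dollars per year
sigma = 1.0               # log-scale standard deviation
population = 50e6

# For a lognormal, mean = exp(mu + sigma^2 / 2), so back out mu.
mu = log(mean_income) - sigma**2 / 2.0

poverty_line = 365.0      # $1/day, in dollars per year
rate = lognormal_cdf(poverty_line, mu, sigma)
print(f"poverty rate     : {rate:.1%}")
print(f"poverty headcount: {rate * population / 1e6:.1f} million")
```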