919 results for power-law graph

Relevance:

80.00%

Publisher:

Abstract:

GitHub is the most popular repository for open source code (Finley 2011). It has more than 3.5 million users, as the company declared in April 2013, and more than 10 million repositories, as of December 2013. It has a publicly accessible API and, since March 2012, it also publishes a stream of all the events occurring on public projects. Interactions among GitHub users are of a complex nature and take place in different forms. Developers create and fork repositories, push code, approve code pushed by others, bookmark their favorite projects and follow other developers to keep track of their activities. In this paper we present a characterization of GitHub, as both a social network and a collaborative platform. To the best of our knowledge, this is the first quantitative study about the interactions happening on GitHub. We analyze the logs from the service over 18 months (between March 11, 2012 and September 11, 2013), covering 183.54 million events, and obtain information about 2.19 million users and 5.68 million repositories, both growing linearly in time. We show that the distributions of the number of contributors per project, watchers per project and followers per user follow a power-law-like shape. We analyze social ties and repository-mediated collaboration patterns, and we observe a remarkably low level of reciprocity of the social connections. We also measure the activity of each user in terms of authored events and we observe that very active users do not necessarily have a large number of followers. Finally, we provide a geographic characterization of the centers of activity and we investigate how distance influences collaboration.
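
As a rough illustration of the kind of power-law fit reported above (a sketch, not the paper's code), the snippet below estimates a power-law exponent for a followers-per-user-style distribution with the standard maximum-likelihood estimator of Clauset et al. (2009); the synthetic data and the x_min cutoff are assumptions.

```python
import numpy as np

def power_law_alpha_mle(samples, x_min=1.0):
    """Continuous power-law tail exponent via MLE:
    alpha_hat = 1 + n / sum(ln(x_i / x_min))  (Clauset et al. 2009)."""
    tail = np.asarray([x for x in samples if x >= x_min], dtype=float)
    n = tail.size
    alpha = 1.0 + n / np.sum(np.log(tail / x_min))
    stderr = (alpha - 1.0) / np.sqrt(n)  # asymptotic standard error
    return alpha, stderr

# Toy data standing in for followers-per-user counts (not the GitHub logs).
rng = np.random.default_rng(0)
followers = rng.pareto(1.5, size=100_000) + 1.0
print(power_law_alpha_mle(followers, x_min=10.0))
```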

Relevance:

80.00%

Publisher:

Abstract:

In studies of complex heterogeneous networks, particularly of the Internet, significant attention has been paid to analysing network failures caused by hardware faults or overload. There, the network's reaction was modelled as rerouting of traffic away from failed or congested elements. Here we model the network's reaction to congestion on much shorter time scales, when the input traffic rate through congested routes is reduced. As an example we consider the Internet, where local mismatch between demand and capacity results in traffic losses. We describe the onset of congestion as a phase transition characterised by strong, albeit relatively short-lived, fluctuations of losses, caused by noise in input traffic and exacerbated by the heterogeneous nature of the network, manifested in a power-law load distribution. The fluctuations may result in the network strongly overreacting to the first signs of congestion by significantly reducing input traffic along communication paths where congestion is utterly negligible. © 2013 IEEE.
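
A toy Monte Carlo sketch of the onset picture described above (my illustration with invented parameters, not the authors' model): link loads follow a power law, input traffic is noisy, and losses appear wherever demand exceeds a fixed per-link capacity provisioned slightly above the mean.

```python
import numpy as np

rng = np.random.default_rng(1)

# Heterogeneous (power-law) mean loads, noisy input traffic, and losses
# wherever demand exceeds capacity; all parameters are illustrative.
n_links = 10_000
mean_load = rng.pareto(2.0, n_links) + 1.0  # power-law load distribution
capacity = 1.2 * mean_load                  # 20% headroom: near the onset

losses = []
for _ in range(200):                        # short-time-scale snapshots
    load = mean_load * rng.lognormal(0.0, 0.3, n_links)  # traffic noise
    losses.append(np.clip(load - capacity, 0.0, None).sum())

losses = np.array(losses)
print(f"mean loss {losses.mean():.1f}, relative fluctuation "
      f"{losses.std() / losses.mean():.2f}")
```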

Relevance:

80.00%

Publisher:

Abstract:

Previous work has shown that human vision performs spatial integration of luminance contrast energy, where signals are squared and summed (with internal noise) over area at detection threshold. We tested that model here in an experiment using arrays of micro-pattern textures that varied in overall stimulus area and sparseness of their target elements, where the contrast of each element was normalised for sensitivity across the visual field. We found a power-law improvement in performance with stimulus area, and a decrease in sensitivity with sparseness. While the contrast integrator model performed well when target elements constituted 50–100% of the target area (replicating previous results), observers outperformed the model when texture elements were sparser than this. This result required the inclusion of further templates in our model, selective for grids of various regular texture densities. By assuming a MAX operation across these noisy mechanisms the model also accounted for the increase in the slope of the psychometric function that occurred as texture density decreased. Thus, for the first time, mechanisms that are selective for texture density have been revealed at contrast detection threshold. We suggest that these mechanisms have a role to play in the perception of visual textures.
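
A minimal signal-detection sketch of the MAX operation described above (an illustration under assumed parameters, not the authors' fitted model): on each trial of a two-interval task the observer takes the maximum over several noisy template responses, and only one template matches the target texture.

```python
import numpy as np

rng = np.random.default_rng(2)

def pct_correct(signal, n_templates=4, noise_sd=1.0, trials=20_000):
    """Proportion correct for a MAX rule over noisy mechanisms (2AFC)."""
    target = rng.normal(0.0, noise_sd, (trials, n_templates))
    target[:, 0] += signal              # one template matches the texture
    blank = rng.normal(0.0, noise_sd, (trials, n_templates))
    return np.mean(target.max(axis=1) > blank.max(axis=1))

for s in (0.5, 1.0, 2.0):
    print(f"signal {s}: {pct_correct(s):.3f} correct")
```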

Relevance:

80.00%

Publisher:

Abstract:

In this dissertation, I investigate three related topics in asset pricing: consumption-based asset pricing under long-run risks and fat tails, the pricing of VIX (CBOE Volatility Index) options, and the market price of risk embedded in stock returns and stock options. These three topics are fully explored in Chapters II through IV. Chapter V summarizes the main conclusions. In Chapter II, I explore the effects of fat tails on the equilibrium implications of the long-run risks model of asset pricing by introducing innovations with a dampened power law to the consumption and dividend growth processes. I estimate the structural parameters of the proposed model by maximum likelihood. I find that the stochastic volatility model with fat tails can, without resorting to high risk aversion, generate an implied risk premium, expected risk-free rate and their volatilities comparable to the magnitudes observed in data. In Chapter III, I examine the pricing performance of VIX option models. The contention that simpler is better is supported by the empirical evidence using actual VIX option market data. I find that no model has small pricing errors over the entire range of strike prices and times to expiration. In general, Whaley's Black-like option model produces the best overall results, supporting the simpler-is-better contention. However, the Whaley model does under/overprice out-of-the-money call/put VIX options, which is contrary to the behavior of stock index option pricing models. In Chapter IV, I explore risk pricing through a model of time-changed Lévy processes based on the joint evidence from individual stock options and underlying stocks. I specify a pricing kernel that prices idiosyncratic and systematic risks. This approach to examining risk premia on stocks deviates from existing studies. The empirical results show that the market pays positive premia for idiosyncratic and market jump-diffusion risk, and for idiosyncratic volatility risk. However, there is no consensus on the premium for market volatility risk: it can be positive or negative. The positive premium on idiosyncratic risk runs contrary to the implications of traditional capital asset pricing theory.
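
For context on the "Black-like" model named in Chapter III: Whaley prices VIX options with Black's (1976) futures-option formula, applied to VIX futures. A minimal sketch with hypothetical inputs:

```python
from math import exp, log, sqrt
from statistics import NormalDist

def black76_call(F, K, sigma, T, r):
    """Black (1976) call on a futures price F: the 'Black-like' formula."""
    d1 = (log(F / K) + 0.5 * sigma ** 2 * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return exp(-r * T) * (F * N(d1) - K * N(d2))

# Hypothetical inputs: futures at 20, strike 25, 40% vol, 3 months, 2% rate.
print(f"{black76_call(F=20.0, K=25.0, sigma=0.4, T=0.25, r=0.02):.4f}")
```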

Relevance:

80.00%

Publisher:

Abstract:

Current meter data obtained from a mooring at 2450 m water depth near the continental slope off Portugal are presented. The mean currents at all five observation levels are northward. Mean speeds in the core of the Mediterranean Water exceed speeds at shallower levels by 2 to 3 cm/sec, indicating advection connected to this specific water mass. The current variability is dominated by semi-diurnal tidal components. Normal mode analysis reveals a predominant mode of order 2, representing 48% of the total kinetic tidal energy. Results for the barotropic tidal component are in good agreement with earlier predictions for this area. The motion at higher frequencies ω in the internal gravity wave band is well described by an ω⁻² power law for the energy density spectrum. This result is consistent with earlier observations in other parts of the ocean.
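
As a hedged illustration of how such a spectral slope is checked (synthetic data, not the mooring record): compute a periodogram and fit the log-log slope over a nominal internal-wave band. A random walk is used because its energy density also falls off as the inverse square of frequency.

```python
import numpy as np

rng = np.random.default_rng(3)
dt = 600.0                                    # 10-minute sampling, seconds
u = np.cumsum(rng.normal(0.0, 1.0, 2 ** 14))  # random walk: true slope -2

freqs = np.fft.rfftfreq(u.size, d=dt)[1:]
power = np.abs(np.fft.rfft(u - u.mean()))[1:] ** 2

band = (freqs > 1e-4) & (freqs < 5e-4)        # a nominal internal-wave band
slope = np.polyfit(np.log(freqs[band]), np.log(power[band]), 1)[0]
print(f"fitted spectral slope: {slope:.2f} (the omega**-2 law predicts -2)")
```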

Relevance:

80.00%

Publisher:

Abstract:

The metabolic rate of organisms may either be viewed as a basic property from which other vital rates and many ecological patterns emerge, one that follows a universal allometric mass scaling law; or it may be considered a property of the organism that emerges as a result of the organism's adaptation to the environment, with consequently less universal mass scaling properties. Data on body mass, maximum ingestion and clearance rates, respiration rates and maximum growth rates of animals living in the ocean epipelagic were compiled from the literature, mainly from original papers but also from previous compilations by other authors. Data were read from tables or digitized from graphs. Only measurements made on individuals of known size, or groups of individuals of similar and known size, were included. We show that clearance and respiration rates have life-form-dependent allometries with similar scaling but different elevations, such that the mass-specific rates converge on a rather narrow size-independent range. In contrast, ingestion and growth rates follow a near-universal taxa-independent ~3/4 mass scaling power law. We argue that the decline in mass-specific clearance rates with size within taxa is related to the inherent decrease in feeding efficiency of any particular feeding mode. The transitions between feeding modes, and the simultaneous transitions in clearance and respiration rates, may then represent adaptations to the food environment and result from the optimization of tradeoffs that allow sufficient feeding and growth rates to balance mortality.
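
A minimal sketch of how an allometric exponent such as the ~3/4 above is estimated: ordinary least squares on log-transformed data, rate = a * mass**b. The coefficients and scatter below are assumptions, not the compiled data set.

```python
import numpy as np

rng = np.random.default_rng(4)
mass = 10 ** rng.uniform(-6, 3, 500)                      # body mass, arbitrary units
rate = 2.0 * mass ** 0.75 * rng.lognormal(0.0, 0.3, 500)  # scatter around 3/4 law

b, log_a = np.polyfit(np.log10(mass), np.log10(rate), 1)
print(f"estimated exponent b = {b:.3f} (expected ~0.75)")
```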

Relevance:

80.00%

Publisher:

Abstract:

In this work we investigate some aspects of the two-dimensional flow of a viscous Newtonian fluid through a disordered porous medium, modeled by a random fractal system similar to the Sierpinski carpet. The fractal is formed by obstacles of various sizes whose size distribution follows a power law, placed at random in a rectangular channel. The velocity field and other details of the fluid dynamics are obtained by numerically solving the Navier-Stokes and continuity equations at the pore level, where the flow through a porous medium actually takes place. The numerical simulations allowed us to analyze the distribution of shear stresses developed at the solid-fluid interfaces, and to find algebraic relations between the viscous and frictional forces and the geometric parameters of the model, including its fractal dimension. Based on the numerical results, we propose scaling relations involving the relevant parameters of the phenomenon, which quantify the fractions of these forces contributed by each size class of obstacles. Finally, we were also able to draw inferences about the fluctuations in the shape of the distribution of viscous stresses developed on the surfaces of the obstacles.
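
As an illustration of the geometry-generation step (a sketch with assumed parameters, not the dissertation's code), obstacle sizes can be drawn from a bounded power law P(s) ~ s**(-gamma) by inverse-transform sampling:

```python
import numpy as np

def power_law_sizes(n, gamma=2.5, s_min=1.0, s_max=100.0, rng=None):
    """Sample n obstacle sizes from a power law truncated to [s_min, s_max]."""
    rng = rng or np.random.default_rng()
    u = rng.uniform(size=n)
    g = 1.0 - gamma
    return (s_min ** g + u * (s_max ** g - s_min ** g)) ** (1.0 / g)

sizes = power_law_sizes(10_000, rng=np.random.default_rng(5))
print(f"min {sizes.min():.2f}, median {np.median(sizes):.2f}, max {sizes.max():.2f}")
```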

Relevance:

80.00%

Publisher:

Abstract:

Dark matter is a fundamental ingredient of modern cosmology. It is necessary in order to explain the process of structure formation in the Universe, the rotation curves of galaxies and the mass discrepancy in clusters of galaxies. However, although many efforts, both theoretical and experimental, have been made, the nature of dark matter is still unknown and the only convincing evidence for its existence is gravitational. This raises doubts about its existence and, in turn, opens the possibility that Einstein's gravity needs to be modified at some scale. In this work we study the possibility that Eddington-Born-Infeld (EBI) modified gravity provides an alternative explanation for the mass discrepancy in clusters of galaxies. For this purpose we derive the modified Einstein field equations and find their solutions for a spherical system of identical, collisionless point particles. Then, starting from the collisionless relativistic Boltzmann equation and using approximations and assumptions valid for weak gravitational fields, we derive the generalized virial theorem in the framework of EBI gravity. In order to compare the predictions of EBI gravity with astrophysical observations, we estimate the order of magnitude of the geometric mass, showing that it is compatible with present observations. Finally, considering a power law for the density of galaxies in the cluster, we derive expressions for the radial velocity dispersion of the galaxies, which can be used to test some features of EBI gravity.
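
For reference, the standard-gravity virial relations that such an analysis generalizes can be written as follows (generic notation as a baseline; the dissertation's EBI expressions add a geometric-mass contribution):

```latex
% Standard-gravity baseline, not the EBI-specific result:
2K + W = 0, \qquad K = \tfrac{3}{2} M \sigma_r^{2}, \qquad
W \sim -\frac{G M^{2}}{R_V}
\quad\Longrightarrow\quad
M_V \sim \frac{3 \sigma_r^{2} R_V}{G}.
```

With a power-law galaxy density ρ(r) ∝ r^(-γ), the radial velocity dispersion σ_r(r) then follows from the Jeans equation, which is how the cluster observables are tied to the assumed gravity theory.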

Relevance:

80.00%

Publisher:

Abstract:

Infrared selection is a potentially powerful way to identify heavily obscured AGNs missed in even the deepest X-ray surveys. Using a 24 μm-selected sample in GOODS-S, we test the reliability and completeness of three infrared AGN selection methods: (1) IRAC color-color selection, (2) IRAC power-law selection, and (3) IR-excess selection; we also evaluate a number of IR-excess approaches. We find that the vast majority of non-power-law IRAC color-selected AGN candidates in GOODS-S have colors consistent with those of star-forming galaxies. Contamination by star-forming galaxies is most prevalent at low 24 μm flux densities (~100 μJy) and high redshifts (z ~ 2), but the fraction of potential contaminants is still high (~50%) at 500 μJy, the highest flux density probed reliably by our survey. AGN candidates selected via a simple, physically motivated power-law criterion ("power-law galaxies," or PLGs), however, appear to be reliable. We confirm that the IR-excess methods successfully identify a number of AGNs, but we also find that such samples may be significantly contaminated by star-forming galaxies. Adding only the secure Spitzer-selected PLG, color-selected, IR-excess, and radio/IR-selected AGN candidates to the deepest X-ray-selected AGN samples directly increases the number of known X-ray AGNs (84) by 54%-77%, and implies an increase to the number of 24 μm-detected AGNs of 71%-94%. Finally, we show that the fraction of MIR sources dominated by an AGN decreases with decreasing MIR flux density, but only down to f_{24 μm} = 300 μJy. Below this limit, the AGN fraction levels out, indicating that a nonnegligible fraction (~10%) of faint 24 μm sources (the majority of which are missed in the X-ray) are powered not by star formation, but by the central engine. The fraction of all AGNs (regardless of their MIR properties) exceeds 15% at all 24 μm flux densities.
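
The power-law (PLG) criterion mentioned above amounts to a straight-line fit in log flux density versus log frequency across the four IRAC bands, keeping sources with spectral index α ≤ -0.5 (the commonly used PLG cut). A sketch with invented fluxes:

```python
import numpy as np

irac_wavelengths_um = np.array([3.6, 4.5, 5.8, 8.0])
nu = 3e14 / irac_wavelengths_um          # frequency in Hz (c / lambda)

def power_law_alpha(flux_ujy):
    """Spectral index alpha from f_nu ~ nu**alpha across the IRAC bands."""
    alpha, _ = np.polyfit(np.log10(nu), np.log10(flux_ujy), 1)
    return alpha

candidate = np.array([20.0, 32.0, 55.0, 95.0])   # microJy, red (rising) SED
a = power_law_alpha(candidate)
print(f"alpha = {a:.2f}, PLG: {a <= -0.5}")
```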

Relevance:

80.00%

Publisher:

Abstract:

We present a study of the Galactic Center region as a possible source of both secondary gamma-ray and neutrino fluxes from annihilating dark matter. We have studied the gamma-ray flux observed by the High Energy Stereoscopic System (HESS) from the J1745-290 Galactic Center source. The data are well fitted as annihilating dark matter in combination with an astrophysical background. The analysis was performed by means of simulated gamma-ray spectra produced by Monte Carlo event-generator packages. We analyze the differences in the spectra obtained by the various Monte Carlo codes developed so far in particle physics. We show that, within some uncertainty, the HESS data can be fitted as a signal from a heavy dark matter density distribution peaked at the Galactic Center, together with a power-law background whose spectral index is compatible with the Fermi Large Area Telescope (Fermi-LAT) data from the same region. If this kind of dark matter distribution generates the gamma-ray flux observed by HESS, we also expect to observe a neutrino flux. We show prospective results for the observation of secondary neutrinos with the Astronomy with a Neutrino Telescope and Abyss environmental RESearch project (ANTARES), the IceCube Neutrino Observatory (IceCube) and the Cubic Kilometre Neutrino Telescope (KM3NeT). Prospects depend solely on the detector's angular resolution once its effective area and minimum energy threshold are fixed.
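
The generic shape of such a fit is a dark-matter annihilation term plus a power-law background; in standard notation (my paraphrase, not the paper's exact parameterization):

```latex
% Annihilation signal plus power-law background (standard form):
\frac{d\Phi}{dE}
  = \frac{\langle\sigma v\rangle}{8\pi m_{\chi}^{2}}
    \frac{dN_{\gamma}}{dE}
    \int_{\Delta\Omega} d\Omega \int_{\mathrm{l.o.s.}} \rho^{2}(l)\, dl
  \; + \; A\, E^{-\Gamma}.
```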

Relevance:

80.00%

Publisher:

Abstract:

Tropical cyclones are a continuing threat to life and property. Willoughby (2012) found that a Pareto (power-law) cumulative distribution fitted to the most damaging 10% of US hurricane seasons fits their impacts well. Here, we find that damage follows a Pareto distribution because the assets at hazard follow a Zipf distribution, which can be thought of as a Pareto distribution with exponent 1. The Z-CAT model is an idealized hurricane catastrophe model that represents a coastline where populated places with Zipf-distributed assets are randomly scattered and damaged by virtual hurricanes whose sizes and intensities are generated through a Monte Carlo process. The results produce realistic Pareto exponents. The ability of the Z-CAT model to simulate different climate scenarios allowed testing of sensitivities to Maximum Potential Intensity, landfall rates and building-structure vulnerability. The Z-CAT model results demonstrate that a statistically significant difference in damage is found only when the parameter changes are large enough to double the damage.
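
A toy version of the Z-CAT idea (my sketch with invented parameters, not the actual catastrophe model): scatter Zipf-distributed assets along a coastline, damage them with random storms, and estimate the Pareto exponent of the damage tail.

```python
import numpy as np

rng = np.random.default_rng(6)
assets = 1.0 / np.arange(1, 1001)            # Zipf: rank-k place holds ~1/k
coast = rng.permutation(1000)                # scatter places along the coast

damage = []
for _ in range(5000):                        # one virtual season per draw
    center = rng.integers(0, 1000)
    width = rng.integers(1, 50)              # storm footprint, in places
    hit = coast[max(0, center - width):center + width]
    damage.append(assets[hit].sum() * rng.uniform(0.05, 0.5))

tail = np.sort(damage)[-500:]                # most damaging 10% of seasons
alpha = 1.0 + tail.size / np.sum(np.log(tail / tail.min()))
print(f"Pareto exponent of the damage tail: {alpha:.2f}")
```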