45 results for AGGREGATE-SIZE DISTRIBUTIONS
Abstract:
In this paper we analyze the persistence of aggregate real exchange rates (RERs) for a group of EU-15 countries by using sectoral data. The tight relation between aggregate and sectoral persistence recently investigated by Mayoral (2008) allows us to decompose aggregate RER persistence into the persistence of its different subcomponents. We show that the distribution of sectoral persistence is highly heterogeneous and strongly skewed to the right, and that a limited number of sectors are responsible for the high levels of persistence observed at the aggregate level. We use quantile regression to investigate whether the traditional theories proposed to account for the slow reversion to parity (lack of arbitrage due to nontradabilities, or imperfect competition and price stickiness) are able to explain the behavior of the upper quantiles of sectoral persistence. We conclude that pricing to market in the intermediate goods sector, together with price stickiness, has more explanatory power than variables related to the tradability of the goods or their inputs.
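As a schematic illustration only (the paper's data and regressors are not reproduced here; the file and column names below are hypothetical placeholders), the behavior of the upper quantiles of sectoral persistence can be related to candidate explanatory variables with quantile regression in Python:

# Quantile regression of sectoral persistence on candidate explanatory variables.
# File and column names are illustrative placeholders, not the paper's data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sectoral_persistence.csv")  # hypothetical sector-level data

# Fit the same specification at the median and at upper quantiles,
# where the most persistent sectors are concentrated.
for q in (0.50, 0.75, 0.90):
    model = smf.quantreg("persistence ~ nontradability + markup + stickiness", df)
    result = model.fit(q=q)
    print(q, result.params)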
Abstract:
We investigate the transition to synchronization in the Kuramoto model with bimodal distributions of the natural frequencies. Previous studies have concluded that the model exhibits a hysteretic phase transition if the bimodal distribution is close to a unimodal one, due to the shallowness of the central dip. Here we show that proximity to the unimodal-bimodal border does not necessarily imply hysteresis when the width, but not the depth, of the central dip tends to zero. We draw this conclusion from a detailed study of the Kuramoto model with a suitable family of bimodal distributions.
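As a minimal sketch of the setting (the parameters and the specific bimodal family below are illustrative, not those of the paper), the Kuramoto dynamics with bimodal natural frequencies can be simulated by Euler-stepping the mean-field phase equations and tracking the order parameter r:

# Minimal Kuramoto simulation with bimodal natural frequencies
# (two Gaussian peaks at +/- omega0); all parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
N, K, dt, steps = 2000, 1.5, 0.01, 20000
omega0, sigma = 1.0, 0.3          # peak separation and peak width

# bimodal frequencies: half the oscillators around +omega0, half around -omega0
omega = np.concatenate([rng.normal(+omega0, sigma, N // 2),
                        rng.normal(-omega0, sigma, N // 2)])
theta = rng.uniform(0.0, 2.0 * np.pi, N)

for _ in range(steps):
    z = np.exp(1j * theta).mean()                        # complex order parameter r*exp(i*psi)
    r, psi = np.abs(z), np.angle(z)
    theta += dt * (omega + K * r * np.sin(psi - theta))  # mean-field coupling term

print("final order parameter r =", np.abs(np.exp(1j * theta).mean()))

Sweeping the coupling K upward and then downward while recording r is the usual way to check whether the transition is hysteretic for a given width and depth of the central dip.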
Abstract:
We present a real data set of claim amounts in which costs related to damage are recorded separately from those related to medical expenses. Only claims with positive costs are considered here. Two approaches to density estimation are presented: a classical parametric method and a semi-parametric method based on transformation kernel density estimation. We explore the data set with standard univariate methods. We also propose ways to select the bandwidth and transformation parameters in the univariate case based on Bayesian methods. We indicate how to compare the results of alternative methods, both by looking at the shape of the density over its whole domain and by exploring the density estimates in the right tail.
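A minimal sketch of the transformation-kernel idea, assuming a hypothetical file of positive claim costs (the Bayesian selection of bandwidth and transformation parameters proposed in the paper is not reproduced here): estimate the density of the log-transformed costs and map it back to the original scale through the Jacobian.

# Transformation kernel density estimate for positive claim costs:
# work on y = log(x), then back-transform with the Jacobian 1/x.
import numpy as np
from scipy.stats import gaussian_kde

costs = np.loadtxt("claim_costs.txt")      # hypothetical positive claim amounts
logkde = gaussian_kde(np.log(costs))       # bandwidth here set by Scott's rule

def density(x):
    x = np.asarray(x, dtype=float)
    return logkde(np.log(x)) / x           # f_X(x) = f_Y(log x) / x

grid = np.linspace(costs.min(), costs.max(), 200)
print(density(grid)[:5])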
Abstract:
We introduce a model of redistributive income taxation and public expenditure. This joint treatment permits analyzing the interdependencies between the two policies: one cannot be chosen independently of the other. Empirical evidence reveals that partisan confrontation essentially falls on expenditure policies rather than on income taxation. We examine the case in which the expenditure policy (or the size of government) is chosen by majority voting and income taxation is consistently adjusted. This adjustment consists of designing the income tax schedule that, given the expenditure policy, achieves consensus among the population. The model determines the consensus income tax schedule, the composition of public expenditure and the size of government. The main results are that inequality is negatively related to the size of government and to the pro-rich bias in public expenditure, and positively or negatively related to the marginal income tax, depending on the substitutability between government-supplied and market goods. These implications are validated using OECD data.
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
The availability of rich firm-level data sets has recently led researchers to uncover new evidence on the effects of trade liberalization. First, trade openness forces the least productive firms to exit the market. Second, it induces surviving firms to increase their innovation efforts, and third, it increases the degree of product market competition. In this paper we propose a model aimed at providing a coherent interpretation of these findings. We introduce firm heterogeneity into an innovation-driven growth model in which incumbent firms operating in oligopolistic industries perform cost-reducing innovations. In this framework, trade liberalization leads to higher product market competition, lower markups and higher quantities produced. These changes in markups and quantities, in turn, promote innovation and productivity growth through a direct competition effect, based on the increase in the size of the market, and a selection effect, produced by the reallocation of resources towards more productive firms. Calibrated to match US aggregate and firm-level statistics, the model predicts that a 10 percent reduction in variable trade costs reduces markups by 1.15 percent and firm survival probabilities by 1 percent, and induces an increase in productivity growth of about 13 percent. More than 90 percent of the trade-induced growth increase can be attributed to the selection effect.
Abstract:
We develop a mediation model in which firm size is proposed to affect the scale and quality of innovative output through the adoption of different decision styles during the R&D process. The aim of this study is to understand how the internal changes that firms undergo as they evolve from small to larger organizations affect R&D productivity. In so doing, we illuminate the underlying theoretical mechanism affecting two different dimensions of R&D productivity, namely the scale and quality of innovative output, which have not received much attention in the previous literature. Using longitudinal data on Spanish manufacturing firms, we explore the validity of this mediation model. Our results show that as firms grow in size they increasingly emphasize analytical decision making and, consequently, large firms aim for higher-quality innovations while small firms aim for a larger scale of innovative output.
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
Based on the critique of Ahumada et al. (2007, Review of Income and Wealth), we revise existing estimates of the size of the German underground economy. Among other things, it turns out that most of these estimates are untenable and that the tax-pressure-induced size of the German underground economy may be much lower than previously thought. To that extent, German policy makers and lawmakers have been misguided during the last three decades. We therefore introduce the Modified-Cash-Deposit-Ratio (MCDR) approach, which is not subject to this critique, and apply it to Germany for the period 1960 to 2008. JEL: O17, Q41, C22. Keywords: underground economy, shadow economy, cash-deposit ratio, currency demand approach, MIMIC approach.
Abstract:
This paper presents an analysis of motor vehicle insurance claims relating to vehicle damage and to associated medical expenses. We use univariate severity distributions estimated with parametric and non-parametric methods. The methods are implemented using the statistical package R. The parametric analysis is limited to the estimation of normal and lognormal distributions for each of the two claim types. The non-parametric analysis presented involves kernel density estimation. We illustrate the benefits of applying transformations to the data prior to employing kernel-based methods. We use a log-transformation and an optimal transformation, from a class of transformations that produces symmetry in the data. The central aim of this paper is to provide educators with material that can be used in the classroom to teach statistical estimation methods, goodness-of-fit analysis and, importantly, statistical computing in the context of insurance and risk management. To this end, we have included in the Appendix of this paper all the R code used in the analysis, so that readers, both students and educators, can fully explore the techniques described.
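The paper's own code is in R and is given in its Appendix; as a language-neutral illustration of the parametric step, a lognormal fit for one claim type with a Kolmogorov-Smirnov check might look as follows (the file name is a placeholder):

# Fit a lognormal distribution to one claim-type severity sample and
# check the fit with a Kolmogorov-Smirnov test (illustrative only; the
# paper's own analysis is carried out in R, see its Appendix).
import numpy as np
from scipy import stats

claims = np.loadtxt("vehicle_damage_claims.txt")        # hypothetical severities

shape, loc, scale = stats.lognorm.fit(claims, floc=0)   # lognormal MLE
ks = stats.kstest(claims, "lognorm", args=(shape, loc, scale))
print("sigma =", shape, "mu =", np.log(scale), "KS p-value =", ks.pvalue)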
Abstract:
This paper develops a simple model that can be used to analyze the long-term sustainability of the contributory pension system and the steady-state response of pension expenditure to changes in key demographic and economic variables, in the characteristics of the average pensioner, and in the parameters that describe how pensions are calculated in Spain as a function of workers' Social Security contribution histories.
Abstract:
We analyze the statistics of rain-event sizes, rain-event durations, and dry-spell durations in a network of 20 rain gauges scattered over an area close to the NW Mediterranean coast. Power-law distributions emerge clearly for the dry-spell durations, with an exponent around 1.50 ± 0.05, although for event sizes and durations the power-law ranges are rather limited in some cases. Deviations from power-law behavior are attributed to finite-size effects. A scaling analysis helps to elucidate the situation, providing support for the existence of scale invariance in these distributions. It is remarkable that rain data of not very high resolution yield findings in agreement with self-organized critical phenomena.
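A minimal sketch of how a continuous power-law exponent above a lower cutoff can be estimated by maximum likelihood (the cutoff value and the file name below are placeholders, not the paper's choices):

# Maximum-likelihood estimate of a continuous power-law exponent
# for values above a chosen lower cutoff xmin (illustrative only).
import numpy as np

dry_spells = np.loadtxt("dry_spell_durations.txt")   # hypothetical durations
xmin = 1.0                                           # lower cutoff of the power-law range
x = dry_spells[dry_spells >= xmin]

n = x.size
alpha = 1.0 + n / np.sum(np.log(x / xmin))           # MLE of the exponent
stderr = (alpha - 1.0) / np.sqrt(n)                  # asymptotic standard error
print(f"alpha = {alpha:.2f} +/- {stderr:.2f}")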
Abstract:
We investigate the determinants of teamwork and workers' cooperation within the firm. Up to now, the literature has focused almost exclusively on workers' incentives as the main determinant of workers' cooperation. We take a broader look at the firm's organizational design and analyze the impact that different aspects of it might have on cooperation. In particular, we consider the way in which the degree of decentralization of decisions and the use of complementary HRM practices (what we call the firm's vertical organizational design) can affect workers' collaboration with each other. We test the model's predictions on a unique dataset of Spanish small and medium-sized firms containing a rich set of variables that allows us to use sensible proxies for workers' cooperation. We find that the decentralization of labor decisions (and, to a lesser extent, that of task planning) has a positive impact on workers' cooperation. Likewise, cooperation is positively correlated with many of the HRM practices that seem to favor workers' interaction the most. We also confirm the previous finding that collaborative efforts respond positively to pay incentives and, in particular, to group or company incentives.
Abstract:
Report for the scientific sojourn carried out at the University of California at Berkeley, from September to December 2007. Environmental niche modelling (ENM) techniques are powerful tools for predicting species' potential distributions. In the last ten years, a plethora of novel methodological approaches and modelling techniques have been developed. During three months, I stayed at the University of California, Berkeley, working under the supervision of Dr. David R. Vieites. The aim of our work was to quantify the error committed by these techniques, and also to test how an increase in sample size affects the resulting predictions. Using the MaxEnt software, we generated predictive distribution maps, from different sample sizes, of the Eurasian quail (Coturnix coturnix) in the Iberian Peninsula. The quail is a generalist species from a climatic point of view, but a habitat specialist. The resulting distribution maps were compared with the real distribution of the species, which was obtained from recent bird atlases of Spain and Portugal. Results show that ENM techniques can make important errors when predicting the distribution of generalist species. Moreover, an increase in sample size is not necessarily related to better model performance. We conclude that a deep knowledge of the species' biology and of the variables affecting its distribution is crucial for optimal modelling; the lack of this knowledge can lead to wrong conclusions.
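The comparison between predicted maps and the atlas distribution can be summarised, for example, by an AUC score per training sample size; a schematic sketch (MaxEnt itself is not reproduced, and all file names and sample sizes are placeholders):

# Compare predicted habitat-suitability maps (one per training sample size)
# against the observed atlas presence/absence grid using AUC.
# File names and array layouts are illustrative placeholders.
import numpy as np
from sklearn.metrics import roc_auc_score

observed = np.load("atlas_presence.npy").ravel()     # 0/1 presence per grid cell
for n in (10, 25, 50, 100):
    predicted = np.load(f"maxent_suitability_n{n}.npy").ravel()
    print(n, "AUC =", roc_auc_score(observed, predicted))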
Abstract:
Tropical cyclones are affected by a large number of climatic factors, which translates into complex patterns of occurrence. The variability of annual metrics of tropical-cyclone activity has been intensively studied, in particular since the sudden activation of the North Atlantic in the mid-1990s. We first provide a brief overview of previous work by diverse authors on these annual metrics for the North Atlantic basin, where the natural variability of the phenomenon, the existence of trends, the drawbacks of the records, and the influence of global warming have been the subject of interesting debates. Next, we present an alternative approach that does not focus on seasonal features but on the characteristics of single events [Corral et al., Nature Phys. 6, 693 (2010)]. It is argued that the individual-storm power dissipation index (PDI) constitutes a natural way to describe each event and, further, that the PDI statistics yield a robust law for the occurrence of tropical cyclones in the form of a power law. In this context, methods for fitting these distributions are discussed. As an important extension of this work, we introduce a distribution function that models the whole range of the PDI density (excluding incompleteness effects at the smallest values): the gamma distribution, consisting of a power law with an exponential decay at the tail. The characteristic scale of this decay, represented by the cutoff parameter, provides very valuable information on the finite size of the basin, via the largest PDI values that the basin can sustain. We use the gamma fit to evaluate the influence of sea surface temperature (SST) on the occurrence of extreme PDI values, for which we find an increase of around 50% in the values of these basin-wide events for a 0.49 °C average SST difference. Similar findings are observed for the effects of the positive phase of the Atlantic multidecadal oscillation and of the number of hurricanes in a season on the PDI distribution. In the case of the El Niño Southern Oscillation (ENSO), positive and negative values of the multivariate ENSO index do not have a significant effect on the PDI distribution; however, when only extreme values of the index are used, it is found that the presence of El Niño decreases the PDI of the most extreme hurricanes.
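A minimal sketch, under illustrative assumptions (placeholder file name and lower cutoff), of fitting a gamma-type density, i.e. a power law with an exponential tail, to individual-storm PDI values by numerical maximum likelihood:

# Numerical maximum-likelihood fit of a gamma-type density
#   f(x) proportional to x**(-tau) * exp(-x / xi),  for x >= xmin,
# to individual-storm PDI values (file name and xmin are placeholders).
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize

pdi = np.loadtxt("pdi_values.txt")       # hypothetical PDI values
xmin = 1e9                               # lower cutoff excluding incomplete small events
y = pdi[pdi >= xmin] / xmin              # rescale so the fitting range starts at 1

def neg_loglik(params):
    tau, log_xi = params                 # xi expressed in units of xmin
    xi = np.exp(log_xi)
    norm, _ = quad(lambda t: t**(-tau) * np.exp(-t / xi), 1.0, np.inf)
    return -np.sum(-tau * np.log(y) - y / xi - np.log(norm))

res = minimize(neg_loglik, x0=[1.0, np.log(y.mean())], method="Nelder-Mead")
tau_hat, xi_hat = res.x[0], np.exp(res.x[1]) * xmin
print("power-law exponent tau =", tau_hat, "exponential cutoff xi =", xi_hat)

Comparing the fitted cutoff xi across subsets of the data (for example, high-SST versus low-SST seasons) is one way to quantify how external conditions shift the largest events the basin can sustain.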