37 results for minimum wages
Abstract:
This study uses wage data from the UBS Prices and Earnings survey to highlight Disparate Wages in a Globalized World from different perspectives. The wage data are characterised by remarkable consistency over the last 40 years, as well as unusual global comparability. In the first chapter we analyse the convergence hypothesis for purchasing-power-adjusted wages across the world from 1970 to 2009. The results provide solid evidence for the hypotheses of absolute and conditional convergence in real wages, the key driver being faster overall wage growth in lower-wage countries than in higher-wage countries. At the same time, the highest-skilled professions have experienced the highest wage growth, while low-skilled workers’ wages have lagged; thus no convergence in this sense is found between skill groups. In the second chapter we examine deviations of international wages from Factor Price Equalisation (FPE) theory. Following an approach analogous to Engel (1993), we find that deviations from FPE are more likely driven by the higher variability of wages between countries than by the variability of different wages within countries. With regard to the traditional analysis of the real exchange rate and the Balassa-Samuelson assumptions, our analysis points to a larger impact on the real exchange rate likely stemming from movements in the real exchange rate of tradables, and only to a lesser extent from the lack of equalisation of wages within countries. In the third chapter our results show that India’s economic and trade liberalisation, starting in the early 1990s, had very differential impacts on skill premia, both over time and across skill levels. The most striking result is the large increase in wage inequality between high-skilled and low-skilled professions. Both the synthetic control group method and the difference-in-differences (DID) approach suggest that a significant part of this increase in wage inequality can be attributed to India’s liberalisation.
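The abstract does not describe the estimation details. A minimal sketch of how an absolute beta-convergence test of this kind is commonly set up, regressing long-run wage growth on the initial log wage level, might look as follows; the data and all variable names here are synthetic illustrations, not the study's actual data or code.

```python
import numpy as np

# Illustrative beta-convergence test: regress long-run wage growth on the
# initial (log) real wage. A significantly negative slope ("beta") is
# evidence for absolute convergence. All data below are synthetic.
rng = np.random.default_rng(0)
n_countries = 60
log_wage_1970 = rng.normal(2.0, 0.8, n_countries)           # initial log real wage
growth = 0.9 - 0.25 * log_wage_1970 + rng.normal(0, 0.15, n_countries)

X = np.column_stack([np.ones(n_countries), log_wage_1970])  # intercept + regressor
beta, *_ = np.linalg.lstsq(X, growth, rcond=None)           # OLS fit
resid = growth - X @ beta
sigma2 = resid @ resid / (n_countries - 2)
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])         # std. error of slope

print(f"beta = {beta[1]:.3f}, t = {beta[1] / se:.2f}")      # beta < 0 => convergence
```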
Abstract:
Recent findings demonstrate that trees in deserts are efficient carbon sinks. It remains unknown, however, whether the Clean Development Mechanism will accelerate the planting of trees in Non-Annex I dryland countries. We estimated the price of carbon at which a farmer would be indifferent between his customary activity and the planting of trees to trade carbon credits, along an aridity gradient. Carbon yields were simulated by means of the CO2FIX v3.1 model for Pinus halepensis with its respective yield classes along the gradient (Arid, 100 mm, to Dry Sub-Humid conditions, 900 mm). Wheat and pasture yields were predicted with broadly similar nitrogen-based quadratic models, using 30 years of weather data to simulate moisture stress. Stochastic production, input and output prices were then simulated in a Monte Carlo matrix. Results show that, despite the high levels of carbon uptake, carbon trading through afforestation is unprofitable anywhere along the gradient. Indeed, the price of carbon would have to rise unrealistically high, and the certification costs would have to drop significantly, to make the Clean Development Mechanism worthwhile for farmers in Non-Annex I dryland countries. From a government agency's point of view the Clean Development Mechanism is attractive. However, such agencies will find it difficult to demonstrate “additionality”, even if the rule may be applied somewhat flexibly. Based on these findings, we further discuss why the Clean Development Mechanism, a supposedly pro-poor instrument, fails to assist farmers in Non-Annex I dryland countries living at minimum subsistence level.
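The break-even logic behind this result can be sketched in a few lines: the indifference carbon price is the price at which the discounted value of carbon credits, net of certification costs, equals the net present value of the customary activity. The sketch below is a stylized Monte Carlo version of that calculation; every distribution and parameter value is an invented placeholder, not a figure from the study.

```python
import numpy as np

# Stylized Monte Carlo sketch of the indifference ("break-even") carbon price:
# the price at which the NPV of planting trees for carbon credits equals the
# NPV of the customary wheat/pasture activity. All numbers are illustrative.
rng = np.random.default_rng(42)
n_draws, years, r = 10_000, 30, 0.05
discount = (1 + r) ** -np.arange(1, years + 1)

# Stochastic margins and carbon uptake (hypothetical distributions)
wheat_margin = rng.normal(250, 80, (n_draws, years))    # net margin, $/ha/yr
carbon_uptake = rng.normal(3.0, 0.6, (n_draws, years))  # tCO2/ha/yr
certification_cost = 15.0                               # $/ha/yr, assumed fixed

npv_wheat = wheat_margin @ discount
# NPV of afforestation is linear in the carbon price p:
#   npv_trees(p) = p * (uptake @ discount) - certification_cost * sum(discount)
credit_pv = carbon_uptake @ discount
breakeven_price = (npv_wheat + certification_cost * discount.sum()) / credit_pv

print(f"median break-even carbon price: ${np.median(breakeven_price):.0f}/tCO2")
```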
Abstract:
In this article, the realization of a global terrestrial reference system (TRS) based on a consistent combination of Global Navigation Satellite System (GNSS) and Satellite Laser Ranging (SLR) data is studied. Our input data consist of normal equation systems from 17 years (1994–2010) of homogeneously reprocessed GPS, GLONASS and SLR data. This effort used common state-of-the-art reduction models and the same processing software (Bernese GNSS Software) to ensure the highest consistency when combining GNSS and SLR. Residual surface load deformations are modeled with a spherical harmonic approach. The estimated degree-1 surface load coefficients have a strong annual signal, for which the GNSS- and SLR-only solutions show very similar results. A combination including these coefficients reduces systematic uncertainties in comparison to the single-technique solutions. In particular, uncertainties due to solar radiation pressure modeling in the coefficient time series can be reduced by up to 50 % in the GNSS+SLR solution compared to the GNSS-only solution. In contrast to the ITRF2008 realization, no local ties are used to combine the different geodetic techniques. We combine the pole coordinates as global ties and apply minimum constraints to define the geodetic datum. We show that a common origin, scale and orientation can be reliably realized from our combination strategy in comparison to the ITRF2008.
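The actual combination is performed in the Bernese GNSS Software, but the linear algebra behind an inter-technique combination at the normal-equation level, stacking systems over shared parameters and adding minimum constraints to fix the datum, can be illustrated in miniature. The matrices below are random stand-ins, not real geodetic normal equations.

```python
import numpy as np

# Minimal sketch of combination at the normal-equation level: two techniques
# contribute observations of the same parameter vector (e.g., pole coordinates
# acting as "global ties"), so their normal equations simply add.
rng = np.random.default_rng(1)
n_par = 6
A1, A2 = rng.normal(size=(100, n_par)), rng.normal(size=(80, n_par))
y1, y2 = rng.normal(size=100), rng.normal(size=80)

N = A1.T @ A1 + A2.T @ A2   # stacked normal matrix (common parameters)
b = A1.T @ y1 + A2.T @ y2   # stacked right-hand side

# Minimum constraints: add B @ B.T, where the columns of B span the
# datum-defect directions, so the datum is defined without distorting the
# inner geometry of the solution. B here is a random placeholder.
B = rng.normal(size=(n_par, 2))
x = np.linalg.solve(N + B @ B.T, b)
print(x)
```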
Abstract:
Charcoal particles in pollen slides are often abundant, and analysts therefore face the problem of setting the minimum counting sum as small as possible in order to save time. We analysed the reliability of charcoal-concentration estimates based on different counting sums, using simulated low- to high-count samples. Bootstrap simulations indicate that the variability of inferred charcoal concentrations increases progressively with decreasing sums. Below 200 items (i.e., the sum of charcoal particles and exotic marker grains), reconstructed fire incidence is either too high or too low. Statistical comparisons show that the means of the bootstrap simulations stabilize after 200 counts. Moreover, a count of 200-300 items is sufficient to produce a charcoal-concentration estimate with less than ±5% error compared with high-count samples of 1000 items, for charcoal/marker-grain ratios between 0.1 and 0.91. If, however, this ratio is extremely high or low (> 0.91 or < 0.1) and such samples are frequent, we suggest reducing or adding marker grains prior to new sample processing.
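The core of this resampling argument is easy to reproduce: as the counting sum grows, the spread of the inferred charcoal proportion shrinks, and it stabilizes around a few hundred items. The following sketch assumes a simple binomial counting model and illustrative parameter values; it is not the authors' simulation code.

```python
import numpy as np

# Bootstrap-style sketch: variability of the inferred charcoal proportion as a
# function of the counting sum. Counts of different sizes are resampled from a
# pool with a known true proportion, and the relative spread of the estimate
# shrinks as the sum grows. All parameters are illustrative.
rng = np.random.default_rng(7)
true_ratio = 0.5                      # charcoal / (charcoal + marker grains)
for count_sum in (50, 100, 200, 500, 1000):
    charcoal = rng.binomial(count_sum, true_ratio, size=5000)
    ratio = charcoal / count_sum
    print(f"sum={count_sum:5d}  mean={ratio.mean():.3f}  "
          f"rel. spread (CV)={ratio.std() / ratio.mean():.3f}")
```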
Abstract:
Throughout the last millennium, mankind was affected by prolonged deviations from the mean climate state. While periods like the Maunder Minimum in the 17th century have been assessed in great detail, earlier cold periods such as those of the 15th century have received much less attention due to the sparse information available. Based on new evidence from different sources, ranging from proxy archives to model simulations, it is now possible to provide an end-to-end assessment of the climate state during an exceptionally cold period in the 15th century, the role of internal, unforced climate variability and external forcing in shaping these extreme climatic conditions, and the impacts on, and responses of, medieval society in Central Europe. Climate reconstructions from a multitude of natural and human archives indicate that, during winter, the period of the early Spörer Minimum (1431–1440 CE) was the coldest decade in Central Europe in the 15th century. The particularly cold winters and normal but wet summers resulted in a strong seasonal cycle that challenged food production and led to rising food prices, a subsistence crisis, and famine in parts of Europe. As a consequence, authorities implemented adaptation measures, such as the installation of grain storage capacities, in order to be prepared for future events. The 15th century is characterised by a grand solar minimum and enhanced volcanic activity, both of which imply a reduction in seasonality. Climate model simulations show that periods with cold winters and strong seasonality are associated with internal climate variability rather than external forcing. Accordingly, it is hypothesised that the reconstructed extreme climatic conditions of this decade occurred by chance, as a result of the partly chaotic internal variability of the climate system.