Abstract:
This paper is a new element in a series of studies analyzing macroeconomic inventory behavior using multi-country data. Seven hypotheses are tested, all with positive results. They concern, among other topics, the relationship of inventories to growth and to other macroeconomic indicators of the use of GDP, as well as long-term tendencies in global inventory formation. Multivariate statistical analysis is used for the evaluation.
Abstract:
This paper focuses on the operational drivers of labour productivity changes. We consider two sets of drivers: (a) current working practices and (b) changes in working practices through management programs. The relationship between these two sets of drivers and productivity changes is analysed. We also investigate the importance of productivity growth by looking at the impact of labour productivity changes on business performance changes. Finally, the moderating effects of industry and country on the use of drivers of productivity changes are examined. Data from an international survey, IMSS-IV, are used for the analysis.
Abstract:
Léon Walras (1874) had already realized that his neo-classical general equilibrium model could not accommodate autonomous investment. Sen analysed the same issue in a simple, one-sector macroeconomic model of a closed economy. He showed that fixing investment in a model built strictly on neo-classical assumptions would make the system overdetermined; thus, one should relax some neo-classical condition of competitive equilibrium. He analysed three non-neo-classical “closure options” that could make the model well-determined in the case of fixed investment. Others later extended his list and showed that the closure dilemma arises in more complex computable general equilibrium (CGE) models as well, as does the choice of the adjustment mechanism assumed to bring about equilibrium at the macro level. By means of numerical models, it was also illustrated that the adopted closure rule can significantly affect the results of policy simulations based on a CGE model. Despite these warnings, the issue of macro closure is often neglected in policy simulations. It is, therefore, worth revisiting the issue, demonstrating its importance by further examples, and pointing out that the closure problem in CGE models extends well beyond the question of how to incorporate autonomous investment into a CGE model. Several closure rules are discussed in this paper and their diverse outcomes are illustrated by numerical models calibrated on statistical data. First, the analysis is carried out in a one-sector model, similar to Sen's, but extended into a model of an open economy. Next, the same analyses are repeated using a fully fledged multisectoral CGE model calibrated on the same statistical data.
A comparison of the results obtained with the two models shows that, although under the same closure option they generate quite similar results in terms of the direction and, to a somewhat lesser extent, the magnitude of change in the main macro variables, the predictions of the multi-sectoral CGE model are clearly more realistic and balanced.
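The closure dilemma described above can be illustrated with a minimal savings-investment sketch; the numbers and the three closure labels below are a hypothetical toy, not the paper's calibrated models.

```python
# Toy one-sector sketch of the closure dilemma (hypothetical numbers).
# Equilibrium requires savings S = s * Y to equal investment I.
Y_full = 100.0   # full-employment output (the neo-classical Y*)
s = 0.2          # fixed household savings rate
I = 25.0         # exogenously fixed (autonomous) investment

# With both Y and s fixed, S = 20 while I = 25: the system is overdetermined.
overdetermined = abs(s * Y_full - I) > 1e-9

# Keynesian closure: output adjusts until savings match investment.
Y_keynes = I / s

# Kaldorian closure: the savings rate adjusts at full employment.
s_kaldor = I / Y_full

# Johansen closure: investment adjusts residually to available savings.
I_johansen = s * Y_full
```

Each closure relaxes a different equation of the same system, which is why policy simulations run under different closures can diverge.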
Abstract:
This dissertation explores three aspects of the economics and policy issues surrounding retail payments (low-value frequent payments): the microeconomic aspect, by measuring costs associated with retail payment instruments; the macroeconomic aspect, by quantifying the impact of the use of electronic rather than paper-based payment instruments on consumption and GDP; and the policy aspect, by identifying barriers that keep countries stuck with outdated payment systems, and recommending policy interventions to move forward with payments modernization. Payment system modernization has become a prominent part of the financial sector reform agenda in many advanced and developing countries. Greater use of electronic payments rather than cash and other paper-based instruments would have important economic and social benefits, including lower costs and thereby increased economic efficiency and higher incomes, while broadening access to the financial system, notably for people with moderate and low incomes. The dissertation starts with a general introduction on retail payments. Chapter 1 develops a theoretical model for measuring payments costs, and applies the model to Guyana—an emerging market in the midst of the transition from paper to electronic payments. Using primary survey data from Guyanese consumers, the results of the analysis indicate that annual costs related to the use of cash by consumers reach 2.5 percent of the country's GDP. Switching to electronic payment instruments would provide savings amounting to 1 percent of GDP per year. Chapter 2 broadens the analysis to calculate the macroeconomic impacts of a move to electronic payments. Using a unique panel dataset of 76 countries across the 17-year span from 1998 to 2014 and a pooled OLS country fixed effects model, Chapter 2 finds that, on average, the use of debit and credit cards contributes USD 16.2 billion to annual global consumption, and USD 160 billion to overall annual global GDP.
Chapter 3 provides an in-depth assessment of the Albanian payment cards and remittances market and recommends a set of incentives and regulations (both carrots and sticks) that would allow the country to modernize its payment system. Finally, the conclusion summarizes the lessons of the dissertation’s research and brings forward issues to be explored by future research in the retail payments area.
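The pooled OLS country fixed-effects specification mentioned in Chapter 2 can be sketched on synthetic data; the panel below (5 countries, 10 years, a "card usage" regressor) is a hypothetical stand-in for the dissertation's 76-country dataset, not its actual estimation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_countries, n_years = 5, 10
country = np.repeat(np.arange(n_countries), n_years)

# Synthetic panel: card usage x raises consumption y, plus a country effect.
alpha = rng.normal(0.0, 1.0, n_countries)            # country fixed effects
x = rng.normal(0.0, 1.0, n_countries * n_years)      # card usage (standardized)
y = alpha[country] + 0.5 * x + rng.normal(0.0, 0.1, x.size)

# Pooled OLS with a full set of country dummies (fixed-effects estimator).
D = (country[:, None] == np.arange(n_countries)[None, :]).astype(float)
X = np.column_stack([x, D])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
# beta[0] recovers the effect of card usage net of country heterogeneity.
```

The country dummies absorb time-invariant national characteristics, so the slope on card usage is identified from within-country variation.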
Abstract:
This dissertation describes two studies on macroeconomic trends and cycles. The first chapter studies the impact of Information Technology (IT) on the U.S. labor market. Over the past 30 years, employment and income shares of routine-intensive occupations have declined significantly relative to nonroutine occupations, and the overall U.S. labor income share has declined relative to capital. Furthermore, the decline of routine employment has been largely concentrated during recessions and ensuing recoveries. I build a model of unbalanced growth to assess the role of computerization and IT in driving these labor market trends and cycles. I augment a neoclassical growth model with exogenous IT progress as a form of Routine-Biased Technological Change (RBTC). I show analytically that RBTC causes the overall labor income share to follow a U-shaped time path, as the monotonic decline of routine labor share is increasingly offset by the monotonic rise of nonroutine labor share and the elasticity of substitution between the overall labor and capital declines under IT progress. Quantitatively, the model explains nearly all the divergence between routine and nonroutine labor in the period 1986-2014, as well as the mild decline of the overall labor share between 1986 and the early 2000s. However, the model with IT progress alone cannot explain the accelerated decline of labor income share after the early 2000s, suggesting that other factors, such as globalization, may have played a larger role in this period. Lastly, when nonconvex labor adjustment costs are present, the model generates a stepwise decline in routine labor hours, qualitatively consistent with the data. The timing of these trend adjustments can be significantly affected by aggregate productivity shocks and concentrated in recessions. The second chapter studies the implications of loss aversion on the business cycle dynamics of aggregate consumption and labor hours. 
Loss aversion refers to the fact that people are markedly more sensitive to losses than to gains. Loss-averse agents are very risk averse around the reference point and exhibit asymmetric responses to positive and negative income shocks. In an otherwise standard Real Business Cycle (RBC) model, I study loss aversion in consumption alone and in consumption and leisure together. My results indicate that how loss aversion affects business cycle dynamics depends critically on the nature of the reference point. If, for example, the reference point is the status quo, loss aversion dramatically lowers the effective intertemporal rate of substitution and induces excessive consumption smoothing. In contrast, if the reference point is fixed at a constant level, loss aversion generates a flat region in the decision rules and asymmetric impulse responses to technology shocks. Under a reasonable parametrization, loss aversion has the potential to generate asymmetric business cycles with deeper and more prolonged recessions.
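The asymmetry around the reference point can be sketched as a piecewise gain-loss utility; the coefficient lam = 2.25 is a common calibration from the behavioral literature, assumed here for illustration rather than taken from this chapter.

```python
def gain_loss_utility(c, ref, lam=2.25):
    """Piecewise-linear gain-loss utility: losses are weighted lam times gains."""
    gap = c - ref
    return gap if gap >= 0 else lam * gap

# A loss of 1 relative to the reference hurts more than a gain of 1 helps.
u_gain = gain_loss_utility(11.0, 10.0)
u_loss = gain_loss_utility(9.0, 10.0)
```

The kink at gap = 0 is what can generate flat regions in decision rules and asymmetric responses to positive versus negative shocks.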
Abstract:
Doctorate in Economics
Abstract:
Using annual data from 14 European Union countries, plus Canada, Japan and the United States, we evaluate the macroeconomic effects of public and private investment through VAR analysis. From impulse response functions, we are able to assess the extent of crowding-in or crowding-out of both components of investment. We also compute the associated macroeconomic rates of return of public and private investment for each country. The results point mostly to the existence of positive effects of public investment and private investment on output. On the other hand, the crowding-in effects of public investment on private investment vary across countries, while the crowding-in effect of private investment on public investment is more generalised.
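The crowding-in/crowding-out reading of impulse responses can be sketched for a stylized VAR(1); the coefficient matrix below is a hypothetical illustration, not an estimate from the paper's 17-country data.

```python
import numpy as np

# Hypothetical reduced-form VAR(1) for (public investment, private investment, output).
A = np.array([[0.5, 0.1, 0.0],
              [0.2, 0.4, 0.1],
              [0.3, 0.2, 0.6]])

def impulse_response(A, shock, horizon):
    """Responses of all variables to a one-time unit shock, up to `horizon` steps."""
    resp = [shock]
    for _ in range(horizon):
        resp.append(A @ resp[-1])
    return np.array(resp)

# Unit shock to public investment (first variable):
irf = impulse_response(A, np.array([1.0, 0.0, 0.0]), 20)
cumulative_output_effect = irf[:, 2].sum()   # positive sum suggests crowding-in
```

Summing the output responses over the horizon is one simple way to gauge whether a public investment shock crowds other activity in or out.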
Abstract:
We examine the efficiency of multivariate macroeconomic forecasts by estimating a vector autoregressive model on the forecast revisions of four variables (GDP, inflation, unemployment and wages). Using a data set of professional forecasts for the G7 countries, we find evidence of cross-series revision dynamics. Specifically, forecast revisions are conditionally correlated with the lagged forecast revisions of other macroeconomic variables, and the sign of the correlation is as predicted by conventional economic theory. This indicates that forecasters are slow to incorporate news across variables. We show that this finding can be explained by forecast underreaction.
Abstract:
Contingent protection has grown to become an important trade-restricting device. In the European Union, protection instruments such as antidumping are used extensively. This paper analyses whether macroeconomic pressures may help explain the variation in the intensity of antidumping protectionism in the EU. The empirical analysis uses count data models, applying various specification tests to derive the most appropriate specification. Our results suggest that filing activity is inversely related to macroeconomic conditions. Moreover, they confirm existing evidence for the US suggesting that domestic macroeconomic pressures are a more important determinant of contingent protection policy than external pressures.
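A count data specification of the kind used here can be sketched as a Poisson regression fitted by Newton-Raphson; the data, variable names and coefficients below are synthetic assumptions, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
gdp_growth = rng.normal(2.0, 1.0, n)                 # proxy for macro conditions
# Filings fall when growth is high: log-rate = 1.0 - 0.3 * growth.
filings = rng.poisson(np.exp(1.0 - 0.3 * gdp_growth))

# Poisson regression via Newton-Raphson on the log-likelihood.
X = np.column_stack([np.ones(n), gdp_growth])
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)
    grad = X.T @ (filings - mu)          # score of the Poisson log-likelihood
    hess = X.T @ (X * mu[:, None])       # observed information
    beta += np.linalg.solve(hess, grad)
# A negative beta[1] reproduces the inverse relation to macro conditions.
```

In practice one would also compare Poisson against negative binomial or zero-inflated variants via specification tests, as the paper does.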
Abstract:
High-throughput screening of physical, genetic and chemical-genetic interactions brings important perspectives to the Systems Biology field, as the analysis of these interactions provides new insights into protein/gene function, cellular metabolic variations and the validation of therapeutic targets and drug design. However, such analysis depends on a pipeline connecting different tools that can automatically integrate data from diverse sources and produce a more comprehensive dataset that can be properly interpreted. We describe here the Integrated Interactome System (IIS), an integrative platform with a web-based interface for the annotation, analysis and visualization of the interaction profiles of proteins/genes, metabolites and drugs of interest. IIS works in four connected modules: (i) the Submission module, which receives raw data derived from Sanger sequencing (e.g. two-hybrid system); (ii) the Search module, which enables the user to search for the processed reads to be assembled into contigs/singlets, or for lists of proteins/genes, metabolites and drugs of interest, and add them to the project; (iii) the Annotation module, which assigns annotations from several databases to the contigs/singlets or lists of proteins/genes, generating tables with automatic annotation that can be manually curated; and (iv) the Interactome module, which maps the contigs/singlets or the uploaded lists to entries in our integrated database, building networks that gather novel identified interactions, protein and metabolite expression/concentration levels, subcellular localization, computed topological metrics, and GO biological processes and KEGG pathways enrichment. This module generates an XGMML file that can be imported into Cytoscape or visualized directly on the web. We have developed IIS by integrating diverse databases, in response to the need for appropriate tools for a systematic analysis of physical, genetic and chemical-genetic interactions.
IIS was validated with yeast two-hybrid, proteomics and metabolomics datasets, but it is also extendable to other datasets. IIS is freely available online at: http://www.lge.ibi.unicamp.br/lnbio/IIS/.
Abstract:
The article investigates patterns of performance in grip strength, gait speed and self-rated health, and the relationships between them, considering the variables of gender, age and family income. The study was conducted in a probabilistic sample of community-dwelling elderly people aged 65 and over, members of a population study on frailty. A total of 689 elderly people without cognitive deficit suggestive of dementia underwent tests of gait speed and grip strength. Comparisons between groups were based on low, medium and high speed and strength. Self-rated health was assessed using a 5-point scale. Men and the younger elderly individuals scored significantly higher on grip strength and gait speed than women and the oldest did; the richest scored higher than the poorest on grip strength and gait speed; women and men aged over 80 had weaker grip strength and lower gait speed; slow gait speed and low income emerged as risk factors for a worse health evaluation. Lower muscular strength affects the self-rated assessment of health because it results in a reduction in functional capacity, especially in the presence of poverty and a lack of compensatory factors.
Abstract:
Obstructive sleep apnea syndrome has a high prevalence among adults, and cephalometric variables can be a valuable method for evaluating patients with this syndrome. The aim was to correlate cephalometric data with the apnea-hypopnea index. We performed a retrospective, cross-sectional study that analyzed the cephalometric data of patients followed in the Sleep Disorders Outpatient Clinic of the Discipline of Otorhinolaryngology of a university hospital, from June 2007 to May 2012. Ninety-six patients were included, 45 men and 51 women, with a mean age of 50.3 years. A total of 11 patients had snoring, 20 had mild apnea, 26 had moderate apnea, and 39 had severe apnea. The distance from the hyoid bone to the mandibular plane was the only variable that showed a statistically significant correlation with the apnea-hypopnea index. Cephalometric variables are useful tools for understanding obstructive sleep apnea syndrome, and the hyoid-to-mandibular-plane distance in particular correlated significantly with the apnea-hypopnea index.
Abstract:
In acquired immunodeficiency syndrome (AIDS) studies it is quite common to observe viral load measurements collected irregularly over time. Moreover, these measurements can be subjected to some upper and/or lower detection limits depending on the quantification assays. A complication arises when these continuous repeated measures have a heavy-tailed behavior. For such data structures, we propose a robust structure for a censored linear model based on the multivariate Student's t-distribution. To compensate for the autocorrelation existing among irregularly observed measures, a damped exponential correlation structure is employed. An efficient expectation maximization type algorithm is developed for computing the maximum likelihood estimates, obtaining as a by-product the standard errors of the fixed effects and the log-likelihood function. The proposed algorithm uses closed-form expressions at the E-step that rely on formulas for the mean and variance of a truncated multivariate Student's t-distribution. The methodology is illustrated through an application to a Human Immunodeficiency Virus-AIDS (HIV-AIDS) study and several simulation studies.
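The damped exponential correlation structure used for the irregularly spaced measures can be sketched directly; the visit times, rho and theta below are illustrative values, not the paper's estimates.

```python
import numpy as np

def dec_matrix(times, rho, theta):
    """Damped exponential correlation: corr(y_j, y_k) = rho ** (|t_j - t_k| ** theta)."""
    t = np.asarray(times, dtype=float)
    lags = np.abs(t[:, None] - t[None, :])
    return rho ** (lags ** theta)

# Irregular visit times, as with the viral load measurements:
R = dec_matrix([0.0, 0.5, 2.0, 4.5], rho=0.6, theta=0.8)
# Nearby visits are more strongly correlated than distant ones.
```

Setting theta = 1 recovers the continuous-time AR(1) structure, so the damping parameter lets the correlation decay faster or slower than exponentially.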
Abstract:
To assess the completeness and reliability of the data in the Information System on Live Births (Sinasc). A cross-sectional analysis of the reliability and completeness of Sinasc data was performed using a sample of Live Birth Certificates (LBCs) from 2009 relating to births in Campinas, Southeast Brazil. For data analysis, hospitals were grouped according to category of service (Unified National Health System, private, or both); 600 LBCs were randomly selected and the data were collected in LBC copies through mothers' and newborns' hospital records and by telephone interviews. The completeness of the LBCs was evaluated by calculating the percentage of blank fields, and agreement between the original LBCs and the copies was evaluated using Kappa and intraclass correlation coefficients. The percentage of completeness of the LBCs ranged from 99.8% to 100%. For most items, agreement was excellent. However, agreement was acceptable for marital status, maternal education and newborn infants' race/color, low for prenatal visits and presence of birth defects, and very low for the number of deceased children. The results showed that the municipal Sinasc is reliable for most of the studied variables. Investment in the training of professionals is suggested in an attempt to improve the system's capacity to support the planning and implementation of health activities for the benefit of the maternal and child population.
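The Kappa agreement statistic used to compare the original certificates with the copies can be sketched from first principles; the two codings below are hypothetical examples, not study data.

```python
import numpy as np

def cohens_kappa(a, b):
    """Chance-corrected agreement between two categorical codings."""
    a, b = np.asarray(a), np.asarray(b)
    categories = np.union1d(a, b)
    po = np.mean(a == b)                                  # observed agreement
    pa = np.array([np.mean(a == c) for c in categories])
    pb = np.array([np.mean(b == c) for c in categories])
    pe = float(pa @ pb)                                   # agreement expected by chance
    return (po - pe) / (1.0 - pe)

# Original certificate field vs. the re-collected copy (hypothetical codes):
original = [1, 1, 2, 2, 3, 3, 1, 2]
recollected = [1, 1, 2, 2, 3, 1, 1, 2]
kappa = cohens_kappa(original, recollected)
```

Kappa near 1 indicates excellent agreement, while values near 0 mean the raters agree no more than chance would predict.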
Abstract:
Often in biomedical research we deal with continuous (clustered) proportion responses, ranging between zero and one, that quantify the disease status of the cluster units. Interestingly, the study population might also consist of relatively disease-free as well as highly diseased subjects, contributing proportion values in the closed interval [0, 1]. Regression on a variety of parametric densities with support in (0, 1), such as beta regression, can assess important covariate effects. However, these densities are deemed inappropriate in the presence of zeros and/or ones. To circumvent this, we introduce a class of general proportion densities and further augment them with point probabilities at zero and one, controlling for the clustering. Our approach is Bayesian and presents a computationally convenient framework amenable to available freeware. Bayesian case-deletion influence diagnostics based on q-divergence measures are automatic from the Markov chain Monte Carlo output. The methodology is illustrated using both simulation studies and an application to a real dataset from a clinical periodontology study.
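The augmented density can be sketched as a three-part mixture, with point masses at zero and one and a continuous density in between; a beta density stands in for the general proportion density here, and all parameters are illustrative.

```python
from math import gamma

def beta_pdf(y, a, b):
    """Beta density on (0, 1)."""
    return gamma(a + b) / (gamma(a) * gamma(b)) * y ** (a - 1) * (1 - y) ** (b - 1)

def zoab_density(y, p0, p1, a, b):
    """Zero-one augmented density: masses p0 at 0 and p1 at 1, beta body on (0, 1)."""
    if y == 0.0:
        return p0
    if y == 1.0:
        return p1
    return (1.0 - p0 - p1) * beta_pdf(y, a, b)
```

Because the continuous part is rescaled by 1 - p0 - p1, the point masses and the beta body together integrate to one over [0, 1].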