879 results for General allocation model
Abstract:
Constant technological advances have caused a data explosion in recent years. Accordingly, modern statistical and machine learning methods must be adapted to deal with complex and heterogeneous data types. This is particularly true for analyzing biological data. For example, DNA sequence data can be viewed as categorical variables, with each nucleotide taking one of four categories. Gene expression data, depending on the quantification technology, can be continuous measurements or counts. With the advancement of high-throughput technology, such data have become unprecedentedly rich. Efficient statistical approaches are therefore crucial in this big-data era.
Previous statistical methods for big data often aim to find low-dimensional structures in the observed data. For example, a factor analysis model assumes a latent Gaussian-distributed multivariate vector; under this assumption, a factor model produces a low-rank estimate of the covariance of the observed variables. Another example is the latent Dirichlet allocation model for documents, which assumes that the mixture proportions of topics are represented by a Dirichlet-distributed variable. This dissertation proposes several novel extensions of these statistical methods, developed to address challenges in big data. The new methods are applied in multiple real-world applications, including the construction of condition-specific gene co-expression networks, the estimation of shared topics among newsgroups, the analysis of promoter sequences, the analysis of political-economic risk data, and the estimation of population structure from genotype data.
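As a compact sketch of the low-rank structure referred to above (notation mine, not the dissertation's), a Gaussian factor model with $k \ll p$ factors implies

```latex
% p observed variables, k latent factors, k << p
x = \Lambda f + \varepsilon, \qquad
f \sim \mathcal{N}(0, I_k), \quad
\varepsilon \sim \mathcal{N}(0, \Psi), \qquad
\operatorname{Cov}(x) = \Lambda\Lambda^{\top} + \Psi,
```

where $\Lambda$ is a $p \times k$ loading matrix and $\Psi$ is diagonal, so the estimated covariance is a rank-$k$ component plus independent noise.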
Abstract:
The purpose of this dissertation is to examine three distributional issues in macroeconomics. First, I explore the effects of fiscal federalism on economic growth across regions in China. Using a comprehensive official data set covering 31 Chinese regions from 1952 to 1999, I investigate a number of indicators used in the literature to measure federalism and find robust support for only one such measure: the ratio of local total revenue to local tax revenue. Using a difference-in-differences approach and exploiting the two-year gap in the implementation of a tax reform across different regions of China, I also identify a positive relationship between fiscal federalism and regional economic growth. The second paper hypothesizes that an inequitable distribution of income negatively affects the rule of law in resource-rich economies and provides robust evidence in support of this hypothesis. Investigating a data set that covers 193 countries and using econometric methodologies such as the fixed-effects estimator and the generalized method of moments estimator, I find that resource abundance improves the quality of institutions as long as income and wealth disparity remains below a certain threshold. When inequality moves beyond this threshold, the positive effects of resource abundance on institutions diminish quickly and eventually turn negative. This paper thus provides robust evidence on the endogeneity of institutions and the role income and wealth inequality plays in the determination of long-run growth rates. The third paper sets up a dynamic general equilibrium model with heterogeneous agents to investigate the causal channels that run from a concern for international status to long-run economic growth. The simulation results show that the initial distribution of income and wealth plays an important role in whether agents gain or lose from globalization.
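A generic two-way fixed-effects difference-in-differences specification of the kind described, in my own notation (the dissertation's exact controls are not given here), would be

```latex
y_{rt} = \alpha_r + \gamma_t + \beta\,(\mathrm{Reform}_r \times \mathrm{Post}_{rt}) + X_{rt}'\delta + \varepsilon_{rt},
```

where $\mathrm{Post}_{rt}$ switches on once region $r$ has implemented the tax reform; the two-year gap in implementation across regions identifies $\beta$, the effect of fiscal federalism on regional growth.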
Abstract:
Improving the representation of the hydrological cycle in Atmospheric General Circulation Models (AGCMs) is one of the main challenges in modeling the Earth's climate system. One way to evaluate model performance is to simulate the transport of water isotopes. Among those available, tritium (HTO) is an extremely valuable tracer because its content in the different reservoirs involved in the water cycle (stratosphere, troposphere, ocean) varies by orders of magnitude. Previous work incorporated natural tritium into LMDZ-iso, a version of the LMDZ general circulation model enhanced with water-isotope diagnostics. Here, for the first time, the anthropogenic tritium injected by each of the atmospheric nuclear-bomb tests between 1945 and 1980 has been estimated and implemented in the model; this creates an opportunity to evaluate certain aspects of LMDZ over several decades by following the bomb-tritium transient signal through the hydrological cycle. Simulations of tritium in water vapor and precipitation for the period 1950-2008, with both natural and anthropogenic components, are presented in this study. LMDZ-iso satisfactorily reproduces the general shape of the temporal evolution of tritium. However, it simulates too high a bomb-tritium peak followed by too strong a decrease of tritium in precipitation. The overly diffusive vertical advection in AGCMs crucially affects the residence time of tritium in the stratosphere. This insight into model performance demonstrates that implementing tritium in an AGCM provides a new and valuable test of modeled atmospheric transport, complementing stable water isotope modeling.
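The residence-time argument can be made concrete with a one-box sketch (my simplification, not the LMDZ-iso transport scheme): the stratospheric tritium burden $C$ evolves as

```latex
\frac{dC}{dt} = S(t) - \left(\frac{1}{\tau} + \lambda\right) C,
\qquad \lambda = \frac{\ln 2}{12.32\ \mathrm{yr}},
```

where $S(t)$ is the bomb-test injection, $\tau$ the stratospheric residence time against exchange with the troposphere, and $\lambda$ the radioactive decay constant of tritium; overly diffusive vertical advection effectively shortens $\tau$, producing a too-strong post-peak decline in precipitation tritium.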
Abstract:
During the SINOPS project, a state-of-the-art simulation of the marine silicon cycle is attempted, employing a biogeochemical ocean general circulation model (BOGCM) at three particular time steps relevant for global (paleo-)climate. To tune the model optimally, the simulation results are compared to a comprehensive data set of 'real' observations. SINOPS' scientific data management ensures that the data structure remains homogeneous throughout the project. The practical work routine comprises systematic progress from data acquisition through preparation, processing, quality checking, and archiving, up to the presentation of data to the scientific community. Meta-information and analytical data are mapped by an n-dimensional catalogue in order to itemize the analytical value and to serve as an unambiguous identifier. In practice, data management is carried out by means of the online-accessible information system PANGAEA, which offers a tool set comprising a data warehouse, Geographic Information System (GIS), 2-D plots, cross-section plots, etc., and whose multidimensional data model promotes scientific data mining. Beyond the scientific and technical aspects, this alliance between the scientific project team and the data management crew serves to integrate the participants and allows them to gain mutual respect and appreciation.
Abstract:
International tourism is considered an effective means of economic development. However, the effects of tourism are not evenly distributed between rural and urban households in China. In the wake of significant socioeconomic events, the uneven distribution of these economic effects has large welfare implications for Chinese households. This study is the first attempt to evaluate the distributional effect of two large, recent, sequential events on China's rural and urban households. It adopts an innovative approach that combines an econometric model with a two-household computable general equilibrium model. The results show that, in terms of welfare, urban households were more adversely affected by the events than rural households. To mitigate the loss of welfare, measures should be taken to continually promote China as a destination and attract tourists after such events occur. Meanwhile, training and education should be made more accessible to rural households to increase their job opportunities.
Abstract:
During the thirteenth century, a succession of revolts brought about the disappearance of the Almohad Empire and its replacement by regional powers in al-Andalus, the Maghreb, and the Maghreb al-Aqsa. Historiography has presented the emergence of, and struggle between, these powers as a social, political, and even cultural and religious phenomenon, which has served to explain their annihilation or marginalization. This work aims to contextualize these events from an environmental perspective, so that the disintegration of the Almohad caliphate, the rise of those regional powers, and the advance of the Christian kingdoms in the Iberian Peninsula can be understood within a global picture of climate change and a possible agricultural crisis.
Abstract:
This study presents a validation of the observations made by the fisheries observer program Programa Bitácoras de Pesca (PBP) during the period 2005-2011, within the distribution area where the industrial purse-seine fleet targets the north-central stock of Peruvian anchoveta (Engraulis ringens). For the same period and area, the magnitudes of discards due to excess catch, discards of juveniles, and incidental catch of this fishery were also estimated. A total of 3,768 trips were observed out of 302,859, a coverage of 1.2%. The discard and incidental-catch data recorded on the observed trips were characterized by a high proportion of zeros. To validate the observations, a Monte Carlo simulation study was carried out using a negative binomial distribution model, which allows inference about the optimal coverage level and about whether the information obtained by the observer program is reliable. This analysis concludes that current observation levels should be increased to a coverage of at least 10% of all trips made each year by the industrial purse-seine fleet fishing the north-central anchoveta stock. Discards due to excess catch, discards of juveniles, and incidental catch were estimated using three methodologies: bootstrap, generalized linear models (GLM), and the delta model. Each methodology estimated different magnitudes with similar trends. The estimated magnitudes were compared using a Bayesian ANOVA, which showed little evidence that the estimates of discards due to excess catch differed across methodologies; the same held for incidental catch, whereas for discards of juveniles there were substantial differences. The methodology that satisfied the model assumptions and explained the most variability in the modeled variables was the delta model, which appears to be the best alternative for estimation given the high proportion of zeros in the data. The average estimates of discards due to excess catch, discards of juveniles, and incidental catch under the delta model were 252,580, 41,772, and 44,823 tonnes respectively, together representing 5.74% of landings. In addition, using the estimated magnitude of juvenile discards, a biomass projection exercise was carried out under the hypothetical scenario of no fishing mortality, assuming the discarded juveniles had lengths of only 8 and 11 cm; it found that the biomass that will not be available to the fishery is between 52 and 93 thousand tonnes.
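A minimal sketch of the delta-model estimator for such zero-inflated data follows (illustrative only; the variable names, the simulated zero proportion, and the lognormal choice for the positive part are my assumptions, not the study's exact specification):

```python
# Delta-model estimate of mean discards per trip, with a bootstrap CI:
# the estimator multiplies P(positive trip) by the mean of positive trips.
import numpy as np

rng = np.random.default_rng(0)

def delta_mean(x):
    """Delta estimator: P(positive) * mean of the positive observations."""
    x = np.asarray(x, dtype=float)
    p_pos = np.mean(x > 0)                           # occurrence component
    mu_pos = x[x > 0].mean() if p_pos > 0 else 0.0   # positive component
    return p_pos * mu_pos

# Fake observer data: ~80% zeros, lognormal discards otherwise
n = 3768
discards = np.where(rng.random(n) < 0.8, 0.0, rng.lognormal(3.0, 1.0, n))

est = delta_mean(discards)
boot = np.array([delta_mean(rng.choice(discards, n, replace=True))
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"delta-model mean per trip: {est:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Separating the occurrence and positive components is what lets the delta model cope with the high proportion of zeros that defeats a single continuous distribution.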
Abstract:
This thesis examines the role of market power in the banking market. The emphasis is on risk-taking, economies of scale, the economic efficiency of the market, and the transmission of shocks. The first chapter presents a dynamic stochastic general equilibrium model of an open economy that includes a monopolistically competitive banking market. Following Krugman's (1979, 1980) hypothesis on the relationship between economies of scale and exports, banks must pay a transaction cost to operate abroad that decreases as the volume of their domestic activities increases. This gives banks an incentive to reduce their domestic margin in order to profit more from the foreign market. The model is solved and simulated for various degrees of concentration in the banking market. The results indicate that two opposing forces, economies of scale and market power, confront each other as the market becomes more concentrated. Concentration also allows banks to expand their foreign activities, which in turn makes them more vulnerable to external shocks. The second chapter develops a similar framework in which banks additionally face credit risk. This risk is partially insured by collateral provided by entrepreneurs and can be limited through financial effort. The model is again solved and simulated for various degrees of banking-market concentration. The results show that greater market power reduces the size of the financial market and steady-state output, but gives banks an incentive to take less risk. Moreover, economies with a highly concentrated banking market are less sensitive to certain shocks, since higher margins initially give banks room to maneuver in the event of negative shocks. This moderating effect disappears when banks can enter and exit the market freely. Another extension with economies of scale shows that, under certain conditions, a moderately concentrated market is optimal for the economy. The third chapter uses a mean-variance portfolio model to represent a bank with market power. The returns on deposits and assets can vary with the quantity traded, which modifies the bank's portfolio choice. The bank tends to choose a portfolio with lower variance when it can obtain a higher return on an asset. Market power over deposits yields a similar result for moderate levels of market power, but the variance eventually increases once a certain level is reached. The results are robust to different demand functions.
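A minimal sketch of the third chapter's setup, under my own parameterization (a linear price impact on one asset's return, illustrative numbers, and a generic quadratic objective, not the thesis's exact demand functions):

```python
# Mean-variance bank portfolio with market power: the return earned on
# asset 1 falls with the quantity the bank buys (linear price impact).
# All parameter values are placeholders for illustration.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.05, 0.03])          # base expected returns
Sigma = np.array([[0.04, 0.01],
                  [0.01, 0.02]])     # return covariance
gamma = 4.0                          # risk aversion
impact = 0.02                        # price impact of the bank's own demand

def neg_utility(w):
    # Expected return net of price impact on asset 1, minus variance penalty
    exp_ret = w @ mu - impact * w[0] ** 2
    return -(exp_ret - 0.5 * gamma * w @ Sigma @ w)

cons = {"type": "eq", "fun": lambda w: w.sum() - 1.0}  # fully invested
res = minimize(neg_utility, x0=np.array([0.5, 0.5]), constraints=cons)
print("optimal weights:", res.x.round(3))
```

Raising `impact` tilts the optimum away from asset 1, which is the qualitative channel through which market power reshapes the bank's portfolio choice in this kind of model.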
Abstract:
This study investigated the role of fatalism as a cultural value orientation, and of causal attributions for past failure, in the academic performance of high school students in the Araucanía Region of Chile. A total of 3,348 Mapuche and non-Mapuche students participated in the study. Consistent with the Culture and Behavior model that guided the research, tests of causal models based on structural equation analysis show that academic performance is in part a function of variations in the level of fatalism, both directly and indirectly through its influence on attribution processes and failure-related emotions. In general, the model representing the proposed structure of relations among fatalism, attributions, and emotions as determinants of academic performance fit the data for both Mapuche and non-Mapuche students. However, the results show that some of the relations in the model differ between students from these two ethnic groups. Finally, according to the results of the causal-model analysis, family SES appears to be the most important determinant of fatalism.
Abstract:
Since rugby union turned professional in 1995 there have been considerable advances in research on the demands of the game, largely using Global Positioning System (GPS) analysis over the last 10 years. A systematic review was undertaken on the use of GPS, particularly the setting of absolute (ABS) and individual (IND) velocity bands, in field-based, intermittent, high-intensity (HI) team sports. From 3,669 records identified, 38 studies were included for qualitative analysis. Little agreement was found on the definition of movement intensities within team sports; only three papers, all on rugby union, had used IND bands, and only one compared ABS and IND methods. Thus, the aim of this study was to determine whether the demands within positions differ when compared using the ABS versus IND methods of GPS analysis, and whether these differences are significantly different between the forward and back positional groups. A total of 214 data files were recorded from 26 players in 17 matches of the 2015/2016 Scottish BT Premiership. ABS velocity zones 1-7 were set at 1) 0-6, 2) 6.1-11, 3) 11.1-15, 4) 15.1-18, 5) 18.1-21, 6) 21.1-25 and 7) 25.1-40 km.h-1, while IND zones 1-7 were 1) <20, 2) 20-40, 3) 40-50, 4) 50-70, 5) 70-80, 6) 80-95 and 7) 95-100% of each player's individually determined maximum velocity (Vmax). A 40 m sprint test measured Vmax using OptaPro S4 10 Hz (Catapult, Australia) GPS units to derive the IND bands; the same units were worn during matches. The GPS outputs analysed were % distance, % time, high-intensity efforts (HIEs) over 18.1 km.h-1 / 70% Vmax, and repeated high-intensity efforts (RHIEs), defined as three HIEs within 21 s. General linear model (GLM) analysis identified a significant difference in the measurement of % total distance covered between the ABS and IND methods in all zones for forwards (p<0.05) and backs (p<0.05). This difference was also significant between forwards and backs in zones 1 (mean difference ± standard deviation: 3.7±0.7%), 6 (1.2±0.4%) and 7 (1.0±0.0%) (p<0.05). Percentage-time estimates differed significantly between ABS and IND analysis within forwards in zones 1 (1.7±1.7%), 2 (-2.9±1.3%), 3 (1.9±0.8%), 4 (-1.4±0.8%) and 5 (0.2±0.4%), and within backs in zones 1 (-10±1.5%), 2 (-1.2±1.1%), 3 (1.8±0.9%) and 5 (0.6±0.5%) (p<0.05). The difference between groups was significant in zones 1, 2, 4 and 5 (p<0.05). The number of HIEs differed significantly between forwards and backs in zones 6 (6±2) and 7 (3±2). RHIEs differed significantly between the ABS and IND methods for forwards (1±2, p<0.05), although not between groups. Until more research on the differences between the ABS and IND methods is carried out, neither can be deemed a criterion method. In conclusion, there are significant differences between the ABS and IND methods of GPS analysis of the physical demands of rugby union, which must be considered when these methods are used to inform training load and recovery in order to improve performance and reduce injuries.
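The ABS/IND comparison reduces to binning the same velocity trace against two sets of zone edges, as in this sketch (zone edges follow the abstract; the velocity samples and the player's Vmax are made up for illustration):

```python
# Sketch: bin 10 Hz GPS velocity samples into the ABS zones quoted above
# and into IND zones scaled from a player's measured Vmax, then compare
# % time per zone.
import numpy as np

ABS_EDGES = [0, 6, 11, 15, 18, 21, 25, 40]                 # km/h, zones 1-7
IND_FRACS = [0, 0.20, 0.40, 0.50, 0.70, 0.80, 0.95, 1.0]   # fraction of Vmax

def pct_time_per_zone(v, edges):
    counts, _ = np.histogram(v, bins=edges)
    return 100.0 * counts / len(v)

rng = np.random.default_rng(1)
vmax = 32.0                                  # 40 m sprint-test maximum, km/h
v = np.clip(rng.gamma(shape=2.0, scale=3.0, size=6000), 0, vmax)

abs_pct = pct_time_per_zone(v, ABS_EDGES)
ind_pct = pct_time_per_zone(v, [f * vmax for f in IND_FRACS])
for zone, (a, i) in enumerate(zip(abs_pct, ind_pct), start=1):
    print(f"zone {zone}: ABS {a:5.1f}%  IND {i:5.1f}%")
```

For a slow player the IND edges sit below the ABS edges and time shifts into higher IND zones; for a fast player the reverse holds, which is exactly the positional discrepancy the study quantifies.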
Abstract:
During the late Miocene, exchange between the Mediterranean Sea and Atlantic Ocean changed dramatically, culminating in the Messinian Salinity Crisis (MSC). Understanding Mediterranean-Atlantic exchange at that time could answer the enigmatic question of how so much salt built up within the Mediterranean, while furthering the development of a framework for future studies attempting to understand how these changes may have affected global thermohaline circulation. Because of their association with specific water masses at different scales, radiogenic Sr, Pb, and Nd isotope records were generated from various archives contained within marine deposits in order to better understand late Miocene Mediterranean-Atlantic exchange. The archives used include foraminiferal calcite (Sr), fish teeth and bone (Nd), dispersed authigenic ferromanganese oxyhydroxides (Nd, Pb), and a ferromanganese crust (Pb). The primary focus is on sediments preserved at one end of the Betic corridor, a gateway that once connected the Mediterranean to the Atlantic through southern Spain, although other locations are investigated. The Betic gateway terminated within several marginal sub-basins before entering the Western Mediterranean; one of these is the Sorbas Basin, a well-studied location whose sediments have been astronomically tuned at high temporal resolution, providing the necessary age control for sub-precessional resolution records. Since the climatic history of the Mediterranean is strongly controlled by precessional changes in regional climate, the aim was to produce records at high (sub-precessional) temporal resolution, in order to observe clearly any precessional cyclicity driven by regional climate superimposed on longer trends. This goal was achieved for all records except the ferromanganese crust record. The 87Sr/86Sr isotope record (Ch. 3) shows precessional-frequency excursions away from the global seawater curve. As precessional-frequency oscillations are unexpected in this setting, a numerical box model was used to determine the mechanisms causing the excursions. To parameterise the model variables, regional Sr characteristics, data from the general circulation model HadCM3L, and new benthic foraminiferal assemblage data are employed. The model results imply that the Sorbas Basin likely had a positive hydrologic budget in the late Miocene, very different from that of today. Moreover, the model indicates that the mechanism controlling the Sr isotope ratio of Sorbas Basin seawater was not restriction, but a lack of density-driven exchange with the Mediterranean. Beyond improving our understanding of how marginal Mediterranean sub-basins may evolve different isotope signatures, these results have implications for astronomical tuning and stratigraphy in the region; such findings are crucial considering that the geological and climatic history of the late Miocene Mediterranean is based entirely on marginal deposits. An improved estimate for the Nd isotope signature of late Miocene Mediterranean Outflow (MO) was determined by comparing Nd isotope signatures preserved in the deeper Alborán Sea at ODP Site 978 with literature data as well as the signature preserved in the Sorbas Basin (Ch. 4; -9.34 to -9.92 ± 0.37 εNd(t)). It was also inferred that Nd isotopes are unlikely to track reliably changes in circulation within the shallow settings characteristic of the Mediterranean-Atlantic connections; this is significant in light of a recent publication documenting corridor closure using Nd isotopes. Both conclusions will prove useful for future studies attempting to understand changes in Mediterranean-Atlantic exchange. Excursions to high values, with precessional frequency, are also observed in the radiogenic Pb isotope record for the Sorbas Basin (Ch. 5). Widening the scope to include locations further from the gateways, records were produced for late Miocene sections in Sicily and Northern Italy, and similar precessional-frequency cyclicity was observed in the Pb isotope records for these sites as well. Comparing these records with proxies for Saharan dust and available whole-rock data indicates that, while further analysis is necessary to draw strong conclusions, enhanced dust production during insolation minima may be driving the observed signal. These records also have implications for astronomical tuning: peaks in Pb isotope records driven by Saharan dust may be easier to connect directly to the insolation cycle, providing improved astronomical tuning points. Finally, a Pb isotope record derived by in-situ laser ablation of ferromanganese crust 3514-6 from the Lion Seamount, located west of Gibraltar within the MO plume, has provided evidence that the plume depth shifted during the Pliocene. The record also suggests that Pb isotopes may not be a suitable proxy for changes in late Miocene Mediterranean-Atlantic exchange, since the Pb isotope signatures of regional water masses are too similar. To develop this record, the first published instance of laser-ablation-derived 230Th-excess measurements is combined with 10Be dating.
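The box-model logic behind the Sorbas 87Sr/86Sr excursions can be sketched as a single-box mixing balance (a toy parameterization of mine, not the chapter's calibrated model):

```python
# Toy one-box Sr isotope model for a marginal basin: the basin ratio
# follows a flux- and concentration-weighted mix of Mediterranean
# seawater and river input. All values are illustrative placeholders.
import numpy as np

r_med, r_riv = 0.70890, 0.70800   # endmember 87Sr/86Sr (illustrative)
c_med, c_riv = 90.0, 1.0          # Sr concentrations (illustrative)

def steady_ratio(f_med, f_riv):
    """Steady-state basin ratio for given water fluxes."""
    w_med, w_riv = f_med * c_med, f_riv * c_riv
    return (w_med * r_med + w_riv * r_riv) / (w_med + w_riv)

# Precessional cycle: density-driven exchange with the Mediterranean
# waxes and wanes while the river flux stays fixed.
t = np.linspace(0, 2, 200)                         # two precession cycles
f_med = 0.5 * (1 + np.cos(2 * np.pi * t)) + 0.05
ratio = [steady_ratio(f, 1.0) for f in f_med]
print(f"basin ratio range: {min(ratio):.5f} to {max(ratio):.5f}")
```

Because seawater carries far more Sr than river water, even a modest collapse of exchange lets the river endmember pull the basin ratio off the global seawater curve, which is the mechanism the chapter identifies.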
Abstract:
Understanding how imperfect information affects firms' investment decisions helps answer important questions in economics, such as how we may better measure economic uncertainty; how firms' forecasts affect their decision-making when their beliefs are not backed by economic fundamentals; and how important the business-cycle impacts of changes in firms' productivity uncertainty are in an environment of incomplete information. This dissertation provides a synthetic answer to all these questions, both empirically and theoretically. The first chapter provides empirical evidence demonstrating that survey-based forecast dispersion identifies a distinctive type of second-moment shock, different from the canonical volatility shocks to productivity, i.e. uncertainty shocks. Such forecast-disagreement disturbances can affect the distribution of firm-level beliefs regardless of whether belief changes are backed by changes in economic fundamentals. At the aggregate level, innovations that increase the dispersion of firms' forecasts lead to persistent declines in aggregate investment and output, followed by a slow recovery. By contrast, a larger dispersion of future firm-specific productivity innovations, the standard way to measure economic uncertainty, delivers the "wait and see" effect: aggregate investment experiences a sharp decline, followed by a quick rebound, and then overshoots. At the firm level, the data show that more productive firms increase investment when the dispersion of future productivity rises, whereas investment drops when firms disagree more about the well-being of their future business conditions. These findings challenge the view that the dispersion of firms' heterogeneous beliefs captures the concept of economic uncertainty as defined by a model of uncertainty shocks. The second chapter presents a general equilibrium model of heterogeneous firms subject to real productivity uncertainty shocks and informational disagreement shocks. As firms cannot perfectly disentangle aggregate from idiosyncratic productivity because of imperfect information, information quality drives a wedge between the unobserved productivity fundamentals and firms' beliefs about how productive they are. The distribution of firms' beliefs is no longer perfectly aligned with the distribution of firm-level productivity across firms. This model not only explains why, at the macro and micro level, disagreement shocks differ from uncertainty shocks, as documented in Chapter 1, but also helps reconcile a key challenge faced by the standard framework for studying economic uncertainty: a trade-off between sizable business-cycle effects of changes in uncertainty and the right amount of pro-cyclicality of firm-level investment-rate dispersion, as measured by its correlation with the output cycle.
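The two second-moment objects contrasted here can be written down concretely (a schematic with simulated data; the chapter's survey and productivity panels are not reproduced):

```python
# Schematic contrast of the two dispersion measures discussed above:
# (i) disagreement = cross-firm dispersion of survey forecasts;
# (ii) uncertainty = dispersion of realized firm-level productivity
#      innovations. Data are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(2)
n_firms = 500
true_shock = rng.normal(0.0, 0.02, n_firms)   # productivity innovations
belief_noise = rng.normal(0.0, 0.05, n_firms) # non-fundamental belief noise
forecasts = true_shock + belief_noise         # firms' reported forecasts

disagreement = forecasts.std()   # moves even when fundamentals are fixed
uncertainty = true_shock.std()   # moves only with fundamentals
print(f"disagreement: {disagreement:.4f}  uncertainty: {uncertainty:.4f}")
```

The point of the schematic is that `belief_noise` can widen forecast dispersion with no change in `true_shock`, which is why the two shocks have different aggregate signatures.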
Abstract:
The past several years have seen the surprising and rapid rise of Bitcoin and other "cryptocurrencies." These are decentralized peer-to-peer networks that allow users to transmit money, to compose financial instruments, and to enforce contracts between mutually distrusting peers, and that show great promise as a foundation for financial infrastructure that is more robust, efficient, and equitable than ours today. However, it is difficult to reason about the security of cryptocurrencies. Bitcoin is a complex system, comprising many intricate and subtly interacting protocol layers. At each layer it features design innovations that (prior to our work) had not undergone any rigorous analysis. Compounding the challenge, Bitcoin is but one of hundreds of competing cryptocurrencies in an ecosystem that is constantly evolving. The goal of this thesis is to reason formally about the security of cryptocurrencies, reining in their complexity and providing well-defined and justified statements of their guarantees. We provide a formal specification and construction for each layer of an abstract cryptocurrency protocol, and prove that our constructions satisfy their specifications. The contributions of this thesis are centered around two new abstractions: "scratch-off puzzles" and the "blockchain functionality" model. Scratch-off puzzles are a generalization of the Bitcoin "mining" algorithm, its most iconic and novel design feature. We show how to provide secure upgrades to a cryptocurrency by instantiating the protocol with alternative puzzle schemes. We construct secure puzzles that address important and well-known challenges facing Bitcoin today, including wasted energy and dangerous coalitions. The blockchain functionality is a general-purpose model of a cryptocurrency rooted in the "Universal Composability" cryptography theory. We use this model to express a wide range of applications, including transparent "smart contracts" (like those featured in Bitcoin and Ethereum), as well as privacy-preserving applications like sealed-bid auctions. We also construct a new protocol compiler, called Hawk, which translates user-provided specifications into privacy-preserving protocols based on zero-knowledge proofs.
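As a toy illustration of the mining template that scratch-off puzzles generalize (simplified: real Bitcoin hashes an 80-byte block header with double SHA-256 against a 256-bit difficulty target):

```python
# Toy Bitcoin-style proof-of-work: find a nonce whose double-SHA-256
# digest of (header || nonce) falls below a target, i.e. begins with a
# required number of zero bits. A scratch-off puzzle generalizes this
# search-for-a-winning-ticket step.
import hashlib

def mine(header: bytes, zero_bits: int) -> int:
    target = 1 << (256 - zero_bits)          # digests below this win
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

print("winning nonce:", mine(b"example-block-header", zero_bits=16))
```

Each nonce attempt is an independent Bernoulli trial with success probability 2^-zero_bits, which is the memoryless "ticket scratching" structure the abstraction captures.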
Abstract:
In the past few years, there has been a concern among economists and policy makers that increased openness to international trade affects some regions in a country more than others. Recent research has found that local labor markets more exposed to import competition through their initial employment composition experience worse outcomes in several dimensions, such as employment, wages, and poverty. Although there is evidence that regions within a country exhibit variation in the intensity with which they trade with each other and with other countries, trade linkages have been ignored in empirical analyses of the regional effects of trade, which focus on differences in employment composition. In this dissertation, I investigate how local labor markets' trade linkages shape the response of wages to international trade shocks. In the second chapter, I lay out a standard multi-sector general equilibrium model of trade in which domestic regions trade with each other and with the rest of the world. Using this benchmark, I decompose a region's wage change resulting from a national import-cost shock into a direct effect on prices, holding other endogenous variables constant, and a series of general equilibrium effects. I argue that the direct effect provides a natural measure of exposure to import competition within the model, since it summarizes the effect of the shock on a region's wage as a function of initial conditions given by its trade linkages. I call my proposed measure linkage exposure, while I refer to the measures used in previous studies as employment exposure. My theoretical analysis also shows that the assumptions previous studies make about trade linkages are not consistent with the standard trade model. In the third chapter, I calibrate the model to the Brazilian economy in 1991, at the beginning of a period of trade liberalization, to perform a series of experiments. In each of them, I reduce the Brazilian import cost by 1 percent in a single sector and calculate how much of the cross-regional variation in counterfactual wage changes is explained by each exposure measure. Over this set of experiments, employment exposure explains, for the median sector, 2 percent of the variation in counterfactual wage changes, while linkage exposure explains 44 percent. In addition, I propose an estimation strategy that incorporates trade linkages into the analysis of the effects of trade on observed wages. In the model, changes in wages are completely determined by changes in market access, an endogenous variable that summarizes the real demand faced by a region. I show that a linkage measure of exposure is a valid instrument for changes in market access within Brazil. Using observed wage changes in Brazil between 1991 and 2000, my estimates imply that a region at the 25th percentile of the change in domestic market access induced by trade liberalization experiences a 0.6 log-point larger wage decline (or smaller wage increase) than a region at the 75th percentile. Estimates from a regression of wage changes on employment exposure imply that a region at the 25th percentile of exposure experiences a 3 log-point larger wage decline (or smaller wage increase) than a region at the 75th percentile. I conclude that estimates based on employment exposure overstate the negative impact of trade liberalization on wages in Brazil. In the fourth chapter, I extend the standard model to allow for two types of workers according to their education levels: skilled and unskilled. I show that there is substantial variation in the skill premium across Brazilian regions. I use the exogenous variation provided by tariff changes to estimate the impact of market access on the skill premium. I find that decreased domestic market access resulting from trade liberalization led to a higher skill premium. I propose a mechanism to explain this result, namely that the manufacturing sector is relatively more intensive in unskilled labor, and I present empirical evidence that supports this hypothesis.
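Schematically, the decomposition described in the second chapter takes the form below (my notation; the model's exact expressions are not reproduced here):

```latex
d\ln w_r \;=\;
\underbrace{\left.\frac{\partial \ln w_r}{\partial \ln \tau}\right|_{\text{other endog. fixed}} d\ln\tau}_{\text{direct effect: linkage exposure}}
\;+\;
\underbrace{\text{general-equilibrium terms}}_{\text{price and market-access adjustments}},
```

where the direct effect depends only on the region's initial trade linkages, which is what makes it usable as an exposure measure.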
Abstract:
Predicting functional recovery after stroke plays a crucial role in planning rehabilitation programs. Objectives: To assess differences over time in functional recovery, assessed through the Barthel Index (BI) rate of change (BIRC) between admission and discharge, in stroke patients. Methods: This is a retrospective hospital-based study of consecutive patients with acute stroke admitted to a hospital in northeast Portugal between 2010 and 2014. BIRC was computed as the difference between the admission and discharge BI scores divided by the time in days between these assessments. General linear model analysis stratified by gender was used to assess whether BIRC increased over the period under study. Adjusted regression coefficients and respective 95% confidence intervals (95% CI) were obtained. Results: Of the 483 patients included in this analysis, 59% (n = 285) were male. Among women, mean BIRC was 1.8 (± 1.88) units/day in 2010 and reached 3.7 (± 2.80) units/day in 2014. Among men, mean BIRC in 2010 and 2014 was similar: 3.2 (± 3.19) and 3.1 (± 3.31) units/day, respectively. After adjustment for age, BI at admission, and type and laterality of stroke, we observed an increase in BIRC over time among women, such that mean BIRC in 2014 was 0.82 (95% CI: 0.48; 3.69) units higher than that observed in 2010. No such increase in BIRC over time was observed among men. Conclusions: We observed an improvement in functional recovery after stroke, but only among women. Our results suggest differences over time in clinical practice toward the rehabilitation of women after stroke.
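In symbols (assuming, as positive recovery rates imply, discharge minus admission):

```latex
\mathrm{BIRC} \;=\; \frac{BI_{\text{discharge}} - BI_{\text{admission}}}{\text{days between assessments}} \quad \text{(units/day)}
```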