6 results for Multi-sector models

in DRUM (Digital Repository at the University of Maryland)


Relevance: 80.00%

Abstract:

In the past few years, there has been a concern among economists and policy makers that increased openness to international trade affects some regions in a country more than others. Recent research has found that local labor markets more exposed to import competition through their initial employment composition experience worse outcomes in several dimensions, such as employment, wages, and poverty. Although there is evidence that regions within a country exhibit variation in the intensity with which they trade with each other and with other countries, trade linkages have been ignored in empirical analyses of the regional effects of trade, which focus on differences in employment composition. In this dissertation, I investigate how local labor markets' trade linkages shape the response of wages to international trade shocks.

In the second chapter, I lay out a standard multi-sector general equilibrium model of trade, where domestic regions trade with each other and with the rest of the world. Using this benchmark, I decompose a region's wage change resulting from a national import cost shock into a direct effect on prices, holding other endogenous variables constant, and a series of general equilibrium effects. I argue the direct effect provides a natural measure of exposure to import competition within the model, since it summarizes the effect of the shock on a region's wage as a function of initial conditions given by its trade linkages. I call my proposed measure linkage exposure, while I refer to the measures used in previous studies as employment exposure. My theoretical analysis also shows that the assumptions previous studies make on trade linkages are not consistent with the standard trade model.

In the third chapter, I calibrate the model to the Brazilian economy in 1991--at the beginning of a period of trade liberalization--to perform a series of experiments. In each of them, I reduce the Brazilian import cost by 1 percent in a single sector and calculate how much of the cross-regional variation in counterfactual wage changes is explained by exposure measures. Over this set of experiments, employment exposure explains, for the median sector, 2 percent of the variation in counterfactual wage changes, while linkage exposure explains 44 percent. In addition, I propose an estimation strategy that incorporates trade linkages in the analysis of the effects of trade on observed wages. In the model, changes in wages are completely determined by changes in market access, an endogenous variable that summarizes the real demand faced by a region. I show that a linkage measure of exposure is a valid instrument for changes in market access within Brazil. Using observed wage changes in Brazil between 1991 and 2000, my estimates imply that a region at the 25th percentile of the change in domestic market access induced by trade liberalization experiences a 0.6 log point larger wage decline (or smaller wage increase) than a region at the 75th percentile. The estimates from a regression of wage changes on exposure imply that a region at the 25th percentile of exposure experiences a 3 log point larger wage decline (or smaller wage increase) than a region at the 75th percentile. I conclude that estimates based on exposure overstate the negative impact of trade liberalization on wages in Brazil.

In the fourth chapter, I extend the standard model to allow for two types of workers according to their education levels: skilled and unskilled. I show that there is substantial variation across Brazilian regions in the skill premium. I use the exogenous variation provided by tariff changes to estimate the impact of market access on the skill premium. I find that decreased domestic market access due to trade liberalization resulted in a higher skill premium. I propose a mechanism to explain this result: the manufacturing sector is relatively more intensive in unskilled labor, and I show empirical evidence that supports this hypothesis.
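
As a schematic illustration of the decomposition described above (the notation here is illustrative, not the dissertation's own), the log change in region r's wage in response to national import cost shocks can be written as a direct term built from initial trade linkages plus general equilibrium adjustments:

\[
\hat{w}_r \;=\; \underbrace{\sum_{s} \lambda_{rs}\, \hat{\tau}_s}_{\text{direct effect (linkage exposure)}} \;+\; \underbrace{\mathrm{GE}_r\!\left(\hat{P},\, \hat{w}_{-r},\, \hat{X}\right)}_{\text{general equilibrium effects}},
\]

where hats denote log changes, \(\hat{\tau}_s\) is the import cost shock in sector \(s\), \(\lambda_{rs}\) are weights constructed from region r's initial trade linkages, and the second term collects the induced adjustments in prices, other regions' wages, and expenditures. Employment exposure instead replaces \(\lambda_{rs}\) with weights based only on initial sectoral employment shares.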

Relevance: 40.00%

Abstract:

Experiments with ultracold atoms in optical lattices have become a versatile testing ground to study diverse quantum many-body Hamiltonians. A single-band Bose-Hubbard (BH) Hamiltonian was first proposed to describe these systems in 1998 and its associated quantum phase transition was subsequently observed in 2002. Over the years, there has been rapid progress in experimental realizations of more complex lattice geometries, leading to more exotic BH Hamiltonians with contributions from excited bands and modified tunneling and interaction energies. There have also been interesting theoretical insights and experimental studies on “unconventional” Bose-Einstein condensates in optical lattices and predictions of rich orbital physics in higher bands. In this thesis, I present our results on several multi-band BH models and emergent quantum phenomena. In particular, I study optical lattices with two local minima per unit cell and show that the low-energy physics of a multi-band BH Hamiltonian with only pairwise interactions is equivalent to that of an effective single-band Hamiltonian with strong three-body interactions. I also propose a second method to create three-body interactions in ultracold gases of bosonic atoms in an optical lattice. In this case, this is achieved by a careful cancellation of two contributions in the pairwise interaction between the atoms, one proportional to the zero-energy scattering length and a second proportional to the effective range. I subsequently study the physics of Bose-Einstein condensation in the second band of a double-well 2D lattice and show that the collision-aided decay rate of the condensate to the ground band is smaller than the tunneling rate between neighboring unit cells. Finally, I propose a numerical method using the discrete variable representation for constructing real-valued Wannier functions localized in a unit cell for optical lattices. The developed numerical method is general and can be applied to a wide array of optical lattice geometries in one, two, or three dimensions.
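
For reference, the standard single-band BH Hamiltonian referred to above, and the generic form of an added three-body interaction term, are (the specific effective couplings derived in the thesis are not reproduced here):

\[
H_{\mathrm{BH}} = -t \sum_{\langle i,j \rangle} \left( b_i^{\dagger} b_j + \mathrm{h.c.} \right) + \frac{U}{2} \sum_i n_i (n_i - 1) - \mu \sum_i n_i,
\qquad
H_3 = \frac{W}{6} \sum_i n_i (n_i - 1)(n_i - 2),
\]

where \(b_i^{\dagger}\) creates a boson on site \(i\), \(n_i = b_i^{\dagger} b_i\), \(t\) is the tunneling energy, \(U\) the pairwise on-site interaction, \(\mu\) the chemical potential, and \(W\) an effective three-body coupling.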

Relevance: 40.00%

Abstract:

Energy Conservation Measure (ECM) project selection is made difficult given real-world constraints, limited resources to implement savings retrofits, various suppliers in the market, and project financing alternatives. Many of these energy-efficient retrofit projects should be viewed as a series of investments with annual returns for these traditionally risk-averse agencies. Given a list of available ECMs, federal, state, and local agencies must determine how to implement projects at lowest cost. The most common methods of implementation planning are suboptimal relative to cost. Federal, state, and local agencies can obtain greater returns on their energy conservation investment than with traditional methods, regardless of the implementing organization. This dissertation outlines several approaches to improve the traditional energy conservation models. Public buildings in regions with similar energy conservation goals in the United States or internationally can also benefit greatly from this research. Additionally, many private owners of buildings are under mandates to conserve energy; e.g., Local Law 85 of the New York City Energy Conservation Code requires any building, public or private, to meet the most current energy code for any alteration or renovation. Thus, both public and private stakeholders can benefit from this research. The research in this dissertation advances and presents models that decision-makers can use to optimize the selection of ECM projects with respect to the total cost of implementation. A practical application of a two-level mathematical program with equilibrium constraints (MPEC) improves the current best practice for agencies concerned with making the most cost-effective selection by leveraging energy services companies or utilities. The two-level model maximizes savings to the agency and profit to the energy services companies (Chapter 2). An additional model leverages a single congressional appropriation to implement ECM projects (Chapter 3). Returns from implemented ECM projects are used to fund additional ECM projects. In these cases, fluctuations in energy costs and uncertainty in the estimated savings severely influence ECM project selection and the amount of the appropriation requested. A proposed risk-aversion method imposes a minimum on the number of projects completed in each stage. A comparative method using Conditional Value at Risk is also analyzed, and time consistency is addressed. This work demonstrates how a risk-based, stochastic, multi-stage model with binary decision variables at each stage provides a much more accurate estimate for planning than the agency's traditional approach and deterministic models. Finally, in Chapter 4, a rolling-horizon model allows for subadditivity and superadditivity of the energy savings to simulate interactive effects between ECM projects. The approach makes use of inequalities (McCormick, 1976) to re-express constraints that involve the product of binary variables with an exact linearization (related to the convex hull of those constraints). This model additionally shows the benefits of learning between stages while remaining consistent with the single congressional appropriation framework.
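
As a sketch of the exact linearization mentioned above (the basic McCormick building block, not the dissertation's full constraint set): a product of two binary variables, \(z = x y\), can be replaced by the linear constraints

\[
z \le x, \qquad z \le y, \qquad z \ge x + y - 1, \qquad z \ge 0,
\]

which describe the convex hull of the feasible points when \(x, y \in \{0, 1\}\), so the reformulation is exact rather than a relaxation. Products of more than two binary variables are handled by chaining the same construction.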

Relevance: 40.00%

Abstract:

The U.S. Nuclear Regulatory Commission implemented a safety goal policy in response to the 1979 Three Mile Island accident. This policy addresses the question “How safe is safe enough?” by specifying quantitative health objectives (QHOs) for comparison with results from nuclear power plant (NPP) probabilistic risk analyses (PRAs) to determine whether proposed regulatory actions are justified based on potential safety benefit. Lessons learned from recent operating experience—including the 2011 Fukushima accident—indicate that accidents involving multiple units at a shared site can occur with non-negligible frequency. Yet risk contributions from such scenarios are excluded by policy from safety goal evaluations—even for the nearly 60% of U.S. NPP sites that include multiple units. This research develops and applies methods for estimating risk metrics for comparison with safety goal QHOs using models from state-of-the-art consequence analyses to evaluate the effect of including multi-unit accident risk contributions in safety goal evaluations.
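
A minimal sketch of the kind of aggregation involved (illustrative notation, not the dissertation's): a site-level risk metric for comparison with a QHO can be formed by summing frequency-weighted consequences over accident sequences and over the unit groups at the site, including multi-unit combinations:

\[
R_{\mathrm{site}} \;=\; \sum_{u \in \text{unit groups}} \; \sum_{k \in \text{sequences}} f_{u,k}\, c_{u,k},
\]

where \(f_{u,k}\) is the frequency of sequence \(k\) involving unit group \(u\) and \(c_{u,k}\) is the corresponding conditional consequence measure (e.g., individual early fatality or latent cancer fatality risk). Conventional single-unit safety goal evaluations retain only the single-unit terms of this sum.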

Relevance: 30.00%

Abstract:

This dissertation focuses on design challenges caused by secondary impacts to printed wiring assemblies (PWAs) within hand-held electronics due to accidental drop or impact loading. The continuing increase in functionality, miniaturization, and affordability has resulted in a decrease in the size and weight of handheld electronic products. As a result, PWAs have become thinner and the clearances between surrounding structures have decreased. The resulting increase in flexibility of the PWAs, in combination with the reduced clearances, requires new design rules to minimize and survive possible internal collisions between PWAs and surrounding structures. Such collisions are termed ‘secondary impact’ in this study. The effect of secondary impact on the board-level drop reliability of printed wiring boards (PWBs) assembled with MEMS microphone components is investigated using a combination of testing, response and stress analysis, and damage modeling. The response analysis is conducted using a combination of numerical finite element modeling and simplified analytic models for additional parametric sensitivity studies.
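
As one example of the kind of simplified analytic estimate used in parametric sensitivity studies of board flexibility (illustrative only; the dissertation's specific formulation may differ), the fundamental bending frequency of a simply supported rectangular PWB of side lengths \(a\) and \(b\), thickness \(h\), elastic modulus \(E\), Poisson's ratio \(\nu\), and density \(\rho\) is

\[
f_1 = \frac{\pi}{2}\left(\frac{1}{a^{2}} + \frac{1}{b^{2}}\right)\sqrt{\frac{D}{\rho h}},
\qquad
D = \frac{E h^{3}}{12\,(1-\nu^{2})},
\]

so thinner boards have lower \(D\) and larger dynamic deflections under a given drop shock; comparing the estimated peak center deflection with the available clearance indicates whether secondary impact with a neighboring structure can occur.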

Relevance: 30.00%

Abstract:

The predictive capabilities of computational fire models have improved in recent years such that models have become an integral part of many research efforts. Models improve the understanding of the fire risk of materials and may decrease the number of expensive experiments required to assess the fire hazard of a specific material or designed space. A critical component of a predictive fire model is the pyrolysis sub-model that provides a mathematical representation of the rate of gaseous fuel production from condensed phase fuels given a heat flux incident to the material surface. The modern, comprehensive pyrolysis sub-models that are common today require the definition of many model parameters to accurately represent the physical description of materials that are ubiquitous in the built environment. Coupled with the increase in the number of parameters required to accurately represent the pyrolysis of materials is the increasing prevalence in the built environment of engineered composite materials that have never been measured or modeled. The motivation behind this project is to develop a systematic, generalized methodology to determine the requisite parameters to generate pyrolysis models with predictive capabilities for layered composite materials that are common in industrial and commercial applications. This methodology has been applied to four common composites in this work that exhibit a range of material structures and component materials. The methodology utilizes a multi-scale experimental approach in which each test is designed to isolate and determine a specific subset of the parameters required to define a material in the model. Data collected in simultaneous thermogravimetry and differential scanning calorimetry experiments were analyzed to determine the reaction kinetics, thermodynamic properties, and energetics of decomposition for each component of the composite. Data collected in microscale combustion calorimetry experiments were analyzed to determine the heats of complete combustion of the volatiles produced in each reaction. Inverse analyses were conducted on sample temperature data collected in bench-scale tests to determine the thermal transport parameters of each component through degradation. Simulations of quasi-one-dimensional bench-scale gasification tests generated from the resultant models using the ThermaKin modeling environment were compared to experimental data to independently validate the models.
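
As an illustration of the reaction-kinetics parameters extracted from the thermogravimetric data (the reaction schemes fitted in this work may be more complex than a single step), a condensed-phase decomposition reaction is commonly represented by an Arrhenius rate law:

\[
\frac{d\alpha}{dt} = A \exp\!\left(-\frac{E_a}{R T}\right)(1-\alpha)^{n},
\]

where \(\alpha\) is the converted mass fraction, \(A\) the pre-exponential factor, \(E_a\) the activation energy, \(R\) the gas constant, \(T\) the temperature, and \(n\) the reaction order. The heat absorbed or released by each reaction and the heat of complete combustion of its volatiles enter the model as the energetics terms noted above.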