978 results for "Macro-econometric model"
Abstract:
Changes in marine net primary productivity (PP) and export of particulate organic carbon (EP) are projected over the 21st century with four global coupled carbon cycle-climate models. These include representations of marine ecosystems and the carbon cycle of different structure and complexity. All four models show a decrease in global mean PP and EP between 2 and 20% by 2100 relative to preindustrial conditions, for the SRES A2 emission scenario. Two different regimes for productivity changes are consistently identified in all models. The first chain of mechanisms is dominant in the low- and mid-latitude ocean and in the North Atlantic: reduced input of macro-nutrients into the euphotic zone related to enhanced stratification, reduced mixed layer depth, and slowed circulation causes a decrease in macro-nutrient concentrations and in PP and EP. The second regime is projected for parts of the Southern Ocean: an alleviation of light and/or temperature limitation leads to an increase in PP and EP as productivity is fueled by a sustained nutrient input. A region of disagreement among the models is the Arctic, where three models project an increase in PP while one model projects a decrease. Projected changes in seasonal and interannual variability are modest in most regions. Regional model skill metrics are proposed to generate multi-model mean fields that show an improved skill in representing observation-based estimates compared to a simple multi-model average. Model results are compared to recent productivity projections with three different algorithms, usually applied to infer net primary production from satellite observations.
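The "regional model skill metrics" used to build an improved multi-model mean can be illustrated with a small sketch. The exact metric of the paper is not specified here, so this toy version simply weights each model field by its inverse RMSE against an observation-based field; the function name and weighting choice are assumptions for illustration only.

```python
import numpy as np

def skill_weighted_mean(model_fields, observed):
    """Combine per-model fields into a multi-model mean weighted by skill,
    here taken as inverse RMSE against the observation-based field
    (a common but assumed choice; the paper's regional metric may differ)."""
    fields = np.asarray(model_fields, dtype=float)   # shape (n_models, ...)
    obs = np.asarray(observed, dtype=float)
    err = (fields - obs).reshape(len(fields), -1)
    rmse = np.sqrt((err ** 2).mean(axis=1))          # skill metric per model
    weights = 1.0 / rmse                             # better skill -> larger weight
    weights /= weights.sum()
    return np.tensordot(weights, fields, axes=1)
```

With two toy "model fields" and an observed field, the skill-weighted mean sits closer to the observations than a simple multi-model average, which is the effect reported in the abstract.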
Abstract:
This paper presents a kernel density correlation-based nonrigid point set matching method and shows its application in statistical-model-based 2D/3D reconstruction of a scaled, patient-specific model from an uncalibrated x-ray radiograph. In this method, both the reference point set and the floating point set are first represented using kernel density estimates. A correlation measure between these two kernel density estimates is then optimized to find a displacement field such that the floating point set is moved to the reference point set. Regularizations based on the overall deformation energy and the motion smoothness energy are used to constrain the displacement field for robust point set matching. Incorporating this nonrigid point set matching method into a statistical-model-based 2D/3D reconstruction framework, we can reconstruct a scaled, patient-specific model from noisy edge points that are extracted directly from the x-ray radiograph by an edge detector. Our experiment conducted on datasets of two patients and six cadavers demonstrates a mean reconstruction error of 1.9 mm.
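The core idea — maximize the correlation of two Gaussian kernel density estimates over a displacement — can be sketched in a few lines. The paper optimizes a full regularized displacement field; the toy below, under stated assumptions, fits only a rigid translation by gradient ascent, using the fact that for Gaussian kernels the KDE correlation reduces (up to a constant) to a sum of Gaussians over all point pairs.

```python
import numpy as np

def kernel_correlation(ref, flo, sigma=1.0):
    """Correlation of the Gaussian KDEs of two point sets: for Gaussian
    kernels it reduces (up to a constant) to a sum over all point pairs."""
    d2 = ((ref[:, None, :] - flo[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (4.0 * sigma ** 2)).sum()

def match_translation(ref, flo, sigma=1.0, lr=0.5, iters=200):
    """Recover a pure translation of the floating set by gradient ascent on
    the kernel correlation. (Translation-only toy: the paper optimizes a
    full displacement field with deformation/smoothness regularization.)"""
    t = np.zeros(ref.shape[1])
    for _ in range(iters):
        d = ref[:, None, :] - (flo[None, :, :] + t)          # pairwise offsets
        w = np.exp(-(d ** 2).sum(-1) / (4.0 * sigma ** 2))   # pair weights
        grad = (w[..., None] * d).sum((0, 1)) / (2.0 * sigma ** 2)
        t += lr * grad / len(ref)                            # ascent step
    return t
```

On a point set that has been shifted rigidly, the recovered translation converges to the true shift, and the correlation at the solution exceeds the unmatched value.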
Abstract:
The study performs a panel estimation of the relationship between per capita income, trade, and airborne pollution in the five Central Asian nations, Russia and China between 1992 and 2008. First, this study uses the environmental Kuznets curve (EKC) hypothesis - an inverted-U relationship between the increase in income and the level of environmental degradation - to examine how income and pollution are related. Second, the study uses a gravity model to estimate the effect of a regional trade agreement (Shanghai Cooperation Organization: SCO) on incomes and carbon dioxide emissions in the region. Empirical analysis confirms the existence of the rising portion of the EKC curve in the region - a positive correlation between per capita income growth and carbon dioxide emissions - and that the volume of bilateral trade, and not the existence of a regional trade agreement, contributes to the increasing level of environmental pollution.
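The inverted-U shape behind the EKC hypothesis is typically captured by regressing (log) emissions on (log) income and its square; the "rising portion" then corresponds to observations left of the parabola's vertex. A minimal pooled-OLS sketch (the study itself uses panel techniques with country effects, which this toy omits; the function name is assumed):

```python
import numpy as np

def ekc_turning_point(log_income, log_emissions):
    """Fit the inverted-U EKC shape e = b0 + b1*y + b2*y^2 by OLS and
    return the coefficients plus the income level at which emissions peak.
    (Pooled-OLS toy; the study uses panel methods with country effects.)"""
    y = np.asarray(log_income, float)
    e = np.asarray(log_emissions, float)
    X = np.column_stack([np.ones_like(y), y, y ** 2])
    b, *_ = np.linalg.lstsq(X, e, rcond=None)
    return b, -b[1] / (2.0 * b[2])    # vertex of the fitted parabola
```

A sample still on the rising portion is one whose observed incomes lie below the estimated turning point, which is the situation the abstract reports for the region.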
Abstract:
PURPOSE Modulated electron radiotherapy (MERT) promises sparing of organs at risk for certain tumor sites. Any implementation of MERT treatment planning requires an accurate beam model. The aim of this work is the development of a beam model which reconstructs electron fields shaped using the Millennium photon multileaf collimator (MLC) (Varian Medical Systems, Inc., Palo Alto, CA) for a Varian linear accelerator (linac). METHODS This beam model is divided into an analytical part (two photon and two electron sources) and a Monte Carlo (MC) transport through the MLC. For dose calculation purposes the beam model has been coupled with a macro MC dose calculation algorithm. The commissioning process requires a set of measurements and precalculated MC input. The beam model has been commissioned at a source-to-surface distance of 70 cm for a Clinac 23EX (Varian Medical Systems, Inc., Palo Alto, CA) and a TrueBeam linac (Varian Medical Systems, Inc., Palo Alto, CA). For validation purposes, measured and calculated depth dose curves and dose profiles are compared for four different MLC-shaped electron fields and all available energies. Furthermore, a measured two-dimensional dose distribution for patched segments consisting of three 18 MeV segments, three 12 MeV segments, and a 9 MeV segment is compared with corresponding dose calculations. Finally, measured and calculated two-dimensional dose distributions are compared for a circular segment encompassed by a C-shaped segment. RESULTS For 15 × 34, 5 × 5, and 2 × 2 cm² fields, differences between water phantom measurements and calculations using the beam model coupled with the macro MC dose calculation algorithm are generally within 2% of the maximal dose value or 2 mm distance to agreement (DTA) for all electron beam energies. For a more complex MLC pattern, differences between measurements and calculations are generally within 3% of the maximal dose value or 3 mm DTA for all electron beam energies.
For the two-dimensional dose comparisons, the differences between calculations and measurements are generally within 2% of the maximal dose value or 2 mm DTA. CONCLUSIONS The results of the dose comparisons suggest that the developed beam model is suitable for accurately reconstructing photon-MLC-shaped electron beams for a Clinac 23EX and a TrueBeam linac. Hence, in future work the beam model will be utilized to investigate the possibilities of MERT using the photon MLC to shape electron beams.
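The "2% of maximal dose or 2 mm DTA" acceptance test can be sketched as a per-point check on a 1D profile. This is a simplified stand-in, not the authors' analysis code: clinical gamma analysis combines both criteria into a single index, whereas this toy counts a point as passing if either criterion holds, and approximates DTA by searching neighbouring measured samples.

```python
import numpy as np

def pass_rate(measured, calculated, spacing_mm, dose_tol=0.02, dta_mm=2.0):
    """Fraction of points on a 1D profile meeting a dose-difference OR
    distance-to-agreement criterion (simplified 2%/2mm-style check)."""
    m = np.asarray(measured, float)
    c = np.asarray(calculated, float)
    tol = dose_tol * m.max()               # tolerance as % of maximal dose
    dose_ok = np.abs(c - m) <= tol
    # DTA: a measured point within dta_mm whose dose matches within tol
    shift = int(round(dta_mm / spacing_mm))
    dta_ok = np.zeros_like(dose_ok)
    for k in range(-shift, shift + 1):
        rolled = np.roll(m, k)             # note: wraps at edges (toy only)
        dta_ok |= np.abs(c - rolled) <= tol
    return float((dose_ok | dta_ok).mean())
```

A calculation within 1% of measurement everywhere passes fully; a uniform 5% offset on a flat profile fails everywhere.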
Abstract:
The paper considers panel data methods for estimating ordered logit models with individual-specific correlated unobserved heterogeneity. We show that a popular approach is inconsistent, whereas some consistent and efficient estimators are available, including minimum distance and generalized method-of-moments estimators. A Monte Carlo study reveals the good properties of an alternative estimator that has not been considered in econometric applications before, is simple to implement, and is almost as efficient. An illustrative application based on data from the German Socio-Economic Panel confirms the large negative effect of unemployment on life satisfaction found in the previous literature.
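The building block of every estimator discussed here is the ordered logit likelihood, in which thresholds partition a latent index into observed categories. A minimal cross-sectional sketch (the individual-specific correlated effects that are the paper's focus are deliberately omitted; the function name is an assumption):

```python
import numpy as np

def ordered_logit_nll(beta, cuts, X, y):
    """Negative log-likelihood of a cross-sectional ordered logit:
    P(y = j | x) = F(c_j - x'b) - F(c_{j-1} - x'b), with F the logistic
    CDF, c_0 = -inf and c_J = +inf. (Correlated individual effects,
    the paper's focus, are omitted in this sketch.)"""
    F = lambda z: 1.0 / (1.0 + np.exp(-z))  # logistic CDF
    c = np.concatenate([[-np.inf], np.asarray(cuts, float), [np.inf]])
    xb = np.asarray(X, float) @ np.asarray(beta, float)
    y = np.asarray(y, int)
    p = F(c[y + 1] - xb) - F(c[y] - xb)     # cell probabilities
    return -np.log(p).sum()
```

Feeding this function to a numerical optimizer yields the maximum likelihood estimates; the panel estimators in the paper modify the objective to handle the correlated heterogeneity.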
Abstract:
Public preferences for policy are formed in a little-understood process that is not adequately described by the traditional economic theory of choice. In this paper I suggest that U.S. aggregate support for health reform can be modeled as tradeoffs among a small number of behavioral values and the stage of policy development. The theory underlying the model is based on Samuelson et al.'s (1986) work and Wilke's (1991) elaboration of it as the Greed/Efficiency/Fairness (GEF) hypothesis of motivation in the management of resource dilemmas, and on behavioral economics informed by Kahneman and Thaler's prospect theory. The model developed in this paper employs ordered probit econometric techniques applied to data derived from U.S. polls taken from 1990 to mid-2003 that measured support for health reform proposals. Outcome data are four-tiered Likert counts; independent variables are dummies representing the presence or absence of operationalizations of each behavioral variable, along with an integer representing the policy process stage. Marginal effects of each independent variable predict how support levels change on triggering that variable. Model estimation results indicate a vanishingly small likelihood that all coefficients are zero, and all variables have the signs expected from model theory. Three hypotheses were tested: support will drain from health reform policy as it becomes increasingly well-articulated and approaches enactment; reforms appealing to fairness through universal health coverage will enjoy a higher degree of support than those targeted more narrowly; health reforms calling for government operation of the health finance system will achieve lower support than those that do not. Model results support the first and last hypotheses. Contrary to expectations, universal health care proposals did not provide incremental support beyond those targeted to “deserving” populations: children, the elderly, and working families. In addition, loss of autonomy (e.g. restrictions on choice of caregiver) is found to be the “third rail” of health reform, with significantly reduced support. When applied to a hypothetical health reform in which an employer-mandated Medical Savings Account policy is the centerpiece, the model predicts support that may be insufficient for enactment. These results indicate that the method developed in the paper may prove valuable to health policy designers.
Macroeconomic Rationality and Lucas' Misperceptions Model: Further Evidence from Forty-One Countries
Abstract:
Several researchers have examined Lucas's misperceptions model as well as various propositions derived from it within a cross-section empirical framework. The cross-section approach imposes a single monetary policy regime for the entire period. Our paper innovates on existing tests of those rational expectations propositions by allowing the simultaneous effect of monetary and short run aggregate supply (oil price) shocks on output behavior and the employment of advanced panel econometric techniques. Our empirical findings, for a sample of 41 countries over 1949 to 1999, provide evidence in favor of the majority of rational expectations propositions.
Abstract:
The effect of infill walls on the behaviour of frames is widely recognized and has, for several decades now, been the subject of numerous experimental investigations. However, the analytical modeling of infilled panels and frames under in-plane loading is difficult and generally unreliable. From the point of view of the simulation technique, the models may be divided into micro-models and simplified (or macro-) models. Based on the equivalent strut approach (simplified model), this paper proposes a damage model for the characterization of masonry walls subjected to lateral cyclic loads. The model, developed along the lines of Continuum Damage Mechanics, has the advantage of explicitly including the coupling between damage and mechanical behaviour, and is thus consistent with the definition of damage as a phenomenon with mechanical consequences.
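The coupling between damage and mechanical behaviour in a scalar continuum-damage strut can be illustrated with a toy stress-strain law: stress is (1 - d) times the elastic response, with d driven irreversibly by the largest strain reached so far. The exponential damage law and all parameter values below are illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np

def strut_response(strains, E=1000.0, eps0=0.01, A=0.8):
    """Scalar continuum-damage response of an equivalent strut under a
    cyclic strain history: sigma = (1 - d) * E * eps, with damage law
    d = 1 - (eps0/r) * exp(A * (1 - r/eps0)) driven by the largest strain
    reached so far (r). Illustrative law and parameters only."""
    r = eps0                      # damage threshold / history variable
    out = []
    for eps in strains:
        r = max(r, abs(eps))      # damage is irreversible
        d = 0.0 if r <= eps0 else 1.0 - (eps0 / r) * np.exp(A * (1.0 - r / eps0))
        out.append((1.0 - d) * E * eps)
    return np.array(out)
```

Below the threshold the strut is elastic; once loaded beyond it, both the peak stress and the unloading stiffness are reduced, which is exactly the "mechanical consequence" of damage the abstract refers to.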
Abstract:
Road traffic accidents are a highly relevant social phenomenon and one of the main causes of death in developed countries. To understand this complex phenomenon, sophisticated econometric models are applied both in the academic literature and by public administrations. This thesis is devoted to the analysis of macroscopic models for road accidents in Spain. The objective of this thesis can be divided into two blocks: a. To obtain a better understanding of the road accident phenomenon through the application and comparison of two macroscopic models frequently used in this area, DRAG and UCM, applied to van-involved accidents in Spain during the period 2000-2009. The analyses were carried out within the frequentist framework using the TRIO, SAS and TRAMO/SEATS software packages. b. Model application and the selection of the most relevant variables are current research topics, and this thesis develops and applies a methodology that aims to improve, through theoretical and practical tools, the understanding of the selection and comparison of macroscopic models. Methodologies have been developed for both model selection and model comparison. The model selection methodology has been applied to fatal accidents on the road network in the period 2000-2011, and the proposed methodology for comparing macroscopic models has been applied to the frequency and severity of van-involved accidents in the period 2000-2009. The following contributions stand out from these developments: a. Deeper insight into the models through interpretation of the response variables and of the predictive power of the models. Knowledge about the behaviour of van-involved accidents has been extended in this process. b1. Development of a methodology for selecting the variables relevant to explaining the occurrence of road accidents. Taking into account the results of a), the methodological proposal is based on DRAG models, whose parameters have been estimated within the Bayesian framework and applied to fatal accident data for the years 2000-2011 in Spain. This novel methodology has been compared with dynamic regression (DR) models, the most common models for working with stochastic processes. The results are comparable, and the new proposal constitutes a methodological contribution that optimizes the model selection process at low computational cost. b2. The thesis designs a methodology for the theoretical comparison of the competing models through the joint application of Monte Carlo simulation, design of experiments and analysis of variance (ANOVA). The competing models have different structures, which affect the estimation of the effects of the explanatory variables. Taking into account the study developed in b1), this work aims to determine how to interpret, through a DRAG model, the stochastic trend component that a UCM models explicitly, since DRAG has no specific method for modelling this element. The results of this study are important for deciding whether the series needs to be differenced before modelling. b3. New algorithms have been developed to carry out the methodological exercises, implemented in different programs such as R, WinBUGS and MATLAB. The fulfilment of the objectives of the thesis through the developments described above is summarized in the following conclusions: 1. The road accident phenomenon has been analysed by means of two macroscopic models. The effects of the influencing factors differ depending on the methodology applied. The prediction results are similar, with a slight superiority of the DRAG methodology. 2. The methodology for variable and model selection provides practical results regarding the explanation of road accidents. Prediction and interpretation have also been improved through this new methodology. 3. A methodology has been implemented to deepen the knowledge of the relationship between the effect estimates of two competing models such as DRAG and UCM. A very important aspect here is the interpretation of the trend by two different models, from which very useful information has been obtained for researchers in the modelling field. The results have provided a satisfactory extension of knowledge about the modelling process and the understanding of van-involved accidents and total fatal accidents in Spain. ABSTRACT Road accidents are a very relevant social phenomenon and one of the main causes of death in industrialized countries. Sophisticated econometric models are applied in academic work and by the administrations for a better understanding of this very complex phenomenon. This thesis is thus devoted to the analysis of macro models for road accidents with application to the Spanish case. The objectives of the thesis may be divided in two blocks: a. To achieve a better understanding of the road accident phenomenon by means of the application and comparison of two of the most frequently used macro models: DRAG (demand for road use, accidents and their gravity) and UCM (unobserved components model); the application was made to van-involved accident data in Spain in the period 2000-2009. The analysis has been carried out within the frequentist framework using available state-of-the-art software: TRIO, SAS and TRAMO/SEATS. b.
Concerns about the application of the models and about the relevant input variables to be included in the model have driven the research to try to improve, by theoretical and practical means, the understanding of methodological choice and model selection procedures. The theoretical developments have been applied to fatal accidents during the period 2000-2011 and to van-involved road accidents in 2000-2009. This has resulted in the following contributions: a. Insight into the models has been gained through interpretation of the effect of the input variables on the response and of the prediction accuracy of both models. The behavior of van-involved road accidents has been explained during this process. b1. Development of an input variable selection procedure, which is crucial for an efficient choice of the inputs. Following the results of a), the procedure uses a DRAG-like model. The estimation is carried out within the Bayesian framework. The procedure has been applied to the total road accident data in Spain for the period 2000-2011. The results of the model selection procedure are compared and validated through a dynamic regression model, given that the original data have a stochastic trend. b2. A methodology for theoretical comparison between the two models through Monte Carlo simulation, computer experiment design and ANOVA. The models have different structures, and this affects the estimation of the effects of the input variables. The comparison is thus carried out in terms of the effect of the input variables on the response, which is in general different and should be related. Considering the results of the study carried out in b1), this study tries to find out how a stochastic time trend will be captured in the DRAG model, since there is no specific trend component in DRAG. Given the results of b1), the findings of this study are crucial in order to see whether the estimation of data with a stochastic component through DRAG will be valid or whether the data need a certain adjustment (typically differencing) prior to the estimation. The model comparison methodology was applied to the UCM and DRAG models, considering that, as mentioned above, the UCM has a specific trend term while DRAG does not. b3. New algorithms were developed for carrying out the methodological exercises. For this purpose different software packages (R, WinBUGS and MATLAB) were used. These objectives and contributions have resulted in the following findings: 1. The road accident phenomenon has been analyzed by means of two macro models. The effects of the influential input variables may be estimated through the models, but it has been observed that the estimates vary from one model to the other, although prediction accuracy is similar, with a slight superiority of the DRAG methodology. 2. The variable selection methodology provides very practical results as far as the explanation of road accidents is concerned. Prediction accuracy and interpretability have been improved by means of a more efficient input variable and model selection procedure. 3. Insight has been gained into the relationship between the estimates of the effects using the two models. A very relevant issue here is the role of trend in both models, from which relevant recommendations for the analyst have resulted. The results have provided a very satisfactory insight into both modeling aspects and into the understanding of the behavior of both van-involved and total fatal accidents in Spain.
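The stochastic trend at the heart of the UCM-versus-DRAG comparison is the local-level component: a random walk underlying the observed series. A short simulation sketch (parameter values are illustrative) shows why such data may need differencing before a regression-type model like DRAG is estimated: the level is non-stationary while the first difference is not.

```python
import numpy as np

def simulate_local_level(n, sigma_eta=0.5, sigma_eps=1.0, seed=0):
    """Simulate the local-level UCM  y_t = mu_t + eps_t,  mu_t = mu_{t-1} + eta_t.
    The random-walk component mu_t is the stochastic trend a UCM models
    explicitly and a DRAG-type regression lacks; differencing removes it."""
    rng = np.random.default_rng(seed)
    mu = np.cumsum(rng.normal(0.0, sigma_eta, n))   # stochastic trend
    return mu + rng.normal(0.0, sigma_eps, n)       # observed series

y = simulate_local_level(5000)
dy = np.diff(y)   # first difference is stationary (an MA(1) process)
```

The variance of the level grows with the sample while the differenced series stays near its theoretical variance, sigma_eta² + 2·sigma_eps², which is the kind of diagnostic that motivates differencing prior to estimation.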
Abstract:
Nowadays robots have made their way into real applications that were prohibitive and unthinkable thirty years ago. This is mainly due to the increase in computational power and to the evolution of the theoretical fields of robotics and control. Even though there is plenty of information in the current literature on these topics, it is not easy to find clear guidance on how to proceed in order to design and implement a controller for a robot. In general, the design of a controller requires a complete understanding and knowledge of the system to be controlled. Therefore, for advanced control techniques the system must first be identified. This particular objective is cumbersome and never straightforward, requiring great expertise, and certain criteria must be adopted. On the other hand, the problem of designing a controller is even more complex when dealing with Parallel Manipulators (PMs), since their closed-loop structures give rise to a highly nonlinear system. On this basis the current work is developed; it intends to summarize and gather all the concepts and experience involved in the control of a hydraulic Parallel Manipulator. The main objective of this thesis is to provide a guide covering all the steps involved in the design of advanced control techniques for PMs. The analysis of the PM under study is broken down to the core of the mechanism: the hydraulic actuators. The actuators are modeled and experimentally identified. Additionally, some considerations regarding traditional PID controllers are presented, and an adaptive controller is finally implemented. From a macro perspective, the kinematic and dynamic models of the PM are presented. Based on the model of the system, and extending the adaptive controller of the actuator, a control strategy for the PM is developed and its performance is analyzed in simulation.
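The step from an identified actuator model to an adaptive controller can be illustrated with a classic model-reference scheme. The sketch below is not the thesis controller: it uses the MIT rule to adapt a single feedforward gain for a first-order plant standing in for the identified hydraulic-actuator dynamics, with illustrative parameters throughout.

```python
import numpy as np

def mit_rule_gain(ref_signal, a=2.0, b=1.0, gamma=0.5, dt=0.01):
    """Model-reference adaptation (MIT rule) of a feedforward gain for a
    first-order plant dx/dt = -a*x + b*u (a crude stand-in for identified
    actuator dynamics; parameters are illustrative, not the thesis values).
    Reference model: dxm/dt = -a*xm + b*r;  control: u = theta * r.
    MIT rule: dtheta/dt = -gamma * e * xm, with tracking error e = x - xm."""
    x = xm = 0.0
    theta = 0.0                           # adaptive gain, should tend to 1
    for r in ref_signal:
        u = theta * r
        e = x - xm
        theta += -gamma * e * xm * dt     # gradient-descent (MIT) update
        x += (-a * x + b * u) * dt        # plant (explicit Euler)
        xm += (-a * xm + b * r) * dt      # reference model
    return theta
```

Driven by a persistently exciting sinusoid, the gain converges to the value that makes the plant match the reference model; the thesis extends this kind of actuator-level adaptation to a control strategy for the full manipulator.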
Abstract:
The number of distressed manufacturing firms increased sharply during the 2009-13 recessionary phase. Financial indebtedness traditionally plays a key role in assessing firm solvency, but contagion effects that originate from the supply chain are usually neglected in the literature. Firm interconnections, captured via the trade credit channel, represent a primary vehicle for the propagation of individual shocks, especially during an economic downturn, when liquidity tensions arise. A representative sample of 11,920 Italian manufacturing firms is considered to build a two-step econometric design, where chain reactions in terms of trade credit accumulation (i.e. default on payments to suppliers) are first analyzed by resorting to a spatial autoregressive (SAR) approach. Spatial interactions are modeled based on a unique dataset of firm-to-firm transactions registered before the outbreak of the crisis. The second step is instead a binary outcome model where trade credit chains are considered together with data on the bank-firm relationship to assess the determinants of distress likelihoods in 2009-13. Results show that outstanding trade debt is affected by the liquidity position of a firm and by positive spatial effects. Trade credit chain reactions are found to exert, in turn, a positive impact on distress likelihoods during the crisis. The latter effect is comparable in magnitude to the one exerted by individual financial rigidity, and stresses the importance of including complex interactions between firms in the analysis of solvency behavior.
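The contagion mechanism of the SAR first step can be made concrete through its reduced form: solving y = ρWy + Xβ + ε for the expected outcome shows how a shock to one firm propagates through the transaction network. The 2-firm example and function name below are purely illustrative; in the paper, W is built from pre-crisis firm-to-firm transaction data.

```python
import numpy as np

def sar_expected_outcome(W, X, beta, rho):
    """Reduced form of the spatial autoregressive (SAR) model
    y = rho*W y + X beta + eps  =>  E[y] = (I - rho*W)^{-1} X beta.
    W plays the role of the (row-normalized) firm-to-firm transaction
    matrix; this toy version just solves the linear system."""
    n = W.shape[0]
    return np.linalg.solve(np.eye(n) - rho * W, X @ beta)
```

In a two-firm network where only firm 1 carries a covariate shock, firm 2 still inherits a positive expected outcome through the spatial multiplier, which is the chain-reaction effect the abstract describes.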
Abstract:
It is well known that there is an intrinsic link between the financial and energy sectors, which can be analyzed through their spillover effects: measures of how shocks to returns in different assets affect each other's subsequent volatility in both spot and futures markets. Financial derivatives that are not only highly representative of the underlying indices but can also be traded on both the spot and futures markets include Exchange Traded Funds (ETFs), tradable spot instruments whose aim is to replicate the return of an underlying benchmark index. When ETF futures are not available to examine spillover effects, “generated regressors” may be used to construct both Financial ETF futures and Energy ETF futures. The purpose of the paper is to investigate the covolatility spillovers within and across the US energy and financial sectors in both spot and futures markets, using “generated regressors” and a multivariate conditional volatility model, namely Diagonal BEKK. The daily data used run from 1998/12/23 to 2016/4/22. The data set is analyzed in its entirety and also subdivided into three subset time periods. The empirical results show there is a significant relationship between the Financial ETF and Energy ETF in the spot and futures markets. Therefore, financial and energy ETFs are suitable for constructing a financial portfolio from an optimal risk management perspective, and also for dynamic hedging purposes.
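The Diagonal BEKK covariance recursion that underlies the covolatility-spillover analysis can be sketched directly. The recursion below is only illustrative (parameter values are made up, and in practice the matrices A, B, C are estimated by quasi-maximum likelihood); its virtue is that the conditional covariance matrices it produces are positive definite by construction.

```python
import numpy as np

def diagonal_bekk_path(eps, C, a, b):
    """Conditional covariance recursion of a bivariate diagonal BEKK model:
    H_t = C'C + A' e_{t-1} e_{t-1}' A + B' H_{t-1} B, with A = diag(a)
    and B = diag(b). Illustrative recursion only; in practice (A, B, C)
    are estimated by quasi-maximum likelihood."""
    A, B = np.diag(a), np.diag(b)
    H = C.T @ C                       # initialize at the intercept term
    path = []
    for e in eps:
        H = C.T @ C + A.T @ np.outer(e, e) @ A + B.T @ H @ B
        path.append(H.copy())
    return path
```

The off-diagonal element of H_t is the conditional covolatility between the two assets (here, the financial and energy ETF returns), and its dependence on lagged cross-products of shocks is what the spillover analysis measures.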
Abstract:
This paper provides a theoretical model of the influence of economic crises on tourism destination performance. It discusses the temporary and permanent effects of economic crises on the global market shares of tourism destinations through a series of potential transmission mechanisms based on the main economic competitiveness determinants identified in the literature. The proposed model explains the non-neutrality of economic shocks in tourism competitiveness. The model is tested on Spain's tourism industry, which is among the leaders of the global tourism sector, for the period 1970–2013 using non-linear econometric techniques. The empirical analysis confirms that the proposed model is appropriate for explaining the changes in the market positions caused by the economic crises.