977 results for Variable structure controller
Abstract:
Fractal geometry is a fundamental approach for describing the complex irregularities of the spatial structure of point patterns. The present research characterizes the spatial structure of the Swiss population distribution in the three Swiss geographical regions (Alps, Plateau and Jura) and at the entire country level. These analyses were carried out using fractal and multifractal measures for point patterns, which enabled the estimation of the degree of spatial clustering of a distribution at different scales. The Swiss population dataset is presented on a grid of points and can thus be modelled as a "point process" in which each point is characterized by its spatial location (geometrical support) and a number of inhabitants (measured variable). The fractal characterization was performed by means of the box-counting dimension, and the multifractal analysis was conducted through Rényi's generalized dimensions and the multifractal spectrum. Results showed that the four population patterns are all multifractal and present different clustering behaviours. Applying multifractal and fractal methods to different geographical regions and at different scales allowed us to quantify and describe the dissimilarities between the four structures and their underlying processes. This paper is the first Swiss geodemographic study applying multifractal methods to high-resolution data.
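A minimal sketch of the box-counting step used in this kind of analysis, for a 2-D point pattern given as an (n, 2) coordinate array; the grid sizes, the least-squares fit of log N(ε) against log(1/ε), and all variable names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def box_counting_dimension(points, n_scales=6):
    """Illustrative estimate of the box-counting dimension of a 2-D point pattern.

    points: (n, 2) array of coordinates.
    Returns the slope of log N(eps) against log(1/eps) over a range of box sizes.
    """
    pts = np.asarray(points, dtype=float)
    mins, maxs = pts.min(axis=0), pts.max(axis=0)
    extent = (maxs - mins).max()                                   # bounding square side
    sizes = extent / np.logspace(1, n_scales, num=n_scales, base=2)  # box sizes eps

    counts = []
    for eps in sizes:
        # Assign each point to a box index and count the occupied boxes.
        idx = np.floor((pts - mins) / eps).astype(int)
        counts.append(len({tuple(i) for i in idx}))

    # N(eps) ~ eps^(-D), so D is the slope of log N(eps) against log(1/eps).
    # The fit is only meaningful while the boxes remain well populated.
    slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
    return slope

# Example: points filling the unit square should give a dimension close to 2.
rng = np.random.default_rng(0)
print(box_counting_dimension(rng.random((20000, 2))))
```

The multifractal extension replaces the simple occupancy count with moments of the per-box mass (here, inhabitants per box), from which Rényi's generalized dimensions and the multifractal spectrum are obtained.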
Abstract:
Report for the scientific sojourn at James Cook University, Australia, between June and December 2007. Free convection in enclosed spaces is found widely in natural and industrial systems. It is a topic of primary interest because in many systems it provides the largest resistance to heat transfer in comparison with other heat transfer modes. In such systems the convection is driven by a density gradient within the fluid, which is usually produced by a temperature difference between the fluid and the surrounding walls. In the oil industry, oil, which has a high Prandtl number, is usually stored and transported in large tanks at temperatures high enough to keep its viscosity, and thus the pumping requirements, at a reasonable level. A temperature difference between the fluid and the walls of the container may give rise to an unsteady buoyancy force and hence to unsteady natural convection. In the initial period of cooling the natural convection regime dominates over the conduction contribution. As the oil cools down it typically becomes more viscous, and this increase in viscosity inhibits the convection. At this point the oil viscosity becomes very large and unloading the tank becomes very difficult. For this reason it is of primary interest to be able to predict the cooling rate of the oil. The general objective of this work is to develop and validate a simulation tool able to predict the cooling rates of high-Prandtl-number fluids, taking variable viscosity effects into account.
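The mechanism described above can be summarised with the Rayleigh and Prandtl numbers; the symbols below (g for gravity, β for thermal expansion, ΔT for the fluid-wall temperature difference, L for a characteristic length, ν for kinematic viscosity, α for thermal diffusivity) are introduced here for illustration and are not taken from the report:

$$
\mathrm{Ra} = \frac{g\,\beta\,\Delta T\,L^{3}}{\nu\,\alpha}, \qquad \mathrm{Pr} = \frac{\nu}{\alpha}.
$$

As the oil cools, ν rises sharply, so Ra falls and convection weakens until conduction dominates; capturing this transition is what a variable-viscosity simulation tool must do.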
Abstract:
This paper studies the quantitative implications of changes in the composition of taxes for long-run growth and expected lifetime utility in the UK economy over 1970-2005. Our setup is a dynamic stochastic general equilibrium model incorporating a detailed fiscal policy structure, and where the engine of endogenous growth is human capital accumulation. The government's spending instruments include public consumption, investment and education spending. On the revenue side, labour, capital and consumption taxes are employed. Our results suggest that if the goal of tax policy is to promote long-run growth by altering relative tax rates, then it should reduce labour taxes while simultaneously increasing capital or consumption taxes to make up for the loss in labour tax revenue. In contrast, a welfare-promoting policy would be to cut capital taxes, while concurrently increasing labour or consumption taxes to make up for the loss in capital tax revenue.
Abstract:
This paper investigates the importance of political ideology and opportunism in the choice of the tax structure. In particular, we examine the effects of cabinet ideology and elections on the distribution of the tax burden across factors of production and consumption for 21 OECD countries over the period 1970-2000, employing four alternative cabinet ideology measures and using the methodology of effective tax rates. There is evidence of both opportunistic and partisan effects on tax policies. More precisely, we find that left-wing governments rely more on capital relative to labor income taxation and that they tend to increase consumption taxes. Moreover, we find that income tax rates (but not consumption taxes) tend to be reduced in pre-electoral periods and that capital effective tax rates (defined broadly to include taxes on self-employed income) are reduced by more than effective labor tax rates.
Abstract:
This paper develops methods for Stochastic Search Variable Selection (currently popular with regression and Vector Autoregressive models) for Vector Error Correction models where there are many possible restrictions on the cointegration space. We show how this allows the researcher to begin with a single unrestricted model and either do model selection or model averaging in an automatic and computationally efficient manner. We apply our methods to a large UK macroeconomic model.
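A minimal sketch of the SSVS idea in a plain linear regression (not the vector error correction setting of the paper): a spike-and-slab prior on each coefficient and a Gibbs sampler that alternates between coefficients and inclusion indicators. All prior settings and variable names are illustrative assumptions:

```python
import numpy as np

def ssvs_gibbs(y, X, n_iter=5000, tau0=0.01, tau1=10.0, p_incl=0.5, seed=0):
    """Illustrative SSVS Gibbs sampler for y = X b + e, e ~ N(0, s2 I), with
    b_j | g_j ~ N(0, tau0^2) if g_j = 0 (spike) or N(0, tau1^2) if g_j = 1 (slab).
    Returns posterior inclusion probabilities for each column of X.
    """
    rng = np.random.default_rng(seed)
    n, k = X.shape
    gamma = np.ones(k, dtype=int)
    s2 = 1.0
    XtX, Xty = X.T @ X, X.T @ y
    incl = np.zeros(k)

    for it in range(n_iter):
        # 1. Draw beta | gamma, s2 from its Gaussian full conditional.
        prior_var = np.where(gamma == 1, tau1**2, tau0**2)
        cov = np.linalg.inv(XtX / s2 + np.diag(1.0 / prior_var))
        beta = rng.multivariate_normal(cov @ (Xty / s2), cov)

        # 2. Draw each gamma_j | beta_j: Bernoulli odds comparing slab vs spike
        #    densities at the current beta_j.
        log_slab = -0.5 * beta**2 / tau1**2 - np.log(tau1) + np.log(p_incl)
        log_spike = -0.5 * beta**2 / tau0**2 - np.log(tau0) + np.log(1 - p_incl)
        prob = 1.0 / (1.0 + np.exp(log_spike - log_slab))
        gamma = (rng.random(k) < prob).astype(int)

        # 3. Draw s2 | beta from an inverse-gamma full conditional.
        resid = y - X @ beta
        s2 = 1.0 / rng.gamma(shape=0.5 * n + 1.0, scale=2.0 / (resid @ resid + 1e-8))

        if it >= n_iter // 2:          # discard the first half as burn-in
            incl += gamma
    return incl / (n_iter - n_iter // 2)

# Example: only the first two of ten regressors matter.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))
y = 2 * X[:, 0] - 1.5 * X[:, 1] + rng.standard_normal(200)
print(np.round(ssvs_gibbs(y, X), 2))
```

In the paper's VECM setting the analogous indicators govern the many possible restrictions on the cointegration space rather than individual regression coefficients, which is what makes automatic model selection and averaging feasible there.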
Abstract:
This paper develops stochastic search variable selection (SSVS) for zero-inflated count models which are commonly used in health economics. This allows for either model averaging or model selection in situations with many potential regressors. The proposed techniques are applied to a data set from Germany considering the demand for health care. A package for the free statistical software environment R is provided.
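For orientation, one common member of this model class is the zero-inflated Poisson, which mixes a point mass at zero with a Poisson component; writing π_i for the zero-inflation probability and λ_i for the Poisson mean of observation i (both typically linked to regressors, e.g. logit(π_i) = z_i'γ, log(λ_i) = x_i'β), the probability mass function is

$$
P(y_i = 0) = \pi_i + (1 - \pi_i)\, e^{-\lambda_i}, \qquad
P(y_i = k) = (1 - \pi_i)\, \frac{e^{-\lambda_i}\,\lambda_i^{k}}{k!}, \quad k = 1, 2, \dots
$$

SSVS then places spike-and-slab priors on the elements of γ and β, so model averaging and selection operate on both equations; the specific priors used in the paper and the accompanying R package are not reproduced here.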
Abstract:
We consider a general equilibrium model à la Bhaskar (Review of Economic Studies 2002): there are complementarities across sectors, each of which comprises (many) heterogeneous monopolistically competitive firms. Bhaskar's model is extended in two directions: production requires capital, and labour markets are segmented. Labour market segmentation models the difficulties of labour migrating across international barriers (in a trade context) or from a poor region to a richer one (in a regional context), whilst the assumption of a single capital market means that capital flows freely between countries or regions. The model is solved analytically and a closed-form solution is provided. Adding labour market segmentation to Bhaskar's two-tier industrial structure allows us to study, inter alia, the impact of competition regulations on wages and financial flows in both the regional and international context, and the output, welfare and financial implications of relaxing immigration laws. The analytical approach adopted allows us not only to sign the effects of policies but also to quantify them. Introducing capital as a factor of production improves the realism of the model and refines its empirically testable implications.
Abstract:
The Scottish Parliament has the authority to make a balanced-budget expansion or contraction in public expenditure, funded by corresponding local changes in the basic rate of income tax of up to 3p in the pound. This fiscal adjustment is known as the Scottish Variable Rate of income tax, though it has never, as yet, been used. In this paper we attempt to identify the impact on aggregate economic activity in Scotland of implementing these devolved fiscal powers. This is achieved through theoretical analysis and simulation using a Computable General Equilibrium (CGE) model for Scotland. The analysis generalises the conventional Keynesian model so that negative balanced-budget multiplier values are possible, reflecting a regional "inverted Haavelmo effect". Key parameters determining the aggregate economic impact are the extent to which the Scottish Government creates local amenities valuable to the Scottish population and the extent to which this is incorporated into local wage bargaining.
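As a point of reference for the "inverted Haavelmo effect" (a textbook result, not taken from the paper): in the simplest Keynesian income-expenditure model with Y = C + I + G and C = a + c(Y − T), an equal change in spending and taxes gives

$$
dY = \frac{dG - c\,dT}{1 - c} \;\stackrel{dG\,=\,dT}{=}\; dG,
$$

i.e. a balanced-budget multiplier of exactly one. The paper's generalisation shows that once regional wage bargaining and the amenity value of public spending are modelled, this multiplier can fall below one or even become negative.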
Abstract:
This study examines the inter-industry wage structure of the organised manufacturing sector in India for the period 1973-74 to 2003-04 by estimating the growth of average real wages for production workers by industry. In order to estimate the growth rates, the study adopts a methodological framework that differs from other studies in that the time series properties of the variables concerned are closely considered in order to obtain meaningful estimates of growth that are unbiased and (asymptotically) efficient. Using wage data on 51 manufacturing industries at the three-digit level of the National Industrial Classification 1998 (India), our estimation procedure obtains estimates of growth of real wages per worker that are deterministic in nature by accounting for any potential structural break(s). Our findings show that the inter-industry wage structure in India changed considerably over the period 1973-74 to 2003-04, and they provide some evidence that inter-industry wage differences have become more pronounced in the post-reforms period. This paper thus provides new evidence from India on the need to consider the hypothesis that industry affiliation is potentially an important determinant of wages when studying any relationship between reforms and wages.
Abstract:
This study addresses the issue of the presence of a unit root in growth rate estimation by the least-squares approach. We argue that when the log of a variable contains a unit root, i.e., it is not stationary, then the growth rate estimate from the log-linear trend model is not a valid representation of the actual growth of the series. In fact, under such a situation, we show that the growth of the series is the cumulative impact of a stochastic process. As such, the growth estimate from such a model is just a spurious representation of the actual growth of the series, which we refer to as a "pseudo growth rate". Hence such an estimate should be interpreted with caution. On the other hand, we highlight that the statistical representation of a series as containing a unit root is not easy to separate from an alternative description which represents the series as fundamentally deterministic (no unit root) but containing a structural break. In search of a way around this, our study presents a survey of both the theoretical and empirical literature on unit root tests that take possible structural breaks into account. We show that when a series is trend-stationary with breaks, it is possible to use the log-linear trend model to obtain well-defined estimates of growth rates for sub-periods which are valid representations of the actual growth of the series. Finally, to highlight the above issues, we carry out an empirical application in which we estimate meaningful growth rates of real wages per worker for 51 industries from the organised manufacturing sector in India for the period 1973-2003, which are not only unbiased but also asymptotically efficient. We use these growth rate estimates to highlight the evolving inter-industry wage structure in India.
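A minimal sketch of the two ingredients discussed above, assuming an annual series of real wage levels; the particular unit root test (ADF), the absence of break dates, and all names are illustrative choices rather than the authors' procedure:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

def trend_growth_rate(y, t=None):
    """Growth rate implied by the log-linear trend model log(y_t) = a + b*t + e_t.

    Only meaningful when log(y) is trend-stationary; if it contains a unit root,
    the fitted slope is a 'pseudo growth rate' in the sense discussed above.
    """
    ly = np.log(np.asarray(y, dtype=float))
    if t is None:
        t = np.arange(len(ly))
    b, a = np.polyfit(t, ly, 1)          # slope and intercept of the log-linear trend
    return np.exp(b) - 1.0               # per-period growth rate implied by the slope

# Illustrative series: 3% trend growth plus stationary noise around the trend.
rng = np.random.default_rng(0)
T = 31                                    # e.g. annual data, 1973-2003
wages = 100 * np.exp(0.03 * np.arange(T) + 0.02 * rng.standard_normal(T))

# ADF test on log(wages): rejecting the unit-root null (small p-value) supports
# treating the series as trend-stationary, so the trend growth rate is meaningful.
stat, pvalue, *_ = adfuller(np.log(wages), regression="ct")
print(f"ADF p-value: {pvalue:.3f}, trend growth: {trend_growth_rate(wages):.3%}")
```

With breaks, the same trend regression would be run separately on each sub-period identified by a break-aware unit root test, which is the route the study takes to obtain sub-period growth rates.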
Abstract:
Spatial econometrics has been criticized by some economists because some model specifications have been driven by data-analytic considerations rather than having a firm foundation in economic theory. In particular this applies to the so-called W matrix, which is integral to the structure of endogenous and exogenous spatial lags and to spatial error processes, and which is almost the sine qua non of spatial econometrics. Moreover, it has been suggested that the significance of a spatially lagged dependent variable involving W may be misleading, since it may simply be picking up the effects of omitted spatially dependent variables, incorrectly suggesting the existence of a spillover mechanism. In this paper we review the theoretical and empirical rationale for network dependence and spatial externalities as embodied in spatially lagged variables, arguing that failing to acknowledge their presence at least leads to biased inference, can be a cause of inconsistent estimation, and leads to an incorrect understanding of true causal processes.
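For concreteness, a minimal sketch of how a W matrix and the associated spatial lag are often constructed in practice (a row-standardised k-nearest-neighbour matrix; the choice of k, the distance metric and the row standardisation are common conventions used here for illustration, not prescriptions from the paper):

```python
import numpy as np

def knn_weight_matrix(coords, k=4):
    """Row-standardised k-nearest-neighbour spatial weight matrix W.

    coords: (n, 2) array of locations. W[i, j] = 1/k if j is one of the k nearest
    neighbours of i, else 0, so each row sums to one.
    """
    coords = np.asarray(coords, dtype=float)
    n = coords.shape[0]
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # a unit is not its own neighbour
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(d[i])[:k]] = 1.0 / k
    return W

# The spatial lag Wy averages each unit's neighbours' outcomes; including it as a
# regressor gives the spatially lagged dependent variable discussed above.
rng = np.random.default_rng(0)
coords = rng.random((50, 2))
y = rng.standard_normal(50)
W = knn_weight_matrix(coords)
print((W @ y)[:5])
```

The critique summarised above is precisely that such a W is a modelling convention, so a significant coefficient on Wy may reflect omitted spatially correlated variables rather than a genuine spillover mechanism.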
Abstract:
This paper considers the instrumental variable regression model when there is uncertainty about the set of instruments, exogeneity restrictions, the validity of identifying restrictions and the set of exogenous regressors. This uncertainty can result in a huge number of models. To avoid statistical problems associated with standard model selection procedures, we develop a reversible jump Markov chain Monte Carlo algorithm that allows us to do Bayesian model averaging. The algorithm is very flexible and can be easily adapted to analyze any of the different priors that have been proposed in the Bayesian instrumental variables literature. We show how to calculate the probability of any relevant restriction (e.g. the posterior probability that over-identifying restrictions hold) and discuss diagnostic checking using the posterior distribution of discrepancy vectors. We illustrate our methods in a returns-to-schooling application.
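Each model visited by such an algorithm corresponds to one choice of instruments and exogenous regressors. Purely to fix ideas about what a single model in that space looks like, here is a minimal frequentist two-stage least squares estimator (not the Bayesian estimator of the paper; names and data are illustrative):

```python
import numpy as np

def two_stage_least_squares(y, X, Z):
    """2SLS estimate of beta in y = X beta + u, using instruments Z.

    X: (n, k) regressors (possibly endogenous), Z: (n, m) instruments with m >= k.
    """
    X, Z, y = np.asarray(X), np.asarray(Z), np.asarray(y)
    P_Z = Z @ np.linalg.solve(Z.T @ Z, Z.T)             # projection onto instrument space
    X_hat = P_Z @ X                                     # first stage: fitted regressors
    return np.linalg.solve(X_hat.T @ X, X_hat.T @ y)    # second stage

# Example: one endogenous regressor (a schooling-type setting), two instruments.
rng = np.random.default_rng(0)
n = 1000
z = rng.standard_normal((n, 2))
u = rng.standard_normal(n)
x = z @ np.array([1.0, 0.5]) + 0.8 * u + rng.standard_normal(n)   # endogenous regressor
y = 1.5 * x + u
print(two_stage_least_squares(y, x[:, None], z))   # close to 1.5, unlike biased OLS
```

The model-averaging question is then which variables belong in X and Z and whether the over-identifying restrictions implied by the extra instruments hold; the reversible jump sampler explores that space rather than fixing a single specification.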
Abstract:
We establish a Quillen model structure on simplicial (symmetric) multicategories. It extends the model structure on simplicial categories due to J. Bergner [2]. We observe that our technique of proof enables us to prove a similar result for (symmetric) multicategories enriched over monoidal model categories other than simplicial sets. Examples include small categories, simplicial abelian groups and compactly generated Hausdorff spaces.
Abstract:
We construct a cofibrantly generated Thomason model structure on the category of small n-fold categories and prove that it is Quillen equivalent to the standard model structure on the category of simplicial sets. An n-fold functor is a weak equivalence if and only if the diagonal of its n-fold nerve is a weak equivalence of simplicial sets. We introduce an n-fold Grothendieck construction for multisimplicial sets, and prove that it is a homotopy inverse to the n-fold nerve. As a consequence, the unit and counit of the adjunction between simplicial sets and n-fold categories are natural weak equivalences.