879 results for Panel Data Model
Abstract:
This paper analyzes whether differences in institutional structures on capital markets help to explain why some OECD countries, in particular the Anglo-Saxon countries, have been much more successful over the last two decades in producing employment growth and in reducing unemployment than most continental-European OECD countries. It is argued that the often-blamed labor market rigidities alone, while important, do not satisfactorily explain these differences across countries and over time. Financial constraints are potentially important obstacles to creating new firms and jobs, and thus to coping well with structural change and moving successfully toward the "new economy". Highly developed venture capital markets should help to alleviate such financial constraints. This view that labor-market institutions should be supplemented by capital-market imperfections in explaining differences in employment performance is supported by our panel data analysis, in which venture capital turns out to be a significant institutional variable.
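To make the econometric setup concrete, here is a minimal sketch of a two-way fixed-effects (within) panel estimator of the kind such a study might use, regressing employment growth on a venture-capital variable and a labor-rigidity index. All variable names and data are illustrative toys, not the paper's dataset.

    # Two-way fixed-effects (within) estimator on a toy country-year panel.
    # Variable names and data are hypothetical.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    countries, years = 20, 25
    df = pd.DataFrame({
        "country": np.repeat(np.arange(countries), years),
        "year": np.tile(np.arange(years), countries),
    })
    df["venture_capital"] = rng.gamma(2.0, 0.5, len(df))   # VC investment share (toy)
    df["labor_rigidity"] = rng.normal(0.0, 1.0, len(df))   # rigidity index (toy)
    df["emp_growth"] = (0.8 * df["venture_capital"]
                        - 0.3 * df["labor_rigidity"]
                        + rng.normal(0.0, 1.0, len(df)))

    def within(s):
        # Demean by country and by year (balanced panel), absorbing both fixed effects
        return (s - s.groupby(df["country"]).transform("mean")
                  - s.groupby(df["year"]).transform("mean") + s.mean())

    y = within(df["emp_growth"])
    X = np.column_stack([within(df["venture_capital"]), within(df["labor_rigidity"])])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(dict(zip(["venture_capital", "labor_rigidity"], beta.round(3))))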
Abstract:
The work of Russell Dalton has undoubtedly played a seminal role in the study of the relationship between political sophistication and partisan dealignment. We further acknowledge the consensus that levels of partisanship have declined in Germany. Using panel data as well as pooled cross-sectional observations, however, it is clear that generational replacement is not the sole driving force of partisan dealignment; period effects should also be taken into account. While at the aggregate level rising political sophistication has coincided with declining partisanship, individual-level analysis clearly suggests that the least sophisticated are the most likely to feel alienated from the party system. We close with some specific suggestions on how to address the democratic consequences of declining levels of partisanship.
Abstract:
Party identification is traditionally seen as an important linkage mechanism connecting voters to the party system. Previous analyses have suggested that the level of party identification is in decline in Germany, and in this article we first extend previous observations with more recent data. These suggest that the erosion of party identification continues to the present. An age-period-cohort analysis of data from the German Socio-Economic Panel (SOEP) suggests that period effects are significantly negative. Furthermore, it can be observed that throughout the 1992-2009 observation period, education level and political interest have become more important determinants of party identification. Contrary to some of the literature, therefore, it can be shown that the loss of party identification is concentrated among groups with lower levels of political sophistication, indicating that the socio-economic profile of the group with a sense of party identification has become more distinct from the population as a whole. In the discussion, we investigate the theoretical and democratic consequences of this trend.
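For illustration, a hedged sketch of an age-period-cohort-style logit for party identification follows. The variable names (party_id, period, cohort, education, pol_interest) and the simulated data are hypothetical, not SOEP variables; age is omitted so the model stays identified despite the age = period - cohort dependence.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 4000
    df = pd.DataFrame({
        "cohort": rng.integers(1930, 1990, n),    # birth year (toy)
        "period": rng.integers(1992, 2010, n),    # survey year (toy)
        "education": rng.integers(1, 4, n),       # 1 = low .. 3 = high
        "pol_interest": rng.integers(1, 5, n),    # 1 = none .. 4 = strong
    })
    # Toy data-generating process with a negative period effect, echoing the abstract
    logit_p = (-0.05 * (df["period"] - 1992)
               + 0.4 * df["education"] + 0.3 * df["pol_interest"] - 1.5)
    df["party_id"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

    # Age is excluded, so linear period and cohort terms remain identified
    model = smf.logit("party_id ~ period + cohort + C(education) + pol_interest",
                      data=df).fit(disp=0)
    print(model.params.round(3))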
Abstract:
In this manuscript we describe the experimental procedure employed at the Alfred Wegener Institute in Germany in preparing the simulations for the Pliocene Model Intercomparison Project (PlioMIP). We describe the Community Earth System Models setup used (COSMOS, version COSMOS-landveg r2413, 2009) and document the procedures applied to transfer the Pliocene Research, Interpretation and Synoptic Mapping (PRISM) Project mid-Pliocene reconstruction into model forcing fields. The model setup and spin-up procedure are described for both the paleo- and preindustrial (PI) time slices of PlioMIP experiments 1 and 2, and general results that depict the performance of our model setup under mid-Pliocene conditions are presented. The mid-Pliocene, as simulated with our COSMOS setup and PRISM boundary conditions, is both warmer and wetter in the global mean than the PI. The globally averaged annual mean surface air temperature in the mid-Pliocene standalone atmosphere (fully coupled atmosphere-ocean) simulation is 17.35 °C (17.82 °C), which implies a warming of 2.23 °C (3.40 °C) relative to the respective PI control simulation.
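The globally averaged temperatures quoted above rest on an area-weighted mean over the model grid, where each cell's weight scales with the cosine of its latitude. A minimal sketch with a toy temperature field (not actual COSMOS output):

    import numpy as np

    lat = np.linspace(-89.5, 89.5, 180)          # grid-cell centre latitudes
    lon = np.linspace(0.5, 359.5, 360)
    # Toy surface air temperature field (degrees C): warm equator, cold poles
    sat = 28.0 * np.cos(np.deg2rad(lat))[:, None] - 10.0 + np.zeros((180, 360))

    weights = np.cos(np.deg2rad(lat))            # cell area scales with cos(latitude)
    global_mean = np.average(sat.mean(axis=1), weights=weights)
    print(f"global mean SAT: {global_mean:.2f} degrees C")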
Abstract:
This study empirically investigates the relationship between agency costs and the internal monitoring measures available to Brazilian investors in domestic companies, using a sample of listed companies between 2010 and 2014, totaling 134 firms and 536 observations. To measure this relationship, the internal monitoring variables used were information on executives' variable compensation, including the use of stock option grants; the composition of the board of directors, with emphasis on the representation of independent directors and on Chairman-CEO duality; and the percentage of the companies' share capital owned by executives. As proxies for agency costs, the Asset Turnover Ratio and General & Administrative Expenses (G&A) as a percentage of net revenue were used. In this context, two research hypotheses were established and panel regression models controlled for time and firm fixed effects were estimated, employing the agency-cost proxies as dependent variables and firm leverage and size as control variables. The results show that, in the selected sample, there is a positive and significant relationship between the share of variable compensation and the agency-cost proxies, contrary to what was originally expected. It is thus concluded that companies with a larger variable component in total executive compensation incur higher agency costs, which suggests that such tools are not good strategies for aligning the interests of executives and shareholders. The remaining internal monitoring variables were not significant.
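As a concrete illustration of the estimation strategy, here is a hedged least-squares-dummy-variable (LSDV) sketch of a panel regression with firm and year fixed effects, using an agency-cost proxy as the dependent variable. The toy data and coefficients are invented; they do not reproduce the study's sample.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    firms, years = 134, 4                          # 536 observations, as in the paper
    df = pd.DataFrame({
        "firm": np.repeat(np.arange(firms), years),
        "year": np.tile(np.arange(2010, 2014), firms),  # toy balanced panel
        "variable_pay_share": rng.uniform(0.0, 0.7, firms * years),
        "leverage": rng.uniform(0.0, 1.0, firms * years),
        "size": rng.normal(14.0, 2.0, firms * years),   # log of assets (toy)
    })
    # Toy DGP with a positive coefficient on variable pay, as the study reports
    df["asset_turnover"] = (0.5 + 0.4 * df["variable_pay_share"]
                            - 0.1 * df["leverage"] + rng.normal(0.0, 0.2, len(df)))

    res = smf.ols("asset_turnover ~ variable_pay_share + leverage + size"
                  " + C(firm) + C(year)", data=df).fit()
    print(res.params[["variable_pay_share", "leverage", "size"]].round(3))

With only a few years per firm, this dummy-variable form and the within (demeaning) estimator give identical slope estimates; LSDV is simply more transparent in a sketch.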
Abstract:
The paper presents a computational system based upon formal principles to run spatial models for environmental processes. The simulator is named SimuMap because it is typically used to simulate spatial processes over a mapped representation of terrain. A model is formally represented in SimuMap as a set of coupled sub-models. The paper considers the situation where spatial processes operate at different time levels but are still integrated. An example of such a situation commonly occurs in watershed hydrology, where overland flow and stream channel flow have very different flow rates but are highly related, as they are subject to the same terrain runoff processes. SimuMap is able to run a network of sub-models that express different time-space derivatives for water flow processes. Sub-models may be coded generically with a map algebra programming language that uses a surface data model. To address the problem of differing time levels in simulation, the paper: (i) reviews general approaches for numerical solvers, (ii) considers the constraints that need to be enforced to use more adaptive time steps in discrete-time simulations, and (iii) discusses the scaling of transfer rates in equations that use different time bases for their time-space derivatives. A multistep scheme is proposed for SimuMap and is presented along with a description of its visual programming interface, its modelling formalisms, and future plans.
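To illustrate the differing-time-levels problem, here is a schematic multirate sketch, assuming a slow "overland" store coupled to a fast "channel" store: the fast process is sub-stepped, and its rate constant, expressed per hour, is rescaled to the outer loop's daily time base. The equations and constants are illustrative, not SimuMap's actual formulation.

    DT_SLOW = 1.0            # outer step: 1 day
    SUBSTEPS = 24            # inner step: 1 hour
    DT_FAST = DT_SLOW / SUBSTEPS

    k_over = 0.2             # overland drainage rate, per day
    k_chan = 0.5             # channel outflow rate, per hour
    k_chan_per_day = k_chan * 24.0   # transfer rate rescaled to the common time base

    overland, channel = 100.0, 0.0
    for day in range(5):
        outflow = k_over * overland * DT_SLOW   # slow process: one explicit Euler step
        overland -= outflow
        inflow_per_substep = outflow / SUBSTEPS
        for _ in range(SUBSTEPS):               # fast process: sub-stepped
            channel += inflow_per_substep - k_chan_per_day * channel * DT_FAST
        print(f"day {day + 1}: overland={overland:6.2f}  channel={channel:6.2f}")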
Abstract:
Risk-ranking protocols are used widely to classify the conservation status of the world's species. Here we report on the first empirical assessment of their reliability by using a retrospective study of 18 pairs of bird and mammal species (one species extinct and the other extant) with eight different assessors. The performance of individual assessors varied substantially, but performance was improved by incorporating uncertainty in parameter estimates and consensus among the assessors. When this was done, the ranks from the protocols were consistent with the extinction outcome in 70-80% of pairs and there were mismatches in only 10-20% of cases. This performance was similar to the subjective judgements of the assessors after they had estimated the range and population parameters required by the protocols, and better than any single parameter. When used to inform subjective judgement, the protocols therefore offer a means of reducing unpredictable biases that may be associated with expert input and have the advantage of making the logic behind assessments explicit. We conclude that the protocols are useful for forecasting extinctions, although they are prone to some errors that have implications for conservation. Some level of error is to be expected, however, given the influence of chance on extinction. The performance of risk assessment protocols may be improved by providing training in the application of the protocols, incorporating uncertainty in parameter estimates and using consensus among multiple assessors, including some who are experts in the application of the protocols. Continued testing and refinement of the protocols may help to provide better absolute estimates of risk, particularly by re-evaluating how the protocols accommodate missing data.
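A small sketch of the consensus idea: each of eight assessors assigns a threat rank to both members of an extinct/extant pair, and a pair counts as consistent when the consensus (median) rank of the extinct species exceeds that of its extant partner. The data are simulated, not the study's 18 pairs.

    import numpy as np

    rng = np.random.default_rng(3)
    pairs, assessors = 18, 8
    # Ranks 1 (least threatened) .. 5 (critically endangered); extinct species
    # tend to draw higher ranks, with assessor-to-assessor noise.
    extinct = np.clip(rng.normal(4.0, 1.0, (pairs, assessors)).round(), 1, 5)
    extant = np.clip(rng.normal(2.5, 1.0, (pairs, assessors)).round(), 1, 5)

    consensus_ex = np.median(extinct, axis=1)    # consensus across assessors
    consensus_et = np.median(extant, axis=1)
    consistent = np.mean(consensus_ex > consensus_et)
    mismatch = np.mean(consensus_ex < consensus_et)
    print(f"consistent pairs: {consistent:.0%}, mismatches: {mismatch:.0%}")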
Abstract:
This paper examines the causal links between productivity growth and two price series, domestic inflation and the price of mineral products, in Australia's mining sector for the period 1968/1969 to 1997/1998. The study also uses a stochastic translog cost frontier to generate improved estimates of total factor productivity (TFP) growth. The results indicate negative unidirectional causality running from both price series to mining productivity growth. Regression analysis further shows that domestic inflation has a small but adverse effect on mining productivity growth, thus providing some empirical support for Australia's 'inflation first' monetary policy, at least with respect to the mining sector. Inflation in mineral prices, on the other hand, has a greater negative effect on mining productivity growth via mineral export growth.
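A minimal sketch of the causality test, assuming simulated annual series in place of the 1968/69-1997/98 data: statsmodels' grangercausalitytests checks whether lagged inflation improves forecasts of TFP growth.

    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(4)
    n = 30                                     # roughly 30 annual observations
    inflation = rng.normal(5.0, 2.0, n)
    tfp_growth = np.empty(n)
    tfp_growth[0] = 1.0
    for t in range(1, n):                      # toy DGP: lagged inflation depresses TFP growth
        tfp_growth[t] = 1.5 - 0.15 * inflation[t - 1] + rng.normal(0.0, 0.5)

    # Column order matters: the test asks whether column 2 Granger-causes column 1
    data = np.column_stack([tfp_growth, inflation])
    res = grangercausalitytests(data, maxlag=2, verbose=False)
    print("lag-1 F-test p-value:", round(res[1][0]["ssr_ftest"][1], 3))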
Abstract:
Optimal intertemporal investment behaviour of Australian pastoralists is modelled using panel data for the period 1979-1993. Results indicate that quasi-fixity of inputs of labour, capital, sheep numbers and cattle numbers is characteristic of production in the pastoral region. It takes about two years for labour, four years for capital and a little over two years for both sheep numbers and cattle numbers to adjust towards long-run optimal levels. Results also indicate that, after accounting for adjustment costs, own-price product supply and input demand responses are inelastic in both the short and long run.
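The adjustment horizons quoted above follow from partial-adjustment logic: with X_t = X_{t-1} + lam * (X_star - X_{t-1}), the remaining gap to the long-run optimum shrinks by a factor (1 - lam) each year. A small sketch, with lam values chosen purely so the horizons echo the abstract's figures, not taken from the paper:

    import numpy as np

    def years_to_close(lam, share=0.95):
        """Years until `share` of the gap to the long-run optimum is closed."""
        return np.log(1 - share) / np.log(1 - lam)

    for name, lam in [("labour", 0.8), ("capital", 0.5), ("sheep/cattle", 0.75)]:
        print(f"{name}: ~{years_to_close(lam):.1f} years to close 95% of the gap")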
Abstract:
Chambers and Quiggin (2000) use state-contingent representations of risky production technologies to establish important theoretical results concerning producer behavior under uncertainty. Unfortunately, perceived problems in the estimation of state-contingent models have limited the usefulness of the approach in policy formulation. We show that fixed and random effects state-contingent production frontiers can be conveniently estimated in a finite mixtures framework. An empirical example is provided. Compared to conventional estimation approaches, we find that estimating production frontiers in a state-contingent framework produces significantly different estimates of elasticities, firm technical efficiencies, and other quantities of economic interest.
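To make the finite-mixtures idea concrete, here is a compact EM sketch in which each observation is drawn from one of two latent "states of nature", each with its own linear frontier. This is a generic mixture-of-regressions estimator on invented data, not the authors' exact specification.

    import numpy as np

    rng = np.random.default_rng(5)
    n = 400
    x = rng.uniform(0.0, 10.0, n)
    state = rng.integers(0, 2, n)                     # latent state of nature (toy)
    true_beta = np.array([[1.0, 0.9], [0.2, 0.4]])    # (intercept, slope) per state
    y = true_beta[state, 0] + true_beta[state, 1] * x + rng.normal(0.0, 0.3, n)

    X = np.column_stack([np.ones(n), x])
    beta = np.array([[0.0, 1.0], [0.5, 0.5]])         # initial guesses
    sigma = np.array([1.0, 1.0])
    pi = np.array([0.5, 0.5])

    for _ in range(200):                              # EM iterations
        # E-step: responsibility of each state for each observation
        dens = np.stack([pi[k] * np.exp(-0.5 * ((y - X @ beta[k]) / sigma[k]) ** 2)
                         / sigma[k] for k in range(2)])
        resp = dens / dens.sum(axis=0)
        # M-step: weighted least squares and variance update per state
        for k in range(2):
            w = np.sqrt(resp[k])
            beta[k] = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)[0]
            sigma[k] = np.sqrt(np.sum(resp[k] * (y - X @ beta[k]) ** 2) / resp[k].sum())
        pi = resp.mean(axis=1)

    print("estimated (intercept, slope) per state:", beta.round(2))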
Abstract:
Integrating information in the molecular biosciences involves more than the cross-referencing of sequences or structures. Experimental protocols, results of computational analyses, annotations and links to relevant literature form integral parts of this information, and impart meaning to sequence or structure. In this review, we examine some existing approaches to integrating information in the molecular biosciences. We consider not only technical issues concerning the integration of heterogeneous data sources and the corresponding semantic implications, but also the integration of analytical results. Within the broad range of strategies for integration of data and information, we distinguish between platforms and developments. We discuss two current platforms and six current developments, and identify what we believe to be their strengths and limitations. We identify key unsolved problems in integrating information in the molecular biosciences, and discuss possible strategies for addressing them including semantic integration using ontologies, XML as a data model, and graphical user interfaces as integrative environments.
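As a toy illustration of "XML as a data model", the snippet below encodes a sequence record with a computational annotation and a literature link. The element and attribute names are invented for illustration and follow no particular bioinformatics schema.

    import xml.etree.ElementTree as ET

    record = ET.Element("sequence_record", id="P12345")
    ET.SubElement(record, "sequence").text = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
    ann = ET.SubElement(record, "annotation", source="computational")
    ann.text = "predicted DNA-binding domain"
    ET.SubElement(record, "literature", pmid="00000000")  # placeholder identifier

    print(ET.tostring(record, encoding="unicode"))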
Abstract:
Land-surface processes encompass a broad class of models that operate at a landscape scale. Current modelling approaches tend to be specialised towards one type of process, yet it is the interaction of processes that is increasingly seen as important to obtain a more integrated approach to land management. This paper presents a technique and a tool that may be applied generically to landscape processes. The technique tracks moving interfaces across landscapes for processes such as water flow, biochemical diffusion, and plant dispersal. Its theoretical development applies a Lagrangian approach to motion over a Eulerian grid space by tracking quantities across a landscape as an evolving front. An algorithm for this technique, called the level set method, is implemented in a geographical information system (GIS). It fits with a field data model in GIS and is implemented as operators in map algebra. The paper describes an implementation of the level set method in a map algebra programming language, called MapScript, and gives example program scripts for applications in ecology and hydrology.
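A minimal level set sketch: the front is the zero contour of a field phi, expanded at constant speed F via the upwind update phi -= dt * F * |grad phi| on a raster grid. This illustrates the interface-tracking idea on a field data model; it is not MapScript code.

    import numpy as np

    n, dt, F = 100, 0.5, 1.0
    yy, xx = np.mgrid[0:n, 0:n]
    phi = np.hypot(xx - n / 2, yy - n / 2) - 10.0   # signed distance to a circle, radius 10

    for _ in range(40):
        # One-sided differences for an upwind approximation of |grad phi| (F > 0)
        dx_m = phi - np.roll(phi, 1, axis=1)
        dx_p = np.roll(phi, -1, axis=1) - phi
        dy_m = phi - np.roll(phi, 1, axis=0)
        dy_p = np.roll(phi, -1, axis=0) - phi
        grad = np.sqrt(np.maximum(dx_m, 0) ** 2 + np.minimum(dx_p, 0) ** 2
                       + np.maximum(dy_m, 0) ** 2 + np.minimum(dy_p, 0) ** 2)
        phi -= dt * F * grad

    front_area = np.sum(phi < 0)                     # cells inside the expanded front
    print(f"cells inside front after 40 steps: {front_area}")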
Abstract:
Our extensive research has indicated that high-school teachers are reluctant to make use of existing instructional educational software (Pollard, 2005). Even software developed in a partnership between a teacher and a software engineer is unlikely to be adopted by teachers outside the partnership (Pollard, 2005). In this paper we address these issues directly by adopting a reusable architectural design for instructional educational software which allows easy customisation to meet the specific needs of individual teachers. By doing this we aim to help more teachers use instructional technology regularly within their classrooms. Our domain-specific software architecture, Interface-Activities-Model, was designed specifically to facilitate individual customisation by redefining and restructuring what constitutes an object, so that objects can be readily reused or extended as required. The key to this architecture is the way in which the software is broken into small, generic, encapsulated components with minimal domain-specific behaviour. The domain-specific behaviour is decoupled from the interface and encapsulated in objects which relate to the instructional material through tasks and activities. The domain model is also broken into two distinct models: an Application State Model and a Domain-specific Data Model. This decoupling and distribution of control gives the software designer enormous flexibility in modifying components without affecting other sections of the design. This paper sets the context of this architecture, describes it in detail, and applies it to an actual application developed to teach high-school mathematical concepts.
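A schematic sketch of the decoupling described above: a generic interface component knows nothing about the domain, while an Activity object mediates between it and the two separate models. All class and method names are illustrative, inferred from the abstract rather than taken from the system.

    class ApplicationStateModel:
        """Tracks session state (current task, progress) - no domain knowledge."""
        def __init__(self):
            self.current_task = None

    class DomainDataModel:
        """Holds domain-specific content, e.g. maths exercises."""
        def __init__(self, exercises):
            self.exercises = exercises

    class GenericButton:
        """Generic encapsulated component: fires a callback, nothing more."""
        def __init__(self, label, on_click):
            self.label, self.on_click = label, on_click
        def click(self):
            self.on_click()

    class Activity:
        """Encapsulates domain behaviour, wiring the interface to both models."""
        def __init__(self, state, data):
            self.state, self.data = state, data
        def next_exercise(self):
            idx = 0 if self.state.current_task is None else self.state.current_task + 1
            self.state.current_task = idx
            print("showing:", self.data.exercises[idx % len(self.data.exercises)])

    # Usage: swapping in a different DomainDataModel customises the software
    # for another teacher without touching the interface components.
    activity = Activity(ApplicationStateModel(), DomainDataModel(["2x+3=7", "x^2=9"]))
    GenericButton("Next", activity.next_exercise).click()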