849 results for Forward looking
Abstract:
Over the past seventeen years Canada has decentralized many social programmes, moving responsibility from the federal government to its 13 provinces and territories through bilateral federal-provincial agreements. In contrast, the European Union (EU) has moved in the opposite direction, building pan-European approaches and establishing new processes to facilitate multilateral collaboration among the 28 EU member states. This has been done through a new governance approach called the Open Method of Coordination (OMC). Using employment policy as a detailed case study, this paper explores whether Canada could learn from OMC governance ideas to rebuild a pan-Canadian dimension to employment policy and improve the performance of its intergovernmental relations system. Concrete lessons for Canada to improve decentralized governance are suggested: consolidating the different bilateral agreements; using benchmarking instead of controls in fiscal transfers; undertaking research, analysis, and comparisons in order to facilitate mutual learning; revitalizing intergovernmental structures in light of devolution; and engaging social partners, civil society and other stakeholders. Post-devolution Canada is not doing badly in managing employment policy, but it could do better. Looking to the EU for ideas on new ways to collaborate provides a chance to set a forward-looking agenda that could ultimately result not only in better labour market outcomes, but also in improvements to one small part of Canada’s often fractious federation.
Abstract:
This paper empirically investigates the extent to which the European Central Bank has responded to evolving economic conditions in its member states as opposed to the euro area as a whole. Based on a forward-looking Taylor rule-type policy reaction function, we conduct counterfactual exercises that compare the monetary policy behavior of the ECB with two alternative hypothetical scenarios: (1) were the euro member states to make individual policy decisions, and (2) were the ECB to respond to the economic conditions of individual members. The results reflect the extent of heterogeneity among the national economies in the monetary union and indicate that the ECB's monetary policy rates have been particularly close to the "counterfactual" interest rates of its largest euro members, as well as of countries with similar economic conditions, a group that includes Germany, Austria, Belgium, and France.
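For readers unfamiliar with the methodology, the sketch below shows a generic forward-looking Taylor-type reaction function of the kind the paper estimates; the coefficients, smoothing parameter, and input figures are illustrative assumptions, not the paper's estimates.

    # Generic forward-looking Taylor-type rule with interest-rate smoothing.
    # All parameter values are textbook-style assumptions for illustration only.
    def taylor_rate(prev_rate, expected_inflation, output_gap,
                    r_star=2.0, pi_star=2.0, beta=1.5, gamma=0.5, rho=0.8):
        """Implied policy rate (percent): smoothed response to expected
        inflation deviations and the output gap."""
        target = (r_star + pi_star
                  + beta * (expected_inflation - pi_star)
                  + gamma * output_gap)
        return rho * prev_rate + (1 - rho) * target

    # Counterfactual idea from the abstract: feed in one member state's expected
    # inflation and output gap instead of euro-area aggregates and compare paths.
    euro_area = taylor_rate(prev_rate=3.0, expected_inflation=2.1, output_gap=0.3)
    member_cf = taylor_rate(prev_rate=3.0, expected_inflation=1.8, output_gap=-0.5)
    print(round(euro_area, 2), round(member_cf, 2))

Comparing such implied paths across member states is, in spirit, the counterfactual exercise the abstract describes.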
Abstract:
With the European Parliament’s July report on the revision of the European Neighbourhood Policy (ENP) calling on the Commission to ‘go back to basics’, this article argues that such a move would be counter-productive and that instead, the ENP needs to move towards the future and break with the historical elusiveness of this overarching policy. The Riga Summit serves as an illustration of what will not suffice if the EU is to strengthen or even maintain its role in its neighbourhoods. The revision of the ENP is described as a three-dimensional process which needs to yield a concrete and forward-looking new ENP. The recommendations put forth herein map out what a truly revised ENP would entail.
Abstract:
This Policy Brief argues that the newly adopted EU temporary relocation (quota) system constitutes a welcome yet timid step forward in addressing a number of central controversies of the current refugee debate in Europe. Two main challenges affect the effective operability of the new EU relocation model. First, EU member states’ asylum systems show profound (on-the-ground) weaknesses in reception conditions and judicial/administrative capacities. These prevent a fair and humane processing of asylum applications. EU states are not implementing the common standards enshrined in the EU reception conditions Directive 2013/33. Second, the new relocation system constitutes a move away from the much-criticised Dublin system, but it is still anchored to its premises. The Dublin system is driven by an unfair and unsustainable rule according to which the first EU state of entry is responsible for assessing asylum applications. It does not properly consider the personal, private and family circumstances or the preferences of asylum-seekers.
Policy recommendations: In order to respond to these challenges, the Policy Brief offers the following recommendations. The EU should strengthen and better enforce member states’ reception capacities, abolish the current Dublin system rule of allocation of responsibility, and expand the new relocation distribution criteria to include in the assessment (as far as possible) asylum-seekers’ preferences and personal/family links to EU member states. EU member countries should give priority to boosting their current and forward-looking administrative and judicial capacities to deal with and welcome asylum applications. The EU should establish a permanent common European border and asylum service focused on ensuring the highest standards through stable operational support, institutional solidarity across all EU external borders and the practical implementation of the new relocation distribution criteria.
Abstract:
Although much has been written about Mary Shelley's Frankenstein, the part played by Erasmus Darwin (1731-1802) has been almost entirely neglected. This is odd as, apart from some ghost stories, Dr Darwin is the one influence mentioned in both the 1816 and 1831 prefaces to the book. The present contribution aims to redress that omission by showing that Darwin's ideas about spontaneous generation, his anti-establishment views, and his literary genius played a significant role in forming the 'dark and shapeless substance' surging in Mary Shelley's mind during the summer of 1816 and from which her tale of Gothic horror emerged. It is, however, ultimately ironic that Frankenstein, which warns against an overly enthusiastic use of scientific knowledge, should have been partly inspired by one of the most optimistically forward-looking of all late eighteenth-century thinkers. © 2007 Institute of Materials, Minerals and Mining.
Abstract:
The central argument of this thesis is that the nature and purpose of corporate reporting have changed over time: the corporate report has become a more outward-looking and forward-looking document designed to promote the company and its performance to a wide range of shareholders, rather than merely to report to its owners upon past performance. It is argued that the discourse of environmental accounting and reporting is one driver for this change, but that this discourse has been set up as conflicting with the discourse of traditional accounting and performance measurement. The effect of this opposition between the discourses is that the two have been interpreted to be different and incompatible dimensions of performance, with good performance along one dimension only being achievable through a sacrifice of performance along the other dimension. Thus a perceived dialectic in performance is believed to exist. One of the principal purposes of this thesis is to explore this perceived dialectic and, through analysis, to show that it does not exist and that there is no incompatibility. This exploration and analysis are based upon an investigation of the inherent inconsistencies in such corporate reports, and the analysis makes use of both a statistical analysis and a semiotic analysis of corporate reports and the reported performance of companies along these dimensions. Thus the development of a semiology of corporate reporting is one of the significant outcomes of this thesis. A further outcome is a consideration of the implications of the analysis for corporate performance and its measurement. The thesis concludes with a consideration of the way in which the advent of electronic reporting may affect the ability of organisations to maintain the dialectic and the implications for corporate reporting.
Abstract:
This thesis explores how the world-wide-web can be used to support English language teachers doing further studies at a distance. The future of education worldwide is moving towards a requirement that we, as teacher educators, use the latest web technology not as a gambit, but as a viable tool to improve learning. By examining the literature on knowledge, teacher education and web training, a model of teacher knowledge development is developed, along with statements of advice for web developers based upon the model. Next, the applicability and viability of both the model and the statements of advice are examined by developing a teacher support site (http://www.philseflsupport.com) according to these principles. The data collected from one focus group of users from sixteen different countries, all studying on the same distance Masters programme, is then analysed in depth. The outcomes from the research are threefold. A functioning website that is averaging around 15,000 hits a month provides a professional contribution. An expanded model of teacher knowledge development, based upon five theoretical principles that reflect the ever-expanding cyclical nature of teacher learning, provides an academic contribution. The third outcome is a series of six statements of advice for developers of teacher support sites. These statements are grounded in the theoretical principles behind the model of teacher knowledge development and incorporate nine keys to effective web facilitation. Taken together, they provide a forward-looking contribution to the praxis of web-supported teacher education, and thus to the potential dissemination of the research presented here. The research has succeeded in reducing the proliferation of terminology in teacher knowledge into a succinct model of teacher knowledge development. The model may now be used to further our understanding of how teachers learn and develop as other research builds upon the individual study here. NB: Appendix 4 is only available for consultation at Aston University Library with prior arrangement.
Abstract:
Mobile technology has been one of the major growth areas in computing over recent years (Urbaczewski, Valacich, & Jessup, 2003). Mobile devices are becoming increasingly diverse and are continuing to shrink in size and weight. Although this increases the portability of such devices, their usability tends to suffer. Users report high levels of frustration regarding interaction with mobile technologies, fuelled almost entirely by lack of usability (Venkatesh, Ramesh, & Massey, 2003). This will only worsen if interaction design for mobile technologies does not continue to receive increasing research attention. For the commercial benefit of mobility and mobile commerce (m-commerce) to be fully realized, users’ interaction experiences with mobile technology cannot be negative. To ensure this, it is imperative that we design the right types of mobile interaction (m-interaction); an important prerequisite for this is ensuring that users’ experience meets both their sensory and functional needs (Venkatesh, Ramesh, & Massey, 2003). Given the resource disparity between mobile and desktop technologies, successful electronic commerce (e-commerce) interface design and evaluation does not necessarily equate to successful m-commerce design and evaluation. It is, therefore, imperative that the specific needs of m-commerce are addressed, both in terms of design and evaluation. This chapter begins by exploring the complexities of designing interaction for mobile technology, highlighting the effect of context on the use of such technology. It then goes on to discuss how interaction design for mobile devices might evolve, introducing alternative interaction modalities that are likely to affect that future evolution. It is impossible, within a single chapter, to consider each and every potential mechanism for interacting with mobile technologies; to provide a forward-looking flavor of what might be possible, this chapter focuses on some more novel methods of interaction and does not, therefore, look at the typical keyboard and visual display-based interaction which, in essence, stems from the desktop interaction design paradigm. Finally, this chapter touches on issues associated with effective evaluation of m-interaction and mobile application designs. By highlighting some of the issues and possibilities for novel m-interaction design and evaluation, we hope that future designers will be encouraged to “think out of the box” in terms of their designs and evaluation strategies.
Abstract:
The predictive accuracy of competing crude-oil price forecast densities is investigated for the 1994–2006 period. Moving beyond standard ARCH type models that rely exclusively on past returns, we examine the benefits of utilizing the forward-looking information that is embedded in the prices of derivative contracts. Risk-neutral densities, obtained from panels of crude-oil option prices, are adjusted to reflect real-world risks using either a parametric or a non-parametric calibration approach. The relative performance of the models is evaluated for the entire support of the density, as well as for regions and intervals that are of special interest for the economic agent. We find that non-parametric adjustments of risk-neutral density forecasts perform significantly better than their parametric counterparts. Goodness-of-fit tests and out-of-sample likelihood comparisons favor forecast densities obtained by option prices and non-parametric calibration methods over those constructed using historical returns and simulated ARCH processes. © 2010 Wiley Periodicals, Inc. Jrl Fut Mark 31:727–754, 2011
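As a rough illustration of how option prices yield a forward-looking density, the sketch below applies the Breeden-Litzenberger relation that underlies such risk-neutral density estimates; the strikes, quotes, and rates are invented, and the paper's real-world calibration step is not reproduced here.

    # Breeden-Litzenberger idea: the risk-neutral density is proportional to the
    # second derivative of the call price with respect to the strike.
    import numpy as np

    def risk_neutral_density(strikes, call_prices, rate, maturity):
        """Finite-difference estimate of the risk-neutral pdf on interior strikes.
        Strikes must be equally spaced; call_prices are observed call quotes."""
        strikes = np.asarray(strikes, dtype=float)
        calls = np.asarray(call_prices, dtype=float)
        dk = strikes[1] - strikes[0]
        second_diff = (calls[2:] - 2 * calls[1:-1] + calls[:-2]) / dk**2
        return strikes[1:-1], np.exp(rate * maturity) * second_diff

    # Hypothetical, smoothed call quotes for illustration only.
    k = np.arange(50.0, 91.0, 5.0)
    c = np.array([20.1, 15.4, 11.0, 7.3, 4.4, 2.4, 1.2, 0.5, 0.2])
    grid, pdf = risk_neutral_density(k, c, rate=0.03, maturity=0.25)
    print(np.round(pdf, 4))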
Abstract:
Models for the conditional joint distribution of the U.S. Dollar/Japanese Yen and Euro/Japanese Yen exchange rates, from November 2001 until June 2007, are evaluated and compared. The conditional dependency is allowed to vary across time, as a function of either historical returns or a combination of past return data and option-implied dependence estimates. Using prices of currency options that are available in the public domain, risk-neutral dependency expectations are extracted through a copula representation of the bivariate risk-neutral density. For this purpose, we employ either the one-parameter "Normal" or a two-parameter "Gumbel Mixture" specification. The latter provides forward-looking information regarding the overall degree of covariation, as well as the level and direction of asymmetric dependence. Specifications that include option-based measures in their information set are found to outperform, in-sample and out-of-sample, models that rely solely on historical returns.
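A minimal sketch of the one-parameter "Normal" (Gaussian) copula idea, fitted here to simulated returns via normal scores of the ranks; the data and estimator are illustrative assumptions, not the option-implied procedure of the paper.

    # Gaussian copula correlation via normal scores of the ranks.
    import numpy as np
    from scipy.stats import norm, rankdata

    rng = np.random.default_rng(0)
    usd_jpy = rng.normal(size=500)
    eur_jpy = 0.6 * usd_jpy + 0.8 * rng.normal(size=500)   # hypothetical joint returns

    def normal_copula_corr(x, y):
        """Correlation parameter of a Gaussian copula fitted by rank transforms."""
        u = rankdata(x) / (len(x) + 1.0)     # pseudo-observations in (0, 1)
        v = rankdata(y) / (len(y) + 1.0)
        return np.corrcoef(norm.ppf(u), norm.ppf(v))[0, 1]

    print(round(normal_copula_corr(usd_jpy, eur_jpy), 3))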
Abstract:
Bankruptcy prediction has been a fruitful area of research. Univariate analysis and discriminant analysis were the first methodologies used. While they perform relatively well at correctly classifying bankrupt and nonbankrupt firms, their predictive ability has come into question over time. Univariate analysis lacks the big picture that financial distress entails. Multivariate discriminant analysis requires stringent assumptions that are violated when dealing with accounting ratios and market variables. This has led to the use of more complex models such as neural networks. While the accuracy of the predictions has improved with the use of more technical models, there is still an important point missing. Accounting ratios are the usual discriminating variables used in bankruptcy prediction. However, accounting ratios are backward-looking variables. At best, they are a current snapshot of the firm. Market variables are forward-looking variables. They are determined by discounting future outcomes. Microstructure variables, such as the bid-ask spread, also contain important information. Insiders are privy to more information than the retail investor, so if any financial distress is looming, the insiders should know before the general public. Therefore, any model in bankruptcy prediction should include market and microstructure variables. That is the focus of this dissertation. The traditional models and the newer, more technical models were tested and compared to the previous literature by employing accounting ratios, market variables, and microstructure variables. Our findings suggest that the more technical models are preferable, and that a mix of accounting and market variables is best at correctly classifying and predicting bankrupt firms. Based on the results, the multi-layer perceptron appears to be the most accurate model. The set of best discriminating variables includes price, standard deviation of price, the bid-ask spread, net income to sales, working capital to total assets, and current liabilities to total assets.
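A minimal sketch of a multi-layer perceptron classifier of the kind the dissertation finds most accurate, using placeholder features named after the variables listed above; the data, labels, and hyperparameters are hypothetical.

    # Multi-layer perceptron on standardized accounting, market, and
    # microstructure features (synthetic placeholder data).
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    # columns: price, std of price, bid-ask spread, NI/sales, WC/TA, CL/TA
    X = rng.normal(size=(300, 6))
    y = (X[:, 2] - X[:, 3] + 0.5 * rng.normal(size=300) > 0.8).astype(int)  # fake label

    model = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                        random_state=0))
    model.fit(X[:200], y[:200])
    print("holdout accuracy:", round(model.score(X[200:], y[200:]), 2))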
Abstract:
Urban problems have several features that make them inherently dynamic. Large transaction costs all but guarantee that homeowners will do their best to consider how a neighborhood might change before buying a house. Similarly, stores face large sunk costs when opening, and want to be sure that their investment will pay off in the long run. In line with those concerns, different areas of Economics have made recent advances in modeling those questions within a dynamic framework. This dissertation contributes to those efforts.
Chapter 2 discusses how to model an agent’s location decision when the agent must learn about an exogenous amenity that may be changing over time. The model is applied to estimating the marginal willingness to pay to avoid crime, in which agents are learning about the crime rate in a neighborhood, and the crime rate can change in predictable (Markovian) ways.
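A minimal sketch of the learning problem in chapter 2, under assumed numbers: an agent updates beliefs about a hidden crime regime that switches in a Markovian way, using a two-state Bayes filter with Poisson incident counts. The transition matrix, rates, and observations are illustrative, not the dissertation's estimates.

    # Two-state forward filter: belief over a hidden crime regime that evolves
    # as a Markov chain, updated from observed incident counts.
    import numpy as np
    from scipy.stats import poisson

    P = np.array([[0.95, 0.05],      # transition matrix: regime 0 = low crime
                  [0.10, 0.90]])     # regime 1 = high crime
    rates = np.array([1.0, 4.0])     # expected incidents per period in each regime

    def update_belief(belief, observed_count):
        """One Bayes step: propagate beliefs through P, then weight by likelihood."""
        prior = belief @ P
        posterior = prior * poisson.pmf(observed_count, rates)
        return posterior / posterior.sum()

    belief = np.array([0.5, 0.5])
    for count in [0, 1, 5, 6, 4]:            # hypothetical observed incident counts
        belief = update_belief(belief, count)
    print("P(high-crime regime) =", round(belief[1], 3))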
Chapters 3 and 4 concentrate on location decision problems when there are externalities between decision makers. Chapter 3 focuses on the decision of business owners to open a store, when its demand is a function of other nearby stores, either through competition, or through spillovers on foot traffic. It uses a dynamic model in continuous time to model agents’ decisions. A particular challenge is isolating the contribution of spillovers from the contribution of other unobserved neighborhood attributes that could also lead to agglomeration. A key contribution of this chapter is showing how we can use information on storefront ownership to help separately identify spillovers.
Finally, chapter 4 focuses on a class of models in which families prefer to live close to similar neighbors. This chapter provides the first simulation of such a model in which agents are forward looking, and shows that this leads to more segregation than would be observed with myopic agents, which is the standard assumption in this literature. The chapter also discusses several extensions of the model that can be used to investigate relevant questions such as the arrival of a large contingent of high-skilled tech workers in San Francisco, the immigration of Hispanic families to several southern American cities, large changes in local amenities, such as the construction of magnet schools or metro stations, and the flight of wealthy residents from cities in the Rust Belt, such as Detroit.
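For context, a minimal myopic benchmark of the neighbor-preference model described in chapter 4 (a one-dimensional Schelling-style swap dynamic with assumed parameters); the dissertation's contribution is to replace this myopic move rule with forward-looking agents who anticipate future neighborhood composition.

    # Myopic Schelling-style benchmark on a ring of N homes with two types:
    # accept a swap only if it does not reduce the movers' combined similarity.
    import random

    random.seed(0)
    N = 60
    city = [random.choice([0, 1]) for _ in range(N)]

    def share_similar(city, i):
        left, right = city[(i - 1) % N], city[(i + 1) % N]
        return (int(left == city[i]) + int(right == city[i])) / 2.0

    def avg_similarity(city):
        return sum(share_similar(city, i) for i in range(N)) / N

    print("before:", round(avg_similarity(city), 2))
    for _ in range(5000):
        i, j = random.randrange(N), random.randrange(N)
        if city[i] == city[j]:
            continue
        before = share_similar(city, i) + share_similar(city, j)
        city[i], city[j] = city[j], city[i]          # try the swap
        after = share_similar(city, i) + share_similar(city, j)
        if after < before:                           # undo if movers are worse off
            city[i], city[j] = city[j], city[i]
    print("after: ", round(avg_similarity(city), 2))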
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
While fossil energy dependency has declined and energy supply has grown in the postwar world economy, future resource scarcity could cast its shadow on world economic growth soon if energy markets are forward looking. We develop an endogenous growth model that reconciles the current aggregate trends in energy use and productivity growth with the intertemporal dynamics of forward-looking resource markets. Combining scarcity-rent driven energy supply (in the spirit of Hotelling) with profit-driven Directed Technical Change (in the spirit of Romer/Acemoglu), we generate transitional dynamics that can be qualitatively calibrated to current trends. The long-run properties of the model are studied to examine whether current trends are sustainable. We highlight the role of extraction costs in mining.
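A worked illustration of the Hotelling benchmark the abstract builds on (with assumed numbers, not the paper's calibrated model): under a constant extraction cost, the scarcity rent grows at the interest rate along the supply path.

    # Textbook Hotelling rule with constant marginal extraction cost:
    # the scarcity rent (price minus cost) grows at the interest rate.
    interest_rate, cost, initial_rent = 0.04, 10.0, 5.0   # assumed values

    def hotelling_price(t):
        """Resource price after t years under the simple Hotelling benchmark."""
        return cost + initial_rent * (1 + interest_rate) ** t

    print([round(hotelling_price(t), 2) for t in (0, 10, 25, 50)])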
Abstract:
Master's dissertation—Universidade de Brasília, Faculdade de Direito, Programa de Pós-Graduação em Direito, 2016.