919 results for ex-post evaluations
Abstract:
Quantitative easing à la ECB has so far had an impact on long-term nominal rates through ex ante channels: signalling channels, term duration channels, and risk premia channels. The term duration channel will also lead to a lengthening of the average maturity of government debts, with possible implications for fiscal policy. The ECB’s determination to buy government bonds in a fragmented market with a low net supply may also produce an ex post impact, during the actual asset purchases, but less on nominal rates and more on financial plumbing, as recent volatility suggests. As the effects of scarce supply in collateral markets are felt, repo rates remain well below zero. Lower supply and limited re-usability of high-quality collateral, capped by regulatory requirements, constrain market liquidity and compress dealers’ balance sheets. By keeping the yield curve depressed and asset prices high, QE may also accelerate the consolidation of both traditional and capital-market-based (dealer) bank business models. What is less clear is how these changing business models will interact with the sharp rise of the asset management industry in the aftermath of the crisis, which raises questions about the implications for global collateral flows and deposit-like funding channels.
Abstract:
This study provides an ex-post evaluation of the EU copyright framework as provided by EU Directive 29/2001 on Copyright in the Information Society (InfoSoc Directive) and related legislation, focusing on four key criteria: effectiveness, efficiency, coherence and relevance. The evaluation finds that the EU copyright framework scores poorly on all four counts. Of the four main goals pursued by the InfoSoc Directive, only the alignment with international legislation can be said to have been fully achieved. The wider framework on copyright still generates costs by inhibiting content creation, production and distribution and by generating productive, allocative and dynamic inefficiencies. Several problems also remain in terms of both internal and external coherence. Finally, despite its overall importance and relevance as a domain of legislation in the fields of content and media, the EU copyright framework is outdated in light of technological developments. Policy options to reform the current framework are provided in the CEPS companion study on the functioning and efficiency of the Digital Single Market in the field of copyright (CEPS Special Report No. 121/November 2015).
Abstract:
The Single Resolution Board (SRB) will be responsible for the resolution of banks in the euro area from 1 January 2016. However, the resources of the Single Resolution Fund (SRF) at the disposal of the SRB will only gradually be built up until 2023. This paper provides estimates of the potential financing needs of the SRF, based on the euro area bank resolutions that actually occurred between 2007 and 2014. We find that the SRF would have been asked to put a total amount of about €72 billion into these failing banks, which is more than the target for the SRF (€55 billion) but less than the amount the SRF could draw on, if the ex-post levies are also taken into account. As this sum would have been required over eight years, the broad conclusion is that bridge financing, in addition to the existing alternative funding, would only have been needed in the early years of the transition.
Abstract:
We model social choices as acts mapping states of the world to (social) outcomes. A (social choice) rule assigns an act to every profile of subjective expected utility preferences over acts. A rule is strategy-proof if no agent ever has an incentive to misrepresent her beliefs about the world or her valuation of the outcomes; it is ex-post efficient if the act selected at any given preference profile picks a Pareto-efficient outcome in every state of the world. We show that every two-agent ex-post efficient and strategy-proof rule is a top selection: the chosen act picks the most preferred outcome of some (possibly different) agent in every state of the world. The states in which an agent’s top outcome is selected cannot vary with the reported valuations of the outcomes but may change with the reported beliefs. We give a complete characterization of the ex-post efficient and strategy-proof rules in the two-agent, two-state case, and we identify a rich class of such rules in the two-agent case.
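A minimal formal restatement of the two axioms may help fix ideas. The notation below (finite state space S, outcome set X, acts mapping S to X, SEU preferences R_i, rule φ) is assumed for illustration rather than taken from the paper:

```latex
% Sketch only: S is a finite state space, X the outcome set, acts map S to X,
% R_i is agent i's subjective expected utility preference over acts, and
% \varphi is the social choice rule. Notation is assumed, not the paper's own.
\[
\text{Strategy-proofness:}\qquad
\varphi(R_i, R_{-i}) \;R_i\; \varphi(R_i', R_{-i})
\quad\text{for all agents } i \text{ and all } R_i,\, R_i',\, R_{-i}.
\]
\[
\text{Ex-post efficiency:}\qquad
\varphi(R)(s)\ \text{is Pareto-efficient at } s
\quad\text{for every profile } R \text{ and every state } s \in S.
\]
```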
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Background: The purpose of the present study was to describe a profile of Australian paediatric occupational therapy practice in terms of the theories, assessments and interventions used with the most frequently seen client groups. Methods: An ex post facto survey design was utilised. A purpose-designed survey was mailed to 600 occupational therapists identified by OT Australia as working in paediatrics. Results: The response rate was 55% (n = 330). Respondents in the sample worked chiefly with children with developmental delays, learning disabilities or neurological impairments, and with infants/toddlers. The theoretical models used by paediatric clinicians that were common to the most frequently seen client groups focused on sensory integration/multisensory approaches, occupational performance, and client-centred practice. The assessment tools most frequently used were the Test of Visual Motor Integration, Sensory Profile, Bruininks-Oseretsky Test of Motor Proficiency, Handwriting Speed Test, and Motor-Free Visual Perception Test. The treatment methods used most often across the four most frequently seen client groups were parent/caregiver education, sensory integration/stimulation techniques, and managing activities of daily living. Conclusions: Paediatric occupational therapists appeared to draw on a range of theoretical models. With the exception of the Sensory Profile, however, the assessment and treatment methods most frequently used are not congruent with the most commonly used theoretical models. It is critical that the assessment and treatment methods used are conceptually consistent with the theoretical models that guide practice. Occupational therapists need to examine the evidence and determine whether their clinical practice is grounded in the best contemporary theoretical models, assessments and interventions.
Abstract:
Comparisons were made of the paediatric content of professional entry-level occupational therapy university program curricula in Australia, New Zealand, and Canada using an ex post facto survey methodology. The findings indicated that in Australia/New Zealand, paediatrics made up 20% of the total curriculum, but only 13% in Canada. Canadian reference materials were utilized less often in Canadian universities than in Australia/New Zealand. The theories taught most often in Australia/New Zealand were: Sensory Integration, Neurodevelopmental Therapy, Client-Centered Practice, Playfulness, and the Model of Human Occupation. In Canada, the most frequently taught theories were: Piaget’s Stages of Cognitive/Intellectual Development, Neurodevelopmental Therapy, Erikson’s Eight Stages of Psychosocial Development, and Sensory Integration. The most frequently taught paediatric assessment tools in both regions were the Bruininks-Oseretsky Test of Motor Proficiency and the Miller Assessment for Preschoolers. The paediatric intervention methods taught to students in all three countries focused on activities of daily living/self-care, motor skills, perceptual and visual motor integration, and infant and child development.
Abstract:
Corporate restructuring is perceived as a challenge to research. Prior studies do not provide conclusive evidence regarding the effects of restructuring. Given these inconclusive findings, this research examines the effects of restructuring events among UK listed firms. The sample firms are listed on the LSE and the London AIM stock exchange. Only completed restructuring transactions are included in the study. The time horizon extends from 1999 to 2003, and a three-year floating window is assigned to examine the sample firms. The key enquiry is to scrutinise the ex post effects of restructuring on performance and value measures of firms in contrast to a matched-criteria non-restructured sample. A cross-sectional study employing logit estimation is undertaken to examine the firm characteristics of the restructuring samples. Further, additional parameters, i.e. Conditional Volatility and Asymmetry, are generated under the GJR-GARCH estimate and reiterated in the logit models to capture time-varying heteroscedasticity of the samples. This research incorporates most forms of restructuring, whereas prior studies have examined only certain forms and have made limited attempts to examine different restructuring events simultaneously. In addition to the logit analysis, an event study is adopted to evaluate the announcement effect of restructuring under both the OLS and GJR-GARCH estimates, supplementing our prior results. By engaging a composite empirical framework, our estimation method permits a full appreciation of the restructuring effect. The study provides evidence that restructuring has a non-trivial, significant positive effect. There is some evidence that the response differs with the type of restructuring, particularly when the event study is applied. The results establish that performance measures, i.e. Operating Profit Margin, Return on Equity, Return on Assets, Growth, Size, Profit Margin and Shareholders' Ownership, show consistent and significant increases. However, Leverage and Asset Turnover suggest a reasonable influence of restructuring across the sample period. Similarly, value measures, i.e. Abnormal Returns, Return on Equity and Cash Flow Margin, suggest sizeable improvement. A notable characteristic seen coherently throughout the analysis is the decreasing proportion of Systematic Risk. Consistent with these findings, Conditional Volatility and Asymmetry exhibit a similar trend. The event study analysis suggests that, on average, the market perceives restructuring favourably and shareholders experience significant and systematic positive gains.
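To make the two-stage design concrete, here is a minimal Python sketch: a GJR-GARCH(1,1) fit (the asymmetric "o" term in the arch package) yields Conditional Volatility and Asymmetry, which then enter a logit model of restructuring status. The synthetic returns, column names and single covariate are illustrative assumptions only, not the thesis's actual sample or specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from arch import arch_model  # GJR-GARCH is GARCH with an asymmetric 'o' term

# --- Stage 1: conditional volatility and asymmetry from a GJR-GARCH(1,1) fit ---
# Synthetic daily returns (in %) stand in for one sample firm's share returns.
rng = np.random.default_rng(0)
returns = pd.Series(rng.normal(0.0, 1.0, 1000))

gjr_res = arch_model(returns, vol="GARCH", p=1, o=1, q=1, dist="normal").fit(disp="off")
mean_cond_vol = gjr_res.conditional_volatility.mean()
asymmetry = gjr_res.params.get("gamma[1]", np.nan)  # asymmetry/leverage parameter
print(f"mean conditional volatility: {mean_cond_vol:.3f}, asymmetry: {asymmetry:.3f}")

# --- Stage 2: logit model of restructuring status on firm characteristics ---
# Hypothetical firm-level data; a single covariate is kept for brevity.
firms = pd.DataFrame({
    "restructured":  [1, 0, 1, 0, 1, 0, 1, 0],
    "mean_cond_vol": [1.8, 1.6, 1.2, 1.0, 2.1, 1.9, 0.9, 1.3],
})
X = sm.add_constant(firms[["mean_cond_vol"]])
logit_res = sm.Logit(firms["restructured"], X).fit(disp=0)
print(logit_res.params)
```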
Abstract:
The aim of this research is to improve the planning methodology of Dunlop via an analysis of its annual planning system. This was approached via an investigation of how the plans were developed; extensive interviews, which analysed divisional attitudes and approaches to planning; an analysis of forecast accuracy; and participation in the planning system itself. These investigations revealed certain deficiencies in the operation of the system. In particular, little evidence of formal planning could be found, and some divisions were reacting ex post to the market rather than planning ex ante. The resulting plans tended to lack resilience and were generally unrealistic, partly because of imposed targets. Similarly, because the links between the elements of the system were often inefficient, previously agreed strategies were not always implemented. The analysis of forecast accuracy in the plans revealed divisions to be poor at most aspects of forecasting. Simple naive models often outperformed divisional forecasts, and much of the error was attributed to systematic, and therefore eliminable, factors. These analyses suggested the need for a new system, which is proposed in the form of Budgetary Planning. This system involves conceptual changes within the current planning framework. Such changes aim to revise tactical planning in order to meet the needs placed on it by, in particular, strategic planning. Budgetary Planning is an innovation in terms of the current planning literature. It is a total system of annual planning aimed at implementing and controlling the iteratively agreed strategies within the current environment. This is achieved by the generation of tactical alternatives, variable funding and concentration of forecast credibility, all of which aid both the realism and the resilience of planning.
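The point about naive models can be illustrated with a short sketch comparing a last-value (naive) forecast against a divisional plan on hypothetical annual figures; the numbers and the MAPE criterion are assumptions for illustration only.

```python
import numpy as np

# Hypothetical annual sales actuals and the division's planned figures;
# the naive model simply carries last year's actual forward.
actual   = np.array([100.0, 108.0, 112.0, 109.0, 118.0, 121.0])
division = np.array([110.0, 120.0, 125.0, 122.0, 130.0, 128.0])
naive    = np.roll(actual, 1)  # last year's actual as this year's forecast

def mape(forecast, outturn):
    """Mean absolute percentage error."""
    return np.mean(np.abs((forecast - outturn) / outturn)) * 100

# Drop the first year, for which the naive model has no prior observation.
print(f"division MAPE: {mape(division[1:], actual[1:]):.1f}%")
print(f"naive MAPE:    {mape(naive[1:], actual[1:]):.1f}%")
# A one-sided gap (the plans consistently overshoot here) indicates a
# systematic, and hence eliminable, bias rather than random error.
```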
Abstract:
Governance theories, such as transaction cost economics, argue that systematic deviations from an attribute–governance alignment should influence performance. This article investigates the performance implications of contract specificity for the procurement of information technology products. The authors argue that parties choose a level of contract specificity that economizes on both the ex ante contracting costs and the ex post transaction costs and that deviations between the observed and the predicted levels of contract specificity are an important determinant of these transaction costs. The authors test the hypotheses using a comprehensive archival data set of information technology transactions and employ a two-step estimation procedure. First, they estimate the “predicted” level of contract specificity, which accounts for key transactional attributes. Second, they study the consequences of deviating from this predicted level of contractual specificity. The results provide the first explicit demonstration of the trade-off between ex ante contracting costs and ex post transaction problems and suggest that parties need to economize jointly on these costs when choosing the governance form.
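A minimal sketch of such a two-step procedure, with simulated data and assumed variable names (not the authors' actual estimator or dataset), might look as follows: step one predicts contract specificity from transactional attributes, and step two regresses ex-post problems on the absolute deviation from that prediction.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated transaction-level data; all names and coefficients are illustrative.
rng = np.random.default_rng(42)
n = 200
df = pd.DataFrame({
    "asset_specificity": rng.normal(0, 1, n),
    "complexity":        rng.normal(0, 1, n),
    "prior_ties":        rng.normal(0, 1, n),
})
aligned = 1.0 * df["asset_specificity"] + 0.5 * df["complexity"]   # "aligned" level
df["contract_specificity"] = aligned + rng.normal(0, 1, n)         # observed level
df["ex_post_problems"] = 0.8 * np.abs(df["contract_specificity"] - aligned) \
                         + rng.normal(0, 0.5, n)

# Step 1: predict contract specificity from key transactional attributes.
X1 = sm.add_constant(df[["asset_specificity", "complexity", "prior_ties"]])
step1 = sm.OLS(df["contract_specificity"], X1).fit()
df["deviation"] = df["contract_specificity"] - step1.fittedvalues

# Step 2: do deviations from the predicted level raise ex-post transaction problems?
X2 = sm.add_constant(df["deviation"].abs().rename("abs_deviation"))
step2 = sm.OLS(df["ex_post_problems"], X2).fit()
print(step2.params)
```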
Abstract:
This paper proposes a conceptual model for a firm's capability to calibrate supply chain knowledge (CCK). Knowledge calibration is achieved when there is a match between managers' ex ante confidence in the accuracy of held knowledge and the ex post accuracy of that knowledge. Knowledge calibration is closely related to knowledge utility or willingness to use the available ex ante knowledge: a manager uses the ex ante knowledge if he/she is confident in the accuracy of that knowledge, and does not use it or uses it with reservation, when the confidence is low. Thus, knowledge calibration attained through the firm's CCK enables managers to deal with incomplete and uncertain information and enhances quality of decisions. In the supply chain context, although demand- and supply-related knowledge is available, supply chain inefficiencies, such as the bullwhip effect, remain. These issues may be caused not by a lack of knowledge but by a firm's lack of capability to sense potential disagreement between knowledge accuracy and confidence. Therefore, this paper contributes to the understanding of supply chain knowledge utilization by defining CCK and identifying a set of antecedents and consequences of CCK in the supply chain context.
Abstract:
Significant growth in mobile media consumption has prompted a call to better understand the socio-cultural and policy dimensions of consumer choices. Contrary to industry- and technology-led analysis, this study argues that guiding consumer choice and innovation via regulatory policies requires an understanding of both ex-ante and ex-post consumption conditions. This study examines mobile phone gaming to uncover how consumer anti-choice shapes decision-making, as a framework for closely interrogating the ways in which policy concerns impact on consumers' behavior. Through eleven focus groups (n=62), the study empirically identifies voluntary, intentional, and positive consumer anti-choice behaviors, all of which affect policy initiatives when consumers, both gamers and non-gamers, self-regulate their behaviors. The findings point to four types of policy implication: regulating the self-regulated, understanding anti-choice, boundary-setting, and including the self-excluded.
Abstract:
Using the case of a low-cost airline company’s website, we analyze some special research questions of information technology valuation. The distinctive characteristics of this research are the ex post valuation perspective; the parallel and comparative use of accounting and business valuation approaches; and the integrated application of discounted cash flow and real option valuation. As the international company examined is a strategic user of e-technology and wants to manage and account for intangible IT assets explicitly, these specific valuation perspectives are gaining practical significance.
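The sketch below illustrates, with purely hypothetical figures, how a discounted cash flow value and a Black-Scholes-style real-option value can be combined into an "expanded" value of the kind discussed above; it is not the valuation performed in the study.

```python
import math

def npv(cash_flows, rate):
    """Discounted cash flow value of yearly cash flows arriving at t = 1..n."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, r, sigma, t):
    """Black-Scholes call value, used as a simple real-option (expansion) proxy."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

# Hypothetical figures for a website project (millions of EUR).
dcf_value = npv([1.2, 1.5, 1.8, 2.0, 2.1], rate=0.10)            # base-case cash flows
option_value = bs_call(s=4.0, k=5.0, r=0.03, sigma=0.45, t=3.0)  # expansion option

print(f"DCF value:         {dcf_value:.2f}m")
print(f"Real-option value: {option_value:.2f}m")
print(f"Expanded value:    {dcf_value + option_value:.2f}m")
```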
Abstract:
In the context of discrete districting problems with geographical constraints, we demonstrate that determining an (ex post) unbiased districting, which requires that the number of representatives of a party be proportional to its share of votes, is a computationally intractable (NP-complete) problem. This raises doubts as to whether an independent jury will be able to come up with a “fair” redistricting plan in the case of a large population; that is, there is no guarantee of finding an unbiased districting (even if one exists). We also show that, in the absence of geographical constraints, an unbiased districting can be implemented by a simple alternating-move game between the two parties.
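To illustrate the combinatorial nature of the problem (ignoring the geographical constraints that drive the NP-completeness result), the brute-force sketch below enumerates balanced assignments of voting units to districts on a toy instance and looks for a plan in which party A's seats match its proportional target; all figures are hypothetical.

```python
from itertools import product

# Toy instance: each unit reports (votes for A, votes for B). We search for a
# partition into equal-size districts in which the number of districts won by A
# matches A's proportional share of the overall vote. Geography is ignored, so
# this only illustrates the size of the search space, not the NP-hard variant.
units = [(60, 40), (55, 45), (48, 52), (45, 55), (70, 30), (30, 70)]
n_districts = 3
district_size = len(units) // n_districts

total_a = sum(a for a, _ in units)
total_votes = sum(a + b for a, b in units)
target_a_seats = round(n_districts * total_a / total_votes)  # proportional target

def unbiased_plans():
    for labels in product(range(n_districts), repeat=len(units)):
        # keep only balanced plans (equal number of units per district)
        if any(labels.count(d) != district_size for d in range(n_districts)):
            continue
        seats_a = 0
        for d in range(n_districts):
            a_votes = sum(units[i][0] for i in range(len(units)) if labels[i] == d)
            b_votes = sum(units[i][1] for i in range(len(units)) if labels[i] == d)
            seats_a += a_votes > b_votes
        if seats_a == target_a_seats:
            yield labels

print(f"proportional target for A: {target_a_seats} of {n_districts} districts")
print("one unbiased plan (district label per unit):", next(unbiased_plans(), None))
```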