811 results for crowdfunding, equity-based crowdfunding, financial forecasting


Relevance:

30.00%

Publisher:

Abstract:

Conventional economic theory, applied to information released by listed companies, equates ‘useful’ with ‘price-sensitive’. Stock exchange rules accordingly prohibit the selective, private communication of price-sensitive information. Yet, even in the absence of such communication, UK equity fund managers routinely meet privately with the senior executives of the companies in which they invest. Moreover, they consider these brief, formal and formulaic meetings to be their most important sources of investment information. In this paper we ask how that can be. Drawing on interview and observation data with fund managers and CFOs, we find evidence for three non-mutually exclusive explanations: that the characterisation of information in conventional economic theory is too restricted; that fund managers fail to act with the rationality that conventional economic theory assumes; and/or that the primary value of the meetings for fund managers is not related to their investment decision making but to the claims of superior knowledge made to clients in marketing their active fund management expertise. Our findings suggest a disconnect between economic theory and economic policy based on that theory, as well as a corresponding limitation in research studies that test information-usefulness by assuming it to be synonymous with price-sensitivity. We draw implications for further research into the role of tacit knowledge in equity investment decision-making, and also into the effects of the principal–agent relationship between fund managers and their clients.

Relevance:

30.00%

Publisher:

Abstract:

Corpus-assisted analyses of public discourse often focus on the lexical level. This article argues in favour of corpus-assisted analyses of discourse, but also in favour of conceptualising salient lexical items in public discourse in a more determined way. It draws partly on non-Anglophone academic traditions in order to promote a conceptualisation of discourse keywords, thereby highlighting how their meaning is determined by their use in discourse contexts. It also argues in favour of emphasising the cognitive and epistemic dimensions of discourse-determined semantic structures. These points are exemplified by means of a corpus-assisted as well as frame-based analysis of the discourse keyword financial crisis in British newspaper articles from 2009. Collocations of financial crisis are assigned to a generic matrix frame for ‘event’, which contains slots that specify possible statements about events. By looking at which slots are more, and which are less, filled with collocates of financial crisis, we trace semantic presence as well as absence, and thereby highlight the pragmatic dimensions of lexical semantics in public discourse. The article also advocates the suitability of discourse keyword analyses for systematic contrastive analyses of public/political discourse and for lexicographical projects that could serve to extend the insights drawn from corpus-guided approaches to discourse analysis.
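
The raw material for such a keyword analysis is essentially a table of collocates. As a minimal sketch, with a toy corpus, an assumed window of four tokens and the single token "crisis" standing in for the full keyword financial crisis, collocate counts of the kind later assigned to frame slots could be gathered like this:

    from collections import Counter

    def collocates(tokens, keyword, window=4):
        # Count tokens occurring within +/- `window` positions of the keyword.
        counts = Counter()
        for i, tok in enumerate(tokens):
            if tok.lower() == keyword:
                lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
                counts.update(t.lower() for t in tokens[lo:hi] if t.lower() != keyword)
        return counts

    corpus = ("The financial crisis deepened as the crisis hit banks and "
              "the government responded to the crisis with bailouts").split()
    print(collocates(corpus, "crisis").most_common(5))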

Relevance:

30.00%

Publisher:

Abstract:

Satellite-based Synthetic Aperture Radar (SAR) has proved useful for obtaining information on flood extent, which, when intersected with a Digital Elevation Model (DEM) of the floodplain, provides water level observations that can be assimilated into a hydrodynamic model to decrease forecast uncertainty. With an increasing number of operational satellites with SAR capability, information on the relationship between satellite first visit and revisit times and forecast performance is required to optimise the operational scheduling of satellite imagery. Using an Ensemble Transform Kalman Filter (ETKF) and a synthetic analysis with the 2D hydrodynamic model LISFLOOD-FP, based on a real flood affecting an urban area (summer 2007, Tewkesbury, Southwest UK), we evaluate the sensitivity of forecast performance to these visit parameters. We emulate a generic hydrologic-hydrodynamic modelling cascade by imposing a bias and spatiotemporal correlations on the inflow error ensemble entering the hydrodynamic domain. First, in agreement with previous research, estimating and correcting for this bias leads to a clear improvement in keeping the forecast on track. Second, imagery obtained early in the flood is shown to have a large influence on forecast statistics. The revisit interval is most influential for early observations. The results are promising for the future of remote sensing-based water level observations for real-time flood forecasting in complex scenarios.
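
As a rough illustration of how water-level observations can pull a model ensemble towards the data, the sketch below implements a basic stochastic ensemble Kalman analysis step for a single scalar observation. It is not the ETKF variant used in the study, and the states, observation operator H and error variances are invented for illustration only:

    import numpy as np

    def enkf_update(ensemble, obs, obs_var, H):
        # ensemble: (n_members, n_state); H: (n_state,) linear observation operator.
        X = ensemble - ensemble.mean(axis=0)
        y_ens = ensemble @ H                            # predicted water-level observations
        y_anom = y_ens - y_ens.mean()
        n = len(ensemble) - 1
        P_xy = X.T @ y_anom / n                         # state-observation covariance
        P_yy = y_anom @ y_anom / n + obs_var            # innovation variance
        K = P_xy / P_yy                                 # Kalman gain
        perturbed_obs = obs + np.random.normal(0.0, np.sqrt(obs_var), len(ensemble))
        return ensemble + np.outer(perturbed_obs - y_ens, K)

    np.random.seed(0)
    prior = np.random.normal(10.0, 0.5, size=(50, 3))   # toy water levels at 3 locations
    posterior = enkf_update(prior, obs=10.4, obs_var=0.04, H=np.array([1.0, 0.0, 0.0]))
    print(posterior.mean(axis=0))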

Relevance:

30.00%

Publisher:

Abstract:

This paper models the determinants of integration in the context of global real estate security markets. Using both local and U.S. Dollar-denominated returns, we model conditional correlations across listed real estate sectors and also with the global stock market. The empirical results show that financial factors, such as the relationship with the respective equity market, volatility, the relative size of the real estate sector and trading turnover, all play an important role in the degree of integration present. Furthermore, the results highlight the importance of macro-economic variables: all four of the macro-economic variables modeled provide at least one significant result across the specifications estimated. Factors such as financial and trade openness, monetary independence and the stability of a country’s currency all contribute to the degree of integration reported.
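
A crude stand-in for the conditional correlations modelled here is a rolling-window correlation between a listed real estate return series and the global equity market. The sketch below uses invented returns and an assumed 60-day window; it only illustrates the quantity being tracked, not the paper's econometric specification:

    import numpy as np

    def rolling_correlation(x, y, window=60):
        # Correlation between x and y over a moving window of fixed length.
        corrs = []
        for t in range(window, len(x) + 1):
            corrs.append(np.corrcoef(x[t - window:t], y[t - window:t])[0, 1])
        return np.array(corrs)

    np.random.seed(1)
    market = np.random.normal(0.0, 0.01, 500)                        # toy global equity returns
    real_estate = 0.6 * market + np.random.normal(0.0, 0.008, 500)   # toy listed real estate returns
    print(rolling_correlation(real_estate, market)[-5:])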

Relevance:

30.00%

Publisher:

Abstract:

A clinical pathway is an approach to standardising care processes in support of the implementation of clinical guidelines and protocols. It is designed to support the management of treatment processes, including clinical and non-clinical activities, resources and financial aspects. It provides detailed guidance for each stage in the management of a patient, with the aim of improving the continuity and coordination of care across different disciplines and sectors. However, in practice, the lack of knowledge sharing and the limited information accuracy of paper-based clinical pathways burden health-care staff with a large amount of paperwork. This often results in medical errors, inefficient treatment processes and thus poor-quality medical services. This paper first presents a theoretical underpinning and a co-design research methodology for integrated pathway management, drawing on organisational semiotics. An approach to integrated clinical pathway management is then proposed, which aims to embed pathway knowledge into treatment processes and existing hospital information systems. The capability of this approach has been demonstrated through a case study in one of the largest hospitals in China. The outcome reveals that medical quality can be improved significantly through classified clinical pathway knowledge and seamless integration with hospital information systems.

Relevance:

30.00%

Publisher:

Abstract:

We present an efficient graph-based algorithm for quantifying the similarity of household-level energy use profiles, using a notion of similarity that allows for small time shifts when comparing profiles. Experimental results on a real smart meter data set demonstrate that in cases of practical interest our technique is far faster than the existing method for computing the same similarity measure. Having a fast algorithm for measuring profile similarity improves the efficiency of tasks such as clustering of customers and cross-validation of forecasting methods using historical data. Furthermore, we apply a generalisation of our algorithm to produce substantially better household-level energy use forecasts from historical smart meter data.
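
To give a sense of the underlying notion of similarity, the sketch below is a brute-force, per-reading version of a shift-tolerant distance: each reading in one profile is matched to the closest reading within +/- w time slots in the other. It only illustrates the idea of tolerating small time shifts, not the paper's graph-based algorithm or its exact measure, and the profiles and window are invented:

    import numpy as np

    def shift_tolerant_distance(a, b, w=2):
        # Mean absolute error where each reading of a may match any reading of b
        # within +/- w time slots.
        n = len(a)
        total = 0.0
        for i, x in enumerate(a):
            lo, hi = max(0, i - w), min(n, i + w + 1)
            total += np.min(np.abs(b[lo:hi] - x))
        return total / n

    profile_a = np.array([0.2, 0.2, 1.5, 0.3, 0.2, 0.2])   # toy half-hourly kWh profile
    profile_b = np.array([0.2, 1.4, 0.2, 0.3, 0.2, 0.2])   # similar peak, one slot earlier
    print(shift_tolerant_distance(profile_a, profile_b, w=1))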

Relevance:

30.00%

Publisher:

Abstract:

We consider forecasting with factors, variables and both, modeling in-sample using Autometrics so all principal components and variables can be included jointly, while tackling multiple breaks by impulse-indicator saturation. A forecast-error taxonomy for factor models highlights the impacts of location shifts on forecast-error biases. Forecasting US GDP over 1-, 4- and 8-step horizons using the dataset from Stock and Watson (2009) updated to 2011:2 shows factor models are more useful for nowcasting or short-term forecasting, but their relative performance declines as the forecast horizon increases. Forecasts for GDP levels highlight the need for robust strategies, such as intercept corrections or differencing, when location shifts occur as in the recent financial crisis.
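
As a bare-bones illustration of the factor-based leg of such an exercise (not of Autometrics model selection or impulse-indicator saturation), the sketch below extracts principal-component factors from an invented predictor panel and uses them in a direct h-step forecasting regression:

    import numpy as np

    np.random.seed(2)
    T, N, h = 200, 40, 4
    X = np.random.normal(size=(T, N))                         # toy panel of predictors
    y = X[:, :3].mean(axis=1) + np.random.normal(0, 0.5, T)   # toy target series (e.g. GDP growth)

    Xs = (X - X.mean(axis=0)) / X.std(axis=0)                 # standardise before extracting factors
    U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
    factors = Xs @ Vt[:2].T                                   # first two principal-component factors

    Z = np.column_stack([np.ones(T - h), factors[:-h]])       # regressors dated t, target dated t+h
    beta, *_ = np.linalg.lstsq(Z, y[h:], rcond=None)
    forecast = np.concatenate(([1.0], factors[-1])) @ beta    # direct h-step-ahead forecast
    print(forecast)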

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses EU policies relating to securities markets created in the wake of the financial crisis, and how ICT, and specifically e-Government, can be utilised within this context, using the UK as the basis for the discussion. The recent financial crisis has caused a change of perspective in relation to government services and policies. The regulation of the financial sector has been heavily criticised and is therefore undergoing radical change in the UK and the rest of Europe. New regulatory bodies are being defined, with greater focus on a risk-based, system-wide approach to regulating the financial sector. This approach aims to prevent financial institutions from becoming too big to fail and thus requiring massive government bail-outs. In addition, a new wave of EU regulation is on the way to update risk management practices and to further protect investors. The paper discusses the reasons for the financial crisis and the UK’s past and future regulatory landscape. The current and future approaches and strategies adopted by the UK’s financial regulators are reviewed, as is the lifecycle of EU Directives. The regulatory responses to the crisis are discussed and upcoming regulatory hotspots identified. Discussion of these issues provides the context for our evaluation of the role of e-Government and ICT in improving the regulatory system. We identify several processes that are fundamental to regulatory compliance and discuss how ICT is central to their implementation. The processes considered include those required for internal control and monitoring, risk management, record keeping and disclosure to regulatory bodies. We find that these processes offer an excellent opportunity to adopt an e-Government approach, improving services to both regulated businesses and individual investors through a more effective and efficient regulatory system.

Relevance:

30.00%

Publisher:

Abstract:

The financial crisis of 2007-2009 and the subsequent reaction of the G20 have created a new global regulatory landscape. Within the EU, change of regulatory institutions is ongoing. The research objective of this study is to understand how institutional changes to the EU regulatory landscape may affect corresponding institutionalized operational practices within financial organizations and to understand the role of agency within this process. Our motivation is to provide insight into these changes from an operational management perspective, as well as to test Thelen and Mahoney's (2010) modes of institutional change. Consequently, the study researched implementations of an Investment Management System with a rules-based compliance module within financial organizations. The research consulted compliance and risk managers, as well as systems experts. The study suggests that prescriptive regulations are likely to create isomorphic configurations of rules-based compliance systems, which consequently will enable the institutionalization of associated compliance practices. The study reveals the ability of some agents within financial organizations to control the impact of regulatory institutions, not directly, but through the systems and processes they adopt to meet requirements. Furthermore, the research highlights the boundaries and relationships between each mode of change as future avenues of research.

Relevance:

30.00%

Publisher:

Abstract:

Construction procurement is complex and there is a very wide range of options available to procurers. Inappropriate choices about how to procure may limit practical opportunities for innovation. In particular, traditional approaches to construction procurement set up many obstacles for technology suppliers to provide innovative solutions. This is because they are often employed as sub-contractors simply to provide and install equipment to specifications developed before the point at which they become involved in a project. A research team at the University of Reading has developed a procurement framework that comprehensively defines the various options open to procurers in a more fine-grained way than has been known in the past. This enables informed decisions that can establish tailor-made procurement approaches that take into account the needs of specific clients. It enables risk and reward structures to be aligned so that contracts and payment mechanisms are aligned precisely with what a client seeks to achieve. This is not a “one-size-fits-all” approach. Rather, it is an approach that enables informed decisions about how to organize individual procurements that are appropriate to particular circumstances, acknowledging that they differ for each client and for each procurement exercise. Within this context, performance-based contracting (PBC) is explored in terms of the different ways in which technology suppliers within constructed facilities might be encouraged and rewarded for the kinds of innovation sought by the ultimate clients. Examples from various industry sectors are presented, from the public and private sectors, with a commentary about what they sought to achieve and the extent to which they were successful. The lessons from these examples are presented in terms of feasibility in relation to financial, governance, economic, strategic, contractual and cash-flow issues for clients and for contractors. Further background documents and more detailed readings are provided in an appendix for those who wish to find out more.

Relevance:

30.00%

Publisher:

Abstract:

The incorporation of numerical weather predictions (NWP) into a flood forecasting system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient, as it involves considerable non-predictable uncertainties and leads to a high number of false alarms. The availability of global ensemble numerical weather prediction systems through the THORPEX Interactive Grand Global Ensemble (TIGGE) offers a new opportunity for flood forecasting. The Grid-Xinanjiang distributed hydrological model, which is based on the Xinanjiang model theory and the topographical information of each grid cell extracted from the Digital Elevation Model (DEM), is coupled with ensemble weather predictions based on the TIGGE database (CMC, CMA, ECMWF, UKMO, NCEP) for flood forecasting. This paper presents a case study using the coupled flood forecasting model on the Xixian catchment (a drainage area of 8826 km2) located in Henan province, China. A probabilistic discharge forecast is provided as the end product. Results show that the combination of the Grid-Xinanjiang model and the TIGGE database provides a promising tool for early warning of flood events several days ahead.
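
The probabilistic end product of such a system can be summarised as exceedance probabilities over the ensemble. The sketch below, with invented discharge values and an assumed warning threshold, shows the basic calculation: the fraction of ensemble members (e.g. one per TIGGE meteorological member routed through the hydrological model) that exceed a flood threshold:

    import numpy as np

    def exceedance_probability(ensemble_discharge, threshold):
        # Fraction of ensemble members whose forecast discharge exceeds the threshold.
        ensemble_discharge = np.asarray(ensemble_discharge, dtype=float)
        return float((ensemble_discharge > threshold).mean())

    members = [950.0, 1200.0, 1310.0, 880.0, 1450.0, 1020.0, 1600.0, 990.0]   # toy discharges, m3/s
    print(exceedance_probability(members, threshold=1250.0))                  # probability of exceeding the flood threshold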

Relevance:

30.00%

Publisher:

Abstract:

Operational medium range flood forecasting systems are increasingly moving towards the adoption of ensembles of numerical weather predictions (NWP), known as ensemble prediction systems (EPS), to drive their predictions. We review the scientific drivers of this shift towards such ‘ensemble flood forecasting’ and discuss several of the questions surrounding best practice in using EPS in flood forecasting systems. We also review the literature evidence of the ‘added value’ of flood forecasts based on EPS and point to remaining key challenges in using EPS successfully.

Relevance:

30.00%

Publisher:

Abstract:

A number of methods of evaluating the validity of interval forecasts of financial data are analysed, and illustrated using intraday FTSE100 index futures returns. Some existing interval forecast evaluation techniques, such as the Markov chain approach of Christoffersen (1998), are shown to be inappropriate in the presence of periodic heteroscedasticity. Instead, we consider a regression-based test, and a modified version of Christoffersen's Markov chain test for independence, and analyse their properties when the financial time series exhibit periodic volatility. These approaches lead to different conclusions when interval forecasts of FTSE100 index futures returns generated by various GARCH(1,1) and periodic GARCH(1,1) models are evaluated.
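
One of the simplest building blocks in this literature is Christoffersen's (1998) unconditional coverage likelihood-ratio test, which checks whether the proportion of returns falling inside the forecast intervals matches the nominal coverage; the modified independence test discussed in the article goes beyond this. A minimal sketch, with an invented hit series, follows:

    import numpy as np
    from scipy.stats import chi2

    def lr_unconditional_coverage(hits, p=0.95):
        # hits: boolean series, True when the observed return fell inside the interval.
        hits = np.asarray(hits, dtype=bool)
        n1, n0 = hits.sum(), (~hits).sum()
        pi_hat = n1 / (n0 + n1)
        log_lik_null = n0 * np.log(1 - p) + n1 * np.log(p)
        log_lik_alt = n0 * np.log(1 - pi_hat) + n1 * np.log(pi_hat)
        lr = -2 * (log_lik_null - log_lik_alt)
        return lr, 1 - chi2.cdf(lr, df=1)       # test statistic and p-value

    np.random.seed(3)
    toy_hits = np.random.rand(500) < 0.92       # toy series with roughly 92% empirical coverage
    print(lr_unconditional_coverage(toy_hits, p=0.95))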

Relevance:

30.00%

Publisher:

Abstract:

Vintage-based vector autoregressive models of a single macroeconomic variable are shown to be a useful vehicle for obtaining forecasts of different maturities of future and past observations, including estimates of post-revision values. The forecasting performance of models which include information on annual revisions is superior to that of models which only include the first two data releases. However, the empirical results indicate that a model which reflects the seasonal nature of data releases more closely does not offer much improvement over an unrestricted vintage-based model which includes three rounds of annual revisions.

Relevance:

30.00%

Publisher:

Abstract:

This article examines the ability of several models to generate optimal hedge ratios. Statistical models employed include univariate and multivariate generalized autoregressive conditionally heteroscedastic (GARCH) models, and exponentially weighted and simple moving averages. The variances of the hedged portfolios derived using these hedge ratios are compared with those based on market expectations implied by the prices of traded options. One-month and three-month hedging horizons are considered for four currency pairs. Overall, it has been found that an exponentially weighted moving-average model leads to lower portfolio variances than any of the GARCH-based, implied or time-invariant approaches.
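
For reference, the quantity all of these models target is the minimum-variance hedge ratio h = Cov(spot, futures) / Var(futures). The sketch below estimates it with an exponentially weighted moving average of the second moments, the class of model the article finds performs best; the toy returns and the smoothing parameter lam = 0.94 are invented for illustration and this is not the article's exact specification:

    import numpy as np

    def ewma_hedge_ratio(spot, futures, lam=0.94):
        # Exponentially weighted covariance matrix of (spot, futures) returns,
        # then h = Cov(spot, futures) / Var(futures).
        cov = np.cov(spot, futures)                 # initialise with sample moments
        for s, f in zip(spot, futures):
            r = np.array([s, f])
            cov = lam * cov + (1 - lam) * np.outer(r, r)
        return cov[0, 1] / cov[1, 1]

    np.random.seed(4)
    fut = np.random.normal(0.0, 0.01, 250)                    # toy currency futures returns
    spot = 0.9 * fut + np.random.normal(0.0, 0.003, 250)      # toy correlated spot returns
    print(ewma_hedge_ratio(spot, fut))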