880 results for technical trading rules
Abstract:
The financial crisis of 2007-2009 and the subsequent reaction of the G20 have created a new global regulatory landscape. Within the EU, change of regulatory institutions is ongoing. The research objective of this study is to understand how institutional changes to the EU regulatory landscape may affect corresponding institutionalized operational practices within financial organizations and to understand the role of agency within this process. Our motivation is to provide insight into these changes from an operational management perspective, as well as to test Thelen and Mahoney's (2010) modes of institutional change. Consequently, the study researched implementations of an Investment Management System with a rules-based compliance module within financial organizations. The research consulted compliance and risk managers, as well as systems experts. The study suggests that prescriptive regulations are likely to create isomorphic configurations of rules-based compliance systems, which consequently will enable the institutionalization of associated compliance practices. The study reveals the ability of some agents within financial organizations to control the impact of regulatory institutions, not directly, but through the systems and processes they adopt to meet requirements. Furthermore, the research highlights the boundaries and relationships between each mode of change as future avenues of research.
Abstract:
Prism is a modular classification rule generation method based on the 'separate and conquer' approach, an alternative to the rule induction approach using decision trees, also known as 'divide and conquer'. Prism often achieves a level of classification accuracy similar to that of decision trees, but tends to produce a more compact, noise-tolerant set of classification rules. As with other classification rule generation methods, a principal problem arising with Prism is overfitting due to over-specialised rules. In addition, over-specialised rules increase the associated computational complexity. These problems can be solved by pruning methods. For the Prism method, two pruning algorithms have recently been introduced for reducing overfitting of classification rules: J-pruning and Jmax-pruning. Both algorithms are based on the J-measure, an information-theoretic means of quantifying the theoretical information content of a rule. Jmax-pruning attempts to exploit the J-measure to its full potential, because J-pruning does not actually achieve this and may even lead to underfitting. A series of experiments has shown that Jmax-pruning may outperform J-pruning in reducing overfitting. However, Jmax-pruning is computationally relatively expensive and may also lead to underfitting. This paper reviews the Prism method and the two existing pruning algorithms above. It also proposes a novel pruning algorithm called Jmid-pruning. The latter is based on the J-measure; it reduces overfitting to a similar level as the other two algorithms but is better at avoiding underfitting and unnecessary computational effort. The authors conduct an experimental study of the performance of the Jmid-pruning algorithm in terms of classification accuracy and computational efficiency. The algorithm is also evaluated comparatively with the J-pruning and Jmax-pruning algorithms.
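The J-measure on which all three pruning algorithms rely can be stated compactly. For a rule "if X = x then Y = y", it weights the rule's coverage p(x) by the cross-entropy between the posterior and prior class distributions. A minimal sketch in Python follows; the function name and the simple probability interface are our own, and Prism itself estimates these probabilities from training data:

```python
import math

def j_measure(p_x, p_y_given_x, p_y):
    """J-measure (Smyth & Goodman) of a rule 'if X=x then Y=y'.

    p_x          -- probability that the rule's antecedent fires
    p_y_given_x  -- accuracy of the rule, P(Y=y | X=x)
    p_y          -- prior probability of the consequent class
    """
    def term(p, q):
        # p * log2(p / q), with the convention 0 * log(0/q) = 0
        return 0.0 if p == 0 else p * math.log2(p / q)

    # cross-entropy between posterior and prior class distributions
    j_inner = term(p_y_given_x, p_y) + term(1 - p_y_given_x, 1 - p_y)
    return p_x * j_inner
```

An uninformative rule (posterior equal to prior) scores zero, and a more accurate rule with the same coverage scores higher, which is the property the pruning algorithms exploit.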
Abstract:
Despite an extensive market segmentation literature, applied academic studies which bridge segmentation theory and practice remain a priority for researchers. The need for studies which examine the segmentation implementation barriers faced by organisations is particularly acute. We explore segmentation implementation through the eyes of a European utilities business, by following its progress through a major segmentation project. The study reveals the character and impact of implementation barriers occurring at different stages in the segmentation process. By classifying the barriers, we develop implementation "rules" for practitioners which are designed to minimise their occurrence and impact. We further contribute to the literature by developing a deeper understanding of the mechanisms through which these implementation rules can be applied.
Abstract:
The purpose of this paper is to explore how companies that hold carbon trading accounts under the European Union Emissions Trading Scheme (EU ETS) respond to climate change by using disclosures on carbon emissions as a means to generate legitimacy, compared to others. The study is based on disclosures made in the annual reports and stand-alone sustainability reports of UK listed companies from 2001 to 2012. The study uses content analysis to capture both the quality and the volume of the carbon disclosures. The results show a significant increase in both the quality and volume of carbon disclosures after the launch of the EU ETS. Companies with carbon trading accounts provide more detailed disclosures than those without an account. We also find that company size is positively correlated with the disclosures, while the association with industry is inconclusive.
Abstract:
In 2007, futures contracts based upon the listed real estate market in Europe were introduced. Following their launch they have received increasing attention from property investors; however, few studies have considered the impact of their introduction. This study considers two key elements. Firstly, a traditional Generalized Autoregressive Conditional Heteroskedasticity (GARCH) model, the approach of Bessembinder & Seguin (1992) and Gray's (1996) Markov-switching GARCH model are used to examine the impact of futures trading on the European real estate securities market. The results show that futures trading did not destabilize the underlying listed market. Importantly, the results also reveal that the introduction of a futures market has improved the speed and quality of information flowing to the spot market. Secondly, we assess the hedging effectiveness of the contracts using two alternative strategies (naïve and Ordinary Least Squares models). The empirical results show that the contracts are effective hedging instruments, leading to a reduction in risk of 64%.
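The hedging-effectiveness comparison described above reduces to two small computations: a minimum-variance (OLS) hedge ratio, and the proportional reduction in variance it delivers relative to the unhedged position. A sketch, assuming simple return series; the function names and any sample data are illustrative, not taken from the paper:

```python
def ols_hedge_ratio(spot, fut):
    """Minimum-variance (OLS) hedge ratio: Cov(spot, fut) / Var(fut)."""
    n = len(spot)
    ms = sum(spot) / n
    mf = sum(fut) / n
    cov = sum((s - ms) * (f - mf) for s, f in zip(spot, fut)) / (n - 1)
    var = sum((f - mf) ** 2 for f in fut) / (n - 1)
    return cov / var

def variance_reduction(spot, fut, h):
    """Hedging effectiveness: 1 - Var(spot - h*fut) / Var(spot)."""
    def var(x):
        m = sum(x) / len(x)
        return sum((v - m) ** 2 for v in x) / (len(x) - 1)
    hedged = [s - h * f for s, f in zip(spot, fut)]
    return 1 - var(hedged) / var(spot)
```

The naïve strategy corresponds to h = 1; in-sample, the OLS ratio minimises the hedged variance by construction, so its variance reduction is never smaller than the naïve one.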
Abstract:
Tremendous progress in plant proteomics, driven by mass spectrometry (MS) techniques, has been made since 2000, when few proteomics reports were published and plant proteomics was in its infancy. These achievements include the refinement of existing techniques and the search for new techniques to address food security, safety, and health issues. It is projected that by 2050 the world's population will reach 9–12 billion people, demanding a 34–70% increase over today's food production (FAO, 2009). Providing food for such demand in a sustainable, environmentally committed manner, without threatening natural resources, requires that agricultural production increase significantly and that postharvest handling and food manufacturing systems become more efficient, with lower energy expenditure, a decrease in postharvest losses, less waste generation, and food with a longer shelf life. There is also a need to look for protein sources alternative to animal-based ones (i.e., plant-based) to fulfil the increase in protein demand by 2050. Thus, plant biology has a critical role to play as a science capable of addressing such challenges. In this review, we discuss proteomics, especially MS, as a platform that has been utilised in plant biology research for the past 10 years and has the potential to expedite the process of understanding plant biology for human benefit. The increasing application of proteomics technologies to food security, analysis, and safety is emphasised in this review. We are aware, however, that no single approach or technology can address the global food issues. Proteomics-generated information and resources must be integrated and correlated with other omics-based approaches, information, and conventional programs to ensure sufficient food and resources for human development now and in the future.
Abstract:
Research has highlighted the usefulness of the Gilt–Equity Yield Ratio (GEYR) as a predictor of UK stock returns. This paper extends recent studies by endogenising the threshold at which the GEYR switches from being low to being high, or vice versa, thus removing the arbitrariness of the threshold determination employed in the extant literature. It is observed that a decision rule for investing in equities or bonds, based on the forecasts from a regime-switching model, yields higher average returns with lower variability than a static portfolio containing any combination of equities and bonds. A closer inspection of the results reveals that the model has power to forecast when investors should steer clear of equities, although the trading profits generated are insufficient to outweigh the associated transaction costs.
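The underlying GEYR decision rule is straightforward once a threshold is in hand: hold bonds when the ratio of the gilt yield to the equity dividend yield is high, and equities otherwise. A toy fixed-threshold version is sketched below; the paper's contribution is precisely to endogenise this threshold via a regime-switching model, which is not reproduced here, and the function name is our own:

```python
def geyr_positions(gilt_yields, dividend_yields, threshold):
    """Toy GEYR trading rule: hold bonds when the Gilt-Equity Yield
    Ratio exceeds the threshold, equities otherwise.

    gilt_yields     -- long gilt yields, one per period
    dividend_yields -- equity dividend yields, one per period
    threshold       -- fixed switch point (illustrative only; the paper
                       endogenises this via a regime-switching model)
    """
    positions = []
    for g, d in zip(gilt_yields, dividend_yields):
        geyr = g / d
        positions.append('bonds' if geyr > threshold else 'equities')
    return positions
```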
Abstract:
This paper examines the lead–lag relationship between the FTSE 100 index and index futures prices, employing a number of time series models. Using 10-minute observations from June 1996–1997, it is found that lagged changes in the futures price can help to predict changes in the spot price. The best forecasting model is of the error-correction type, allowing for the theoretical difference between spot and futures prices implied by the cost-of-carry relationship. This predictive ability is in turn utilised to derive a trading strategy, which is tested under real-world conditions to search for systematic profitable trading opportunities. Although the model forecasts produce significantly higher returns than a passive benchmark, the model is unable to outperform the benchmark after allowing for transaction costs.
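The two ingredients above can be sketched in a few lines. Under continuous compounding, the cost-of-carry fair futures price is F = S·e^((r−d)·τ), and an error-correction forecast of the spot change combines lagged spot and futures changes with the lagged deviation of the futures price from fair value. The coefficients below are hypothetical placeholders, not the paper's estimates:

```python
import math

def fair_futures_price(spot, r, d, tau):
    """Cost-of-carry fair value F = S * exp((r - d) * tau), with r the
    risk-free rate, d the dividend yield and tau the time to maturity
    (illustrative continuous-compounding form)."""
    return spot * math.exp((r - d) * tau)

def ecm_forecast(d_spot_lag, d_fut_lag, mispricing_lag,
                 a=0.0, b1=0.1, b2=0.3, gamma=-0.2):
    """One-step error-correction forecast of the change in the spot price.
    mispricing_lag is the lagged deviation of the futures price from its
    cost-of-carry fair value; all coefficients here are hypothetical."""
    return a + b1 * d_spot_lag + b2 * d_fut_lag + gamma * mispricing_lag
```

The negative gamma encodes the error-correction mechanism: when the futures price sat above fair value last period, the forecast pulls the spot price upward toward it, and vice versa.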
Abstract:
A glance along the finance shelves at any bookshop reveals a large number of books that seek to show readers how to 'make a million' or 'beat the market' with allegedly highly profitable equity trading strategies. This paper investigates whether useful trading strategies can be derived from popular books of investment strategy, with What Works on Wall Street by James P. O'Shaughnessy used as an example. Specifically, we test whether this strategy would have produced as spectacular a performance in the UK context as the author demonstrated for the US market. As part of our investigation, we highlight a general methodology for determining whether the observed superior performance of a trading rule could be attributed, in part or in its entirety, to data mining. Overall, we find that the O'Shaughnessy rule performs reasonably well in the UK equity market, yielding higher returns than the FTSE All-Share Index, but lower returns than an equally weighted benchmark.
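One way to illustrate the data-mining concern is to ask how often the best of many randomly generated rules would match the observed rule's performance on the same data. The sketch below is a crude stand-in for formal procedures such as White's (2000) reality check, and is not the paper's methodology; all names and parameter values are illustrative:

```python
import random

def best_of_k_random(returns, k, rng):
    """Best total return among k strategies that each hold the asset in a
    random subset of periods -- a crude stand-in for rules found by search."""
    best = float('-inf')
    for _ in range(k):
        total = sum(r for r in returns if rng.random() < 0.5)
        best = max(best, total)
    return best

def data_snooping_pvalue(observed, returns, k=50, trials=200, seed=42):
    """Fraction of trials in which the best of k random rules matches or
    beats the observed rule's total return. A large fraction suggests the
    observed performance may be an artefact of searching over many
    candidate rules. Illustrative only."""
    rng = random.Random(seed)
    beat = sum(best_of_k_random(returns, k, rng) >= observed
               for _ in range(trials))
    return beat / trials
```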
Abstract:
This note describes a simple procedure for removing unphysical temporal discontinuities in ERA-Interim upper stratospheric global mean temperatures in March 1985 and August 1998 that have arisen due to changes in satellite radiance data used in the assimilation. The derived temperature adjustments (offsets) are suitable for use in stratosphere-resolving chemistry-climate models that are nudged (relaxed) to ERA-Interim winds and temperatures. Simulations using a nudged version of the Canadian Middle Atmosphere Model (CMAM) show that the inclusion of the temperature adjustments produces temperature time series that are devoid of the large jumps in 1985 and 1998. Due to its strong temperature dependence, the simulated upper stratospheric ozone is also shown to vary smoothly in time, unlike in a nudged simulation without the adjustments where abrupt changes in ozone occur at the times of the temperature jumps. While the adjustments to the ERA-Interim temperatures remove significant artefacts in the nudged CMAM simulation, spurious transient effects that arise due to water vapour and persist for about 5 yr after the 1979 switch to ERA-Interim data are identified, underlining the need for caution when analysing trends in runs nudged to reanalyses.
Abstract:
In this paper we propose methods for computing Fresnel integrals based on truncated trapezium rule approximations to integrals on the real line, with the trapezium rules modified to take into account poles of the integrand near the real axis. Our starting point is a method for computing the error function of complex argument due to Matta and Reichel (J Math Phys 34:298–307, 1956) and Hunter and Regan (Math Comp 26:539–541, 1972). We construct approximations which we prove are exponentially convergent as a function of N, the number of quadrature points, obtaining explicit error bounds which show that accuracies of 10⁻¹⁵ uniformly on the real line are achieved with N = 12, as confirmed by computations. The approximations are additionally attractive in that they maintain small relative errors for small and large arguments, are analytic on the real axis (echoing the analyticity of the Fresnel integrals), and are straightforward to implement.
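For contrast with the exponentially convergent construction described above, the definitions C(x) = ∫₀ˣ cos(πt²/2) dt and S(x) = ∫₀ˣ sin(πt²/2) dt can be evaluated by a plain composite trapezium rule on [0, x]. Such a rule converges only algebraically in the number of points, which is what motivates the paper's pole-corrected real-line approach. A naive baseline sketch, not the authors' method:

```python
import math

def fresnel_trapezium(x, n=2000):
    """Fresnel integrals C(x) and S(x) by a plain composite trapezium
    rule with n panels on [0, x]. Naive baseline only: error decays like
    O(1/n^2), unlike the exponentially convergent modified rule of the
    paper, which works on the whole real line with pole corrections."""
    h = x / n
    c = 0.5 * (math.cos(0.0) + math.cos(math.pi * x * x / 2))
    s = 0.5 * (math.sin(0.0) + math.sin(math.pi * x * x / 2))
    for i in range(1, n):
        t = i * h
        c += math.cos(math.pi * t * t / 2)
        s += math.sin(math.pi * t * t / 2)
    return c * h, s * h
```

With n = 2000 panels this reproduces the standard values C(1) ≈ 0.7798934 and S(1) ≈ 0.4382591 to a few parts in 10⁷; reaching 10⁻¹⁵ this way would require vastly more points than the N = 12 of the exponentially convergent scheme.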