892 results for Financial market data


Relevance: 100.00%

Abstract:

Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 100.00%

Abstract:

Typical Double Auction (DA) models assume that trading agents are one-way traders. With this limitation, they cannot directly reflect the fact that individual traders in financial markets (the most popular application of the double auction) choose their trading directions dynamically. To address this issue, we introduce the Bi-directional Double Auction (BDA) market, which is populated by two-way traders. Based on experiments under both static and dynamic settings, we find that the allocative efficiency of a static continuous BDA market comes from the rational selection of trading directions and is negatively related to the intelligence of trading strategies. Moreover, we introduce the Kernel trading strategy, designed for general DA markets on the basis of probability density estimation. Our experiments show it outperforms some intelligent DA market trading strategies. Copyright © 2013, International Foundation for Autonomous Agents and Multiagent Systems (www.ifaamas.org). All rights reserved.
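As a sketch of the idea behind a density-estimation trading strategy (not the paper's actual Kernel strategy; the bandwidth, grid resolution, and quoting rule below are illustrative assumptions), a trader can estimate the distribution of recent transaction prices with a Gaussian kernel and quote near its mode:

```python
import math

def gaussian_kde(prices, x, bandwidth):
    # Kernel density estimate of the transaction-price distribution at x.
    n = len(prices)
    return sum(
        math.exp(-0.5 * ((x - p) / bandwidth) ** 2) for p in prices
    ) / (n * bandwidth * math.sqrt(2 * math.pi))

def most_likely_price(prices, bandwidth=0.5):
    # Grid-search the KDE mode over the observed price range;
    # a kernel-style trader could quote near this most likely trade price.
    lo, hi = min(prices), max(prices)
    grid = [lo + (hi - lo) * i / 200 for i in range(201)]
    return max(grid, key=lambda x: gaussian_kde(prices, x, bandwidth))
```

Smaller bandwidths track recent trades more closely but make the estimate noisier; the real strategy's bandwidth choice is not specified here.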

Relevance: 100.00%

Abstract:

This work compares two information sets for forecasting changes in Brazilian GDP: econometric models applied to the univariate GDP series, and the same models augmented with data from the term structure of PRÉ-DI swap interest rates. The objective is to verify, as described in the international literature, whether financial-market variables can improve the predictive power of macroeconomic forecasts, insofar as these data also embed agents' expectations about how the economic scenario will unfold. Additionally, the same procedure applied to the Brazilian data is applied to United States data, to give the study a basis of comparison regarding the data, sample size, and stage of maturity of the respective economies. The work concludes that it was possible to obtain a model in which including the market component yields smaller forecast errors than the purely univariate forecasts; however, the forecasting gains do not show a comparative advantage large enough to capture the market's anticipation of the economic indicator, as in some US cases. The study also shows that, for this work and data sample, the univariate forecasts produced similar results even across different econometric forecasting models.

Relevance: 100.00%

Abstract:

The period from 2007 to 2009 spanned the residential property boom that began in the early 2000s and the property recession that followed the Global Financial Crisis (GFC). Since late 2008, a number of residential property markets have suffered significant falls in house prices, but this has not been consistent across all market sectors. This paper analyses the housing market in Brisbane, Australia, to determine the impact, similarities and differences that the GFC had on a range of residential sectors across a diversified property market. Data analysis provides an overview of residential property prices, sales and listing volumes over the study period, and a comparison of median house price performance across the geographic and socio-economic areas of Brisbane.

Relevance: 100.00%

Abstract:

This paper develops a framework to test whether discrete-valued, irregularly-spaced financial transactions data follow a subordinated Markov process. For that purpose, we consider a specific optional sampling in which a continuous-time Markov process is observed only when it crosses some discrete level. This framework is convenient because it accommodates not only the irregular spacing of transactions data, but also price discreteness. Further, it turns out that, under such an observation rule, the current price duration is independent of previous price durations given the current price realization. A simple nonparametric test then follows by examining whether this conditional independence property holds. Finally, we investigate whether or not bid-ask spreads follow Markov processes using transactions data from the New York Stock Exchange. The motivation lies in the fact that asymmetric information models of market microstructure predict that the Markov property does not hold for the bid-ask spread. The results are mixed, in the sense that the Markov assumption is rejected for three out of the five stocks we have analyzed.
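The conditional-independence property can be illustrated with a toy check (a simplified stand-in for the paper's nonparametric test, using simulated rather than NYSE data): under the null, durations at a given price level should show no lag correlation:

```python
import math
import random

def lag_correlation(durations):
    # Sample correlation between consecutive price durations.
    xs, ys = durations[:-1], durations[1:]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

# Under the subordinated-Markov null, the current price duration is
# independent of past durations given the current price, so for a fixed
# price level the lag correlation of durations should be near zero.
# Here we simulate i.i.d. exponential durations as the null case.
rng = random.Random(42)
durations = [rng.expovariate(1.0) for _ in range(2000)]
rho = lag_correlation(durations)
```

The paper's actual test is nonparametric and conditions on the price realization; this sketch only shows the kind of statistic such a test examines.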

Relevance: 100.00%

Abstract:

International financial institutions have promoted financial regulatory transparency, or the publication by supervisors of financial industry data. Financial regulatory transparency enhances market stability and increases democratic legitimacy.
• We introduce a new index of financial regulatory data transparency: the FRT Index. It measures how countries report basic macroprudential data about their financial systems to international financial institutions. The Index covers 68 high-income and emerging-market economies over 22 years (1990-2011).
• We find a number of striking trends over this period. European Union members are generally more opaque than other high-income countries. This finding is especially relevant given efforts to create an EU capital markets union.
• Globally, financial regulatory data transparency has increased. However, there is considerable variation. Some countries have become significantly more transparent, while others have become much more opaque. Reporting tends to decline during financial crises.
• We propose that the EU institutions take on a greater role in coordinating and possibly enforcing the reporting of bank and non-bank institution data. As in the United States, a reporting requirement should be part of any EU general deposit insurance scheme.

Relevance: 100.00%

Abstract:

The Securities and Exchange Commission (SEC) in the United States, and in particular its immediate past chairman, Christopher Cox, has been actively promoting an upgrade of the EDGAR system for disseminating filings. The new generation of information provision has been dubbed by Chairman Cox "Interactive Data" (SEC, 2006). In October this year the Office of Interactive Disclosure was created (http://www.sec.gov/news/press/2007/2007-213.htm). The focus of this paper is to examine the way in which the non-professional investor has been constructed by various actors. We examine the manner in which Interactive Data has been sold as the panacea for financial market 'irregularities' by the SEC and others. The academic literature shows almost no evidence of research on non-professional investors in any real sense (Young, 2006). Both this literature and the behaviour of representatives of institutions such as the SEC and FSA appear to find it convenient to construct this class of investor in a particular form and to speak for them. We theorise the activities of the SEC, and its chairman in particular, over a period of about three years, both before and after the 'credit crunch'. Our approach is to examine a selection of the policy documents released by the SEC and other interested parties, and the statements made by some of the policy makers and regulators central to the programme to advance the socio-technical project that is Interactive Data. We adopt insights from actor-network theory (ANT), and more particularly the sociology of translation (Callon, 1986; Latour, 1987, 2005; Law, 1996, 2002; Law & Singleton, 2005), to show how individuals and regulators have acted as spokespersons for this malleable class of investor. We theorise the processes of accountability to investors and others, and in so doing reveal the regulatory bodies taking the regulated for granted.
The possible implications of technological developments in digital reporting have also been identified by the CEOs of the six biggest audit firms in a discussion document on the role of accounting information and audit in the future of global capital markets (DiPiazza et al., 2006). The potential for digital reporting enabled through XBRL to "revolutionize the entire company reporting model" (p. 16) is discussed, and they conclude that the new model "should be driven by the wants of investors and other users of company information, ..." (p. 17; emphasis in the original). Here, rather than examine the somewhat elusive and vexing question of whether adding interactive functionality to 'traditional' reports can achieve the benefits claimed for non-professional investors, we consider the rhetorical and discursive moves by which the SEC and others have presented such developments as providing clearer reporting and accountability standards and as serving the interests of this constructed and largely unknown group: the non-professional investor.

Relevance: 100.00%

Abstract:

This paper deals with the economics of gasification facilities in general and IGCC power plants in particular. Regarding the prospects of these systems, passing the technological test is one thing; passing the economic test can be quite another. Traditional valuations assume constant input and/or output prices. Since this is hardly realistic, we allow for uncertainty in prices, and we naturally look to the markets where many of the products involved are regularly traded. Futures markets on commodities are particularly useful for valuing uncertain future cash flows, so revenues and variable costs can be assessed by means of sound financial concepts and actual market data. On the other hand, these complex systems provide a number of flexibility options (e.g., to choose among several inputs, outputs, or modes of operation). Typically, flexibility contributes significantly to the overall value of real assets; indeed, maximizing the asset value requires the optimal exercise of any flexibility option available. Yet the economic value of flexibility is elusive, the more so under price uncertainty, and the right choice of input fuels and/or output products is a main concern for facility managers. As a particular application, we deal with the valuation of input flexibility, following the Real Options approach. In addition to economic variables, we also address technical and environmental issues such as energy efficiency, utility performance characteristics and emissions (note that carbon constraints are looming). Lastly, a brief introduction to some stochastic processes suitable for valuation purposes is provided.
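A minimal Monte Carlo sketch of what valuing input flexibility can look like (illustrative assumptions throughout: two fuels following driftless geometric Brownian motion, no discounting, unit fuel consumption per period; this is not the paper's model or calibration):

```python
import math
import random

def input_flexibility_value(s0_a, s0_b, sigma_a, sigma_b,
                            periods, n_paths, seed=0):
    # A flexible plant burns whichever of two fuels is cheaper each
    # period; an inflexible plant is locked into fuel A. The value of
    # input flexibility is the expected fuel-cost saving.
    rng = random.Random(seed)
    cost_flex = cost_fixed = 0.0
    for _ in range(n_paths):
        a, b = s0_a, s0_b
        for _ in range(periods):
            # Driftless GBM step (risk-neutral, illustrative only).
            a *= math.exp(-0.5 * sigma_a ** 2 + sigma_a * rng.gauss(0, 1))
            b *= math.exp(-0.5 * sigma_b ** 2 + sigma_b * rng.gauss(0, 1))
            cost_flex += min(a, b)
            cost_fixed += a
    return (cost_fixed - cost_flex) / n_paths
```

With zero volatility the two plants incur identical costs, so the flexibility value collapses to zero; under uncertainty the option to switch has strictly positive value.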

Relevance: 100.00%

Abstract:

We present a mathematically rigorous metric that relates the achievable quality of service (QoS) of a real-time analytics service to the server energy cost of offering it. Using a new iso-QoS evaluation methodology, we scale server resources to meet QoS targets and directly rank servers in terms of their energy efficiency and, by extension, cost of ownership. Our metric and method are platform-independent and enable fair comparison of datacenter compute servers with significant architectural diversity, including micro-servers. We deploy our metric and methodology to compare three servers running financial option pricing workloads on real-life market data. We find that server ranking is sensitive to data inputs and to the desired QoS level, and that although scale-out micro-servers can be up to two times more energy-efficient than conventional heavyweight servers for the same target QoS, they are still six times less energy-efficient than high-performance computational accelerators.
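The ranking idea can be sketched as follows; the throughput and power figures are hypothetical, chosen only to mirror the efficiency ratios quoted above, and the real metric is more involved than a simple throughput-per-watt ratio:

```python
def requests_per_joule(throughput_at_qos, power_watts):
    # Energy efficiency once each server has been scaled to meet the
    # same QoS target: completed requests per joule of server energy.
    return throughput_at_qos / power_watts

# Hypothetical platforms (req/s at the QoS target, sustained watts).
servers = {
    "micro_server": {"throughput": 400.0,   "power": 40.0},
    "heavyweight":  {"throughput": 1500.0,  "power": 300.0},
    "accelerator":  {"throughput": 18000.0, "power": 300.0},
}

# Rank platforms from most to least energy-efficient at iso-QoS.
ranking = sorted(
    servers,
    key=lambda s: requests_per_joule(servers[s]["throughput"],
                                     servers[s]["power"]),
    reverse=True,
)
```

Because both throughput-at-QoS and power are measured at the same QoS target, the comparison stays fair across very different architectures.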

Relevance: 100.00%

Abstract:

Marginal Expected Shortfall (MES) is an approach to measuring the systemic risk that financial institutions face. It estimates how strongly systemic events (poor market performance, beyond a border of 1.6 standard deviations) are expected to affect the market capitalization of a particular firm. The concept was developed in the late 2000s and is widely used for cross-country comparisons of financial firms. To generalize the technique, it is often applied to market data quoted in non-domestic currencies for some financial firms. That may leave currency noise in the results, as we show for 77 UK financial firms in our analysis between 2001 and 2014.
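A minimal sketch of the MES computation described above (assuming the systemic-event threshold is the market mean minus 1.6 standard deviations; the exact definition used in the study may differ):

```python
import statistics

def marginal_expected_shortfall(market_returns, firm_returns, k=1.6):
    # MES: the average firm return on "systemic" days, taken here as
    # days when the market return falls more than k standard deviations
    # below its mean.
    mu = statistics.mean(market_returns)
    sd = statistics.pstdev(market_returns)
    tail = [f for m, f in zip(market_returns, firm_returns)
            if m < mu - k * sd]
    return statistics.mean(tail) if tail else float("nan")
```

If firm returns are first converted from a non-domestic currency, exchange-rate movements enter both series and can distort the tail average, which is the currency noise the abstract refers to.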

Relevance: 100.00%

Abstract:

The research problem selected for this study is one of the important issues in the field of the financial market and its marketing dimensions, on which researchers and academicians encourage further study. This research may be relevant for its possible findings, which may help financial institutions (FIs) frame a successful market segmentation approach that turns their dissatisfied and 'merely' satisfied customers into 'delighted' customers, which in turn can result in better savings mobilisation. The household segments may also benefit from the findings if they bring about an attitudinal change in their savings behaviour. The importance of the study may be briefly highlighted in the following points. The study examines existing theories on market segmentation by FIs, and its findings might supplement those theories. It brings to light certain clues to strengthen the market segmentation approach of FIs. It throws light on existing beliefs and perceptions about customer behaviour, which may be useful in effecting some positive changes in the market segmentation approach of FIs. It suggests certain relationships between market segmentation variables and customer behaviour in the context of the marketing of financial products by FIs. Finally, it supplements the existing knowledge on different dimensions of market segmentation in the financial market, which might encourage future research in the field.

Relevance: 100.00%

Abstract:

We propose and estimate a financial distress model that explicitly accounts for the interactions, or spill-over effects, between financial institutions, through the use of a spatial contiguity matrix that is built from financial network data on interbank transactions. This setup of the financial distress model allows for the empirical validation of the importance of network externalities in determining financial distress, in addition to institution-specific and macroeconomic covariates. The relevance of this specification is that it simultaneously incorporates micro-prudential factors (Basel II) as well as macro-prudential and systemic factors (Basel III) as determinants of financial distress. Results indicate that network externalities are an important determinant of the financial health of financial institutions. The parameter that measures the effect of network externalities is both economically and statistically significant, and its inclusion as a risk factor reduces the importance of firm-specific variables such as the size or degree of leverage of the financial institution. In addition, we analyze the policy implications of the network factor model for capital requirements and deposit insurance pricing.
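The network-externality term can be illustrated with a toy spatial autoregressive fixed point (a sketch under assumed notation, not the authors' estimated model): each institution's distress is its own risk plus a spill-over proportional to the distress of its network neighbours:

```python
def network_distress(own_risk, W, rho, iters=200):
    # Fixed-point solve of d = own_risk + rho * W d, the spatial
    # autoregressive form: distress spills over through the interbank
    # network. W is a row-normalized exposure matrix; |rho| < 1
    # guarantees convergence of the iteration.
    n = len(own_risk)
    d = list(own_risk)
    for _ in range(iters):
        d = [own_risk[i] + rho * sum(W[i][j] * d[j] for j in range(n))
             for i in range(n)]
    return d
```

For two banks fully exposed to each other with rho = 0.5, bank 1's unit shock is amplified to 4/3 while bank 2, with no shock of its own, ends up at 2/3 purely through the network channel.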

Relevance: 100.00%

Abstract:

In this work we compared the estimates of the parameters of ARCH models using a complete Bayesian method and an empirical Bayesian method, adopting a non-informative prior distribution and an informative prior distribution, respectively. We also considered a reparameterization of those models in order to map the parameter space into the real space; this procedure permits choosing normal prior distributions for the transformed parameters. The posterior summaries were obtained using Markov chain Monte Carlo (MCMC) methods. The methodology was evaluated using the Telebras series from the Brazilian financial market. The results show that the two methods are able to fit ARCH models with different numbers of parameters. The empirical Bayesian method provided a more parsimonious model and a better fit to the data than the complete Bayesian method.
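The reparameterization can be sketched for an ARCH(1) model (a generic illustration, not the authors' code): mapping alpha0 and alpha1 to the real line lets one place normal priors on the transformed parameters and run a standard MCMC sampler on the resulting log-likelihood:

```python
import math

def to_constrained(t0, t1):
    # Map unconstrained reals to ARCH(1) constraints:
    # alpha0 = exp(t0) > 0, alpha1 = logistic(t1) in (0, 1),
    # so normal priors are well-defined on (t0, t1).
    return math.exp(t0), 1.0 / (1.0 + math.exp(-t1))

def arch1_loglik(t0, t1, returns):
    # Gaussian ARCH(1) log-likelihood evaluated in the transformed
    # space; an MCMC sampler would combine this with normal priors
    # on (t0, t1) to form the posterior.
    a0, a1 = to_constrained(t0, t1)
    ll, prev = 0.0, 0.0
    for r in returns:
        h = a0 + a1 * prev ** 2          # conditional variance h_t
        ll += -0.5 * (math.log(2 * math.pi * h) + r ** 2 / h)
        prev = r
    return ll
```

Working in (t0, t1) keeps every proposed MCMC draw inside the admissible parameter region automatically, which is the point of the transformation.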