122 results for USD


Relevance:

20.00%

Publisher:

Abstract:

Using a high-frequency data set of the spot Australian dollar/US dollar exchange rate, this study examines the distribution of quotes and returns across the 24-hour trading "day". Employing statistical methods for measuring long-term dependence in time series, we find evidence of time-varying dependence and volatility that aligns with the opening and closing of markets. This variation is attributed to the effects of liquidity and the price-discovery actions of dealers.
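One common statistical tool for measuring long-term dependence in such series is the rescaled-range (R/S) statistic and the associated Hurst exponent. The abstract does not specify the authors' exact method, so the following is a minimal illustrative sketch on simulated i.i.d. returns (window sizes and data are assumptions):

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic for one window: range of cumulative deviations over the std dev."""
    x = np.asarray(x, dtype=float)
    dev = np.cumsum(x - x.mean())
    r = dev.max() - dev.min()
    s = x.std()
    return r / s if s > 0 else 0.0

def hurst_exponent(series, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate H from the slope of log(R/S) against log(window size)."""
    series = np.asarray(series, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        chunks = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
        rs = np.mean([rescaled_range(c) for c in chunks if len(c) == n])
        log_n.append(np.log(n))
        log_rs.append(np.log(rs))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(0)
returns = rng.standard_normal(4096)   # i.i.d. returns: H should sit near 0.5
print(round(hurst_exponent(returns), 2))
```

Values of H above 0.5 indicate persistence (long memory); time variation across the trading day would show up as H varying with the intraday window examined.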

Relevance:

20.00%

Publisher:

Abstract:

This work aims to price EUR/BRL currency options as a function of their more liquid components, USD/BRL and EUR/USD. To this end, we adapt and calibrate a stochastic volatility model using observed market data. To compare the values estimated by the model with those traded in the market, we simulate arbitrage strategies among the three currency pairs using historical exchange-rate data. The work suggests that there is a significant bias in the degree of kurtosis of the EUR/BRL return distribution implied by market data.
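In a lognormal (Black-Scholes) setting, the link between the cross pair and its components is mechanical: since EUR/BRL = EUR/USD x USD/BRL, log-returns add, so the cross variance is the sum of the component variances plus a correlation term. A minimal sketch (the volatilities and correlation below are hypothetical, not taken from the study):

```python
import math

def cross_vol(vol_eurusd, vol_usdbrl, rho):
    """Implied volatility of EUR/BRL from its components.

    EUR/BRL = EUR/USD * USD/BRL, so log-returns add and variances
    combine with a correlation cross term (lognormal assumption).
    """
    var = vol_eurusd**2 + vol_usdbrl**2 + 2 * rho * vol_eurusd * vol_usdbrl
    return math.sqrt(var)

# hypothetical at-the-money vols and correlation
print(round(cross_vol(0.08, 0.15, 0.30), 2))  # → 0.19
```

Deviations of market EUR/BRL option prices from this mechanical relationship are exactly what the simulated arbitrage strategies in the study would exploit.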

Relevance:

20.00%

Publisher:

Abstract:

This work proposes an instrument capable of absorbing shocks in the BRL/USD pair, guaranteeing its holder the option to convert between the two currencies at a recently observed rate. The Volatility Triggered Range Forward resembles an ordinary forward, except that its delivery price is not known initially; it is set at the moment a predetermined volatility level is reached in the exchange rate during the life of the instrument. Its settlement schedule can be defined for any number of periods. Its pricing and risk control are based on a trinomial tree weighted between two possible volatility regimes. These regimes are identified from a study of the BRL/USD series between 2003 and 2009, based on a Switching Autoregressive Conditional Heteroskedasticity (SWARCH) model.
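The regime-weighted tree is specific to this work, but the underlying machinery, backward induction on a trinomial lattice, is standard. Below is a sketch of a single-regime Boyle-style trinomial tree for a European payoff; the two-regime weighting described in the abstract would average prices from trees built under each volatility regime (how the weighting enters is an assumption here):

```python
import math

def trinomial_price(S, K, r, sigma, T, steps,
                    payoff=lambda s, k: max(s - k, 0.0)):
    """European option on a trinomial tree with a single volatility regime."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(2 * dt))      # up factor; down is 1/u
    a = math.exp(r * dt / 2)
    b = math.exp(sigma * math.sqrt(dt / 2))
    pu = ((a - 1 / b) / (b - 1 / b)) ** 2        # risk-neutral up probability
    pd = ((b - a) / (b - 1 / b)) ** 2            # down probability
    pm = 1 - pu - pd                             # middle probability
    disc = math.exp(-r * dt)
    # terminal payoffs on the 2*steps+1 nodes, highest price first
    values = [payoff(S * u ** j, K) for j in range(steps, -steps - 1, -1)]
    # backward induction through the lattice
    for step in range(steps, 0, -1):
        values = [disc * (pu * values[i] + pm * values[i + 1] + pd * values[i + 2])
                  for i in range(2 * step - 1)]
    return values[0]

price = trinomial_price(S=100, K=100, r=0.05, sigma=0.2, T=1.0, steps=200)
print(round(price, 2))
```

With these inputs the tree converges to the Black-Scholes call value of about 10.45, which is a useful sanity check before layering on regime weighting.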

Relevance:

20.00%

Publisher:

Abstract:

We provide a comprehensive study of out-of-sample forecasts for the EUR/USD exchange rate based on multivariate macroeconomic models and forecast combinations. In addition to standard loss-minimization measures, we use profit-maximization measures based on directional accuracy and trading strategies. When comparing predictive accuracy and profit measures, we use tests that are free of data-snooping bias. The results indicate that forecast combinations, in particular those based on principal components of forecasts, improve on benchmark trading strategies, although the excess return per unit of deviation is limited.
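A sketch of the two ingredients named above, a principal-component forecast combination and a directional-accuracy measure, on simulated data. The sign alignment and rescaling of the principal component are assumptions; the paper's exact construction may differ:

```python
import numpy as np

def directional_accuracy(actual, forecast):
    """Share of periods where the forecast has the same sign as the outcome."""
    return float(np.mean(np.sign(actual) == np.sign(forecast)))

def combine_pc(forecasts):
    """Combine a (T x k) panel of forecasts via its first principal component,
    sign-aligned and rescaled against the equal-weight mean combination."""
    f = np.asarray(forecasts, dtype=float)
    centered = f - f.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    pc = centered @ vt[0]
    mean_comb = f.mean(axis=1)
    if np.corrcoef(pc, mean_comb)[0, 1] < 0:   # resolve PC sign ambiguity
        pc = -pc
    if pc.std() > 0:
        pc = pc * (mean_comb.std() / pc.std())
    return pc + mean_comb.mean()

rng = np.random.default_rng(1)
actual = 0.01 * rng.standard_normal(200)                        # "true" returns
panel = actual[:, None] + 0.01 * rng.standard_normal((200, 5))  # noisy forecasts
comb = combine_pc(panel)
print(round(directional_accuracy(actual, comb), 2))
```

The intuition is that the first principal component emphasizes the signal common to the individual forecasts while averaging out their idiosyncratic noise.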

Relevance:

10.00%

Publisher:

Abstract:

Sutchi catfish (Pangasianodon hypophthalmus), known more widely by the Vietnamese name 'Tra', is an economically important freshwater fish in the Mekong Delta in Vietnam that constitutes an important food resource. Artificial propagation technology for Tra catfish has only recently been developed along the main branches of the Mekong River, where more than 60% of the local human population participate in fishing or aquaculture. Extensive support for catfish culture in general, and that of Tra (P. hypophthalmus) in particular, has been provided by the Vietnamese government to increase the scale of production and to develop international export markets. In 2006, total Vietnamese catfish exports reached approximately 286,602 metric tons (MT), valued at USD 736.87 million, with a number of large new export destinations being developed. The total value of production from catfish culture has been predicted to increase to approximately USD 1 billion by 2020. While freshwater catfish culture in Vietnam has a promising future, concerns have been raised about the long-term quality of fry and the effectiveness of current brood stock management practices, issues that have been largely neglected to date. In this study, four DNA markers (microsatellite loci: CB4, CB7, CB12 and CB13) developed specifically for Tra (P. hypophthalmus) in an earlier study were applied to examine the genetic quality of artificially propagated Tra fry in the Mekong Delta in Vietnam. The goals of the study were to assess: (i) how well available levels of genetic variation in the Tra brood stock used for artificial propagation in the Mekong Delta of Vietnam (breeders from three private hatcheries and Research Institute of Aquaculture No2 (RIA2) founders) have been conserved; and (ii) whether or not genetic diversity had declined significantly over time in a stock improvement program for Tra catfish at RIA2.
A secondary issue addressed was how genetic markers could best be used to assist industry development. DNA was extracted from fins of catfish collected from the two main branches of the Mekong River in Vietnam, three private hatcheries, and samples from the Tra improvement program at RIA2. Study outcomes: (i) genetic diversity estimates for Tra brood stock samples were similar to, and slightly higher than, wild reference samples; in addition, the relative contribution of breeders to fry in commercial private hatcheries strongly suggests that the true Ne is likely to be significantly less than the number of breeders used; (ii) in the stock improvement program for Tra catfish at RIA2, no significant differences were detected in gene frequencies among generations (FST = 0.021, P = 0.036 > 0.002 after Bonferroni correction), and only small differences were observed in allele frequencies among sample populations. To date, genetic markers have not been applied in the Tra catfish industry, but in the current project they were used to evaluate the levels of genetic variation in the Tra catfish selective breeding program at RIA2 and to examine correlations between genetic markers and trait variation. While no associations were detected using only four loci, the analysis provided training in the practical application of molecular markers in aquaculture in general, and in Tra culture in particular.
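The FST statistic reported above measures how much genetic variation is partitioned among subpopulations. A minimal sketch of Wright's FST for one locus from per-subpopulation allele frequencies (the frequencies below are hypothetical, not from the study):

```python
import numpy as np

def fst(allele_freqs):
    """Wright's F_ST for one locus from allele frequencies per subpopulation.

    allele_freqs: (n_subpops x n_alleles) array, rows summing to 1.
    F_ST = (H_T - H_S) / H_T, where H_S is the mean within-subpopulation
    expected heterozygosity and H_T uses the pooled mean frequencies.
    """
    p = np.asarray(allele_freqs, dtype=float)
    hs = np.mean(1 - np.sum(p ** 2, axis=1))   # mean within-subpop heterozygosity
    pbar = p.mean(axis=0)                       # pooled allele frequencies
    ht = 1 - np.sum(pbar ** 2)                  # total expected heterozygosity
    return float((ht - hs) / ht)

# hypothetical frequencies at one microsatellite locus in two hatchery stocks
freqs = [[0.7, 0.2, 0.1],
         [0.5, 0.3, 0.2]]
print(round(fst(freqs), 3))  # → 0.027
```

Values this small, like the FST = 0.021 reported among generations, indicate that almost all variation lies within rather than between the groups compared.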

Relevance:

10.00%

Publisher:

Abstract:

Soil organic carbon sequestration rates over 20 years, based on the Intergovernmental Panel on Climate Change (IPCC) methodology, were combined with local economic data to determine the potential for soil C sequestration in wheat-based production systems on the Indo-Gangetic Plain (IGP). The C sequestration potential of rice-wheat systems of India on conversion to no-tillage is estimated to be 44.1 Mt C over 20 years. Implementing no-tillage practices in maize-wheat and cotton-wheat production systems would yield an additional 6.6 Mt C. This offset is equivalent to 9.6% of India's annual greenhouse gas emissions (519 Mt C) from all sectors (excluding land use change and forestry), or less than one percent per annum. The economic analysis was summarized as carbon supply curves expressing the total additional C accumulated over 20 years for a price per tonne of carbon sequestered ranging from zero to USD 200. At a carbon price of USD 25 per Mg C, 3 Mt C (7% of the soil C sequestration potential) could be sequestered over 20 years through the implementation of no-till cropping practices in the rice-wheat systems of the Indian states of the IGP, increasing to 7.3 Mt C (17% of the soil C sequestration potential) at USD 50 per Mg C. Maximum levels of sequestration could be attained with carbon prices approaching USD 200 per Mg C for the states of Bihar and Punjab. At this carbon price, a total of 34.7 Mt C (79% of the estimated C sequestration potential) could be sequestered over 20 years across the rice-wheat region of India, with Uttar Pradesh contributing 13.9 Mt C.
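The carbon supply curve described above can be tabulated from the figures in the abstract and read off at intermediate prices. Linear interpolation between the reported points is an assumption here, since only the tabulated values appear in the text:

```python
import numpy as np

# Carbon supply curve for India's rice-wheat systems, from the figures in the
# abstract: Mt C sequestered over 20 years at each carbon price (USD per Mg C).
# The zero point at a zero price is an assumption.
price = np.array([0.0, 25.0, 50.0, 200.0])   # USD per Mg C
mt_c = np.array([0.0, 3.0, 7.3, 34.7])       # Mt C over 20 years

def supply_at(p):
    """Linearly interpolate the supply curve at price p (USD per Mg C)."""
    return float(np.interp(p, price, mt_c))

print(round(supply_at(100), 1))  # between the USD 50 and USD 200 points
```

The convex shape (steeply rising quantities at higher prices) reflects the increasing marginal cost of adopting no-till across progressively less suitable land.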

Relevance:

10.00%

Publisher:

Abstract:

Modernized GPS and GLONASS, together with the new GNSS systems BeiDou and Galileo, offer code and phase ranging signals on three or more carriers. Traditionally, dual-frequency GPS code and/or phase measurements are linearly combined to eliminate the effect of ionospheric delays in positioning and analysis. This treatment has limitations when processing signals on three or more frequencies from more than one system, and can hardly be adapted to the growing variety of receivers tracking a broad range of signals. In this contribution, a generalized positioning model that is independent of the navigation system and of the number of carriers is proposed, suitable for both single- and multi-site data processing. To synchronize the different signals, uncalibrated signal delays (USD) are defined in a general way to compensate for the signal-specific offsets in code and phase observations, respectively. In addition, ionospheric delays are carefully included in the parameterization. Based on an analysis of its algebraic structure, this generalized positioning model is further refined with a set of constraints to regularize the datum deficiency of the observation equation system. With this new model, uncalibrated signal delays and ionospheric delays are derived for both GPS and BeiDou using a large data set. Numerical results demonstrate that, with a limited number of stations, the uncalibrated code delays (UCD) are determined to a precision of about 0.1 ns for GPS and 0.4 ns for BeiDou signals, while the uncalibrated phase delays (UPD) for L1 and L2 are generated with 37 stations evenly distributed in China for GPS, with a consistency of about 0.3 cycles. Additional experiments assessing the performance of this model in point positioning with mixed frequencies from mixed constellations are analyzed, in which the USD parameters are fixed to our generated values. The results are evaluated in terms of both positioning accuracy and convergence time.
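The traditional dual-frequency treatment mentioned at the start of the abstract is the ionosphere-free linear combination, which exploits the 1/f^2 scaling of the first-order ionospheric delay. A minimal sketch (the ionospheric term below is a hypothetical test value):

```python
# Dual-frequency ionosphere-free code combination: the first-order
# ionospheric delay scales as 1/f^2, so it cancels in
#   P_IF = (f1^2 * P1 - f2^2 * P2) / (f1^2 - f2^2)

F1 = 1575.42e6  # GPS L1 carrier frequency, Hz
F2 = 1227.60e6  # GPS L2 carrier frequency, Hz

def ionosphere_free(p1, p2, f1=F1, f2=F2):
    """Ionosphere-free combination of two code pseudoranges (metres)."""
    return (f1**2 * p1 - f2**2 * p2) / (f1**2 - f2**2)

# check: with a geometric range rho and a delay of d / f^2 on each frequency,
# the combination recovers rho exactly (to floating-point precision)
rho, d = 20_000_000.0, 1.0e17   # d is a hypothetical ionospheric term
p1 = rho + d / F1**2
p2 = rho + d / F2**2
print(ionosphere_free(p1, p2) - rho)   # ~0: first-order ionosphere removed
```

The limitation the abstract points to is visible here: the combination is hard-wired to one pair of frequencies of one system, which is what the generalized USD parameterization avoids.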

Relevance:

10.00%

Publisher:

Abstract:

The topic of “the cloud” has attracted significant attention throughout the past few years (Cherry 2009; Sterling and Stark 2009) and, as a result, academics and trade journals have created several competing definitions of “cloud computing” (e.g., Motahari-Nezhad et al. 2009). Underpinning this article is the definition put forward by the US National Institute of Standards and Technology, which describes cloud computing as “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction” (Garfinkel 2011, p. 3). Despite the lack of consensus about definitions, however, there is broad agreement on the growing demand for cloud computing. Some estimates suggest that spending on cloud-related technologies and services in the next few years may climb as high as USD 42 billion/year (Buyya et al. 2009).

Relevance:

10.00%

Publisher:

Abstract:

Ongoing innovation in digital animation and visual effects technologies has provided new opportunities for stories to be visually rendered in ways never before possible. Films featuring animation and visual effects continue to perform well at the box office, proving to be highly profitable projects. The Avengers (Whedon, 2012) holds the current record for opening-weekend sales, having earned USD 207,438,708 in its opening weekend and USD 623,357,910 gross at the time of writing. Life of Pi (Lee, 2012) has grossed USD 608,791,063 at the time of writing (Box Office Mojo, 2013). With so much creative potential and a demonstrable ability to generate substantial revenue, the animation and visual effects industry, otherwise known as the Post, Digital and Visual Effects (PDV) industry, has become significant to the future growth and stability of the Australian film industry as a whole.

Relevance:

10.00%

Publisher:

Abstract:

Problem, research strategy and findings: On January 10, 2011, the town of Grantham, Queensland (Australia), was inundated by a flash flood in which 12 of the town's 370 residents drowned. The overall damage bill in Queensland was AUD 2.38 billion (USD 2.4 billion) with 35 deaths, and more than three-quarters of the state was declared a flood disaster zone. In this study, we focus on the unusual, even rare, decision to relocate Grantham in March 2011. The Lockyer Valley Regional Council (LVRC) acquired a 377-hectare (932-acre) site to enable a voluntary swap of equivalent-sized lots. In addition, planning regulations were set aside to streamline the relocation of a portion of the town. We review the natural hazard literature as it relates to community relocation, state and local government documents related to Grantham, and reports and newspaper articles related to the flood. We also analyze data from interviews with key stakeholders. We document the process of community relocation, assess the relocation process in Grantham against best practice, and examine whether the process of community relocation can be scaled up and whether the Grantham relocation is an example of good planning or good politics. Takeaway for practice: Our study reveals two key messages for practice. Community relocation (albeit a small one) is possible, and the process can be done quickly; some Grantham residents moved into their new, relocated homes in December 2012, just 11 months after the flood. Moreover, existing planning regulations can be a hindrance to quick action; political leadership, particularly at the local level, is key to implementing relocation.

Relevance:

10.00%

Publisher:

Abstract:

The late twentieth century witnessed the transformation of the global economy beyond the fixed geographic boundaries of the nation-state system to one dominated by financial centers, global markets, and transnational firms. In the two decades to 2011, cross-border philanthropy from OECD Development Assistance Committee (DAC) donor countries to the developing world grew from approximately USD 5 billion to USD 32 billion (OECD, n.d.),[1] with some estimates for 2011 as high as USD 59 billion (Center for Global Prosperity, 2013). This is only part of cross-border philanthropy, which also includes remittances from migrant communities, social-media-enabled global fundraising, and medical research collaborations.

Relevance:

10.00%

Publisher:

Abstract:

Modeling and forecasting implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk-management activities, all of which require accurate volatility inputs. The task has become more challenging since the 1987 stock market crash, as implied volatilities recovered from stock index options exhibit two patterns: the volatility smirk (skew) and the volatility term structure. Examined jointly, these form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS, extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and out-of-the-money put-call options on the FTSE 100 index, nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. The constant-volatility model fails to explain the variation in the rich IVS. Three factors are found to explain about 69-88% of the variance in the IVS: on average, 56% is explained by the level factor, 15% by the term-structure factor, and a further 7% by the jump-fear factor. The second essay proposes a quantile regression model of the contemporaneous asymmetric return-volatility relationship, generalizing the model of Hibbert et al. (2008). The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distribution, increasing monotonically from the median quantile to the uppermost quantile (95%); OLS therefore underestimates this relationship at upper quantiles.
Additionally, the asymmetric relationship is more pronounced for the smirk-adjusted (skew-adjusted) volatility index measure than for the old volatility index measure. The volatility indices rank in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new VDAX volatility index for forecasting daily Value-at-Risk (VaR) and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are backtested over 1992-2009 using unconditional coverage, independence, conditional coverage, and quadratic-score tests. The VDAX is found to subsume almost all the information required for daily VaR forecasts for a portfolio of the DAX 30 index; implied-volatility VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. Three factors are found to explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, with bi-directional causality between the factors implied by EUR and USD swaption IVs; these factors respond to each other's shocks, whereas, surprisingly, GBP does not affect them. Finally, calibration of the string market model shows that it can efficiently reproduce (or forecast) the volatility surface for each swaption market.
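The variance shares attributed to level, term-structure, and jump-fear factors are the kind of output a principal-component analysis of the IV panel produces. A sketch of that computation on simulated data (the factor structure below is illustrative, not the thesis data):

```python
import numpy as np

def variance_explained(panel, k=3):
    """Fraction of total variance captured by the first k principal
    components of a (T x N) panel, e.g. implied-volatility changes
    across strikes and maturities."""
    x = np.asarray(panel, dtype=float)
    x = x - x.mean(axis=0)
    s = np.linalg.svd(x, compute_uv=False)   # singular values
    var = s ** 2
    return float(var[:k].sum() / var.sum())

rng = np.random.default_rng(2)
T, N = 500, 12
level = rng.standard_normal((T, 1)) @ np.ones((1, N))           # parallel shifts
slope = rng.standard_normal((T, 1)) @ np.linspace(-1, 1, N)[None, :]  # tilts
noise = 0.3 * rng.standard_normal((T, N))
panel = level + 0.5 * slope + noise
print(round(variance_explained(panel, k=3), 2))
```

With two genuine common factors plus noise, the first three components absorb most of the panel's variance, mirroring the 69-88% (equities) and 94-97% (swaptions) figures reported above.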

Relevance:

10.00%

Publisher:

Abstract:

The worldwide research in nanoelectronics is motivated by the fact that scaling of MOSFETs by the conventional top-down approach will not continue forever, due to fundamental limits imposed by physics, even if it is delayed for some more years. The research community in this domain has become largely multidisciplinary, trying to discover novel transistor structures built with novel materials so that the semiconductor industry can continue to follow its projected roadmap. However, setting up and running a nanoelectronics facility for research is hugely expensive. It is therefore a common model to set up a central networked facility that can be shared by a large number of users across the research community. The Centres for Excellence in Nanoelectronics (CEN) at the Indian Institute of Science, Bangalore (IISc) and the Indian Institute of Technology, Bombay (IITB) are such central networked facilities, set up in 2005 with funding of about USD 20 million from the Department of Information Technology (DIT), Ministry of Communications and Information Technology (MCIT), Government of India. The Indian Nanoelectronics Users Program (INUP) is a missionary program not only to spread awareness and provide training in nanoelectronics but also to provide easy access to the latest facilities at CEN at IISc and IITB for the wider nanoelectronics research community in India. This program, also funded by MCIT, aims to train researchers by conducting workshops and hands-on training programs and by providing access to the CEN facilities. It is a unique program aiming to expedite nanoelectronics research in the country: funding for projects proposed by researchers from around India has prior financial approval from the government and requires only technical approval by the IISc/IITB team. This paper discusses the objectives of INUP, gives brief descriptions of the CEN facilities and the training programs conducted by INUP, and lists the various research activities currently under way in the program.

Relevance:

10.00%

Publisher:

Abstract:

The primary objective of the present study is to show that for the most common configuration of an impactor system, the accelerometer cannot exactly reproduce the dynamic response of a specimen subjected to impact loading. An equivalent Lumped Parameter Model (LPM) of the given impactor set-up has been formulated for assessing the accuracy of an accelerometer mounted in a drop-weight impactor set-up for an axially loaded specimen. A specimen under the impact loading is represented by a non-linear spring of varying stiffness, while the accelerometer is assumed to behave in a linear manner due to its high stiffness. Specimens made of steel, aluminium and fibre-reinforced composite (FRC) are used in the present study. Assuming the force-displacement response obtained in an actual impact test to be the true behaviour of the test specimen, a suitable numerical approach has been used to solve the governing non-linear differential equations of a three degrees-of-freedom (DOF) system in a piece-wise linear manner. The numerical solution of the governing differential equations following an explicit time integration scheme yields an excellent reproduction of the mechanical behaviour of the specimen, consequently confirming the accuracy of the numerical approach. However, the spring representing the accelerometer predicts a response that qualitatively matches the assumed force-displacement response of the test specimen with a perceptibly lower magnitude of load.
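The explicit time-integration scheme described above can be illustrated with a 1-DOF simplification: a drop mass striking a nonlinear spring whose stiffness is updated piecewise at each step. The spring law and parameters below are hypothetical, not the paper's:

```python
import numpy as np

def simulate_impact(mass, v0, stiffness, dt=1e-6, steps=20000):
    """Explicit (symplectic Euler) integration of a drop mass striking a
    nonlinear spring: a 1-DOF simplification of the 3-DOF model in the text.

    stiffness: callable k(x) giving the spring stiffness at compression x,
    so the restoring force is k(x) * x (piecewise-linear treatment).
    """
    x = np.zeros(steps)   # compression history, metres
    v = v0                # impact velocity, m/s
    for i in range(1, steps):
        f = -stiffness(x[i - 1]) * x[i - 1]   # restoring force on the impactor
        v = v + (f / mass) * dt               # update velocity, then position
        x[i] = x[i - 1] + v * dt
    return x

# hypothetical hardening spring: stiffness grows with compression
k = lambda x: 1.0e6 * (1.0 + 50.0 * abs(x))
disp = simulate_impact(mass=5.0, v0=2.0, stiffness=k)
print(round(disp.max() * 1000, 2), "mm peak compression")
```

The step size must resolve the highest natural frequency of the system; in the accelerometer problem of the text, the very stiff accelerometer spring is what drives the time step down in the full 3-DOF model.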