900 results for Commodity currencies
Abstract:
Very large spatially referenced datasets, for example those derived from satellite-based sensors that sample across the globe or from large monitoring networks of individual sensors, are becoming increasingly common and more widely available for use in environmental decision making. In large or dense sensor networks, huge quantities of data can be collected over short time periods. In many applications the generation of maps, or predictions at specific locations, from the data in (near) real time is crucial. Geostatistical operations such as interpolation are vital in this map-generation process, and in emergency situations the resulting predictions need to be available almost instantly, so that decision makers can make informed decisions and define risk and evacuation zones. It is also helpful in less time-critical applications, for example when interacting directly with the data for exploratory analysis, for the algorithms to be responsive within a reasonable time frame. Performing geostatistical analysis on such large spatial datasets can present a number of problems, particularly where maximum likelihood-based methods are used. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational complexity in terms of memory and speed scales quadratically and cubically, respectively. Most modern commodity hardware has at least two processor cores, if not more, and other mechanisms for parallel computation, such as Grid-based systems, are also becoming increasingly available. However, there currently seems to be little interest in exploiting this extra processing power within the context of geostatistics. In this paper we review the existing parallel approaches for geostatistics. By recognising that different natural parallelisms exist and can be exploited depending on whether the dataset is sparsely or densely sampled with respect to the range of variation, we introduce two contrasting novel implementations of parallel algorithms based on approximating the data likelihood, extending the methods of Vecchia [1988] and Tresp [2000]. Using parallel maximum likelihood variogram estimation and parallel prediction algorithms, we show that computational time can be significantly reduced. We demonstrate this with both sparsely and densely sampled data on a variety of architectures, ranging from the common dual-core processor found in many modern desktop computers to large multi-node supercomputers. To highlight the strengths and weaknesses of the different methods we employ synthetic datasets, and go on to show how the methods allow maximum likelihood-based inference on the exhaustive Walker Lake dataset.
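To make the quoted scaling concrete, the sketch below (not the paper's implementation; the exponential covariance model and parameter names are illustrative assumptions) evaluates the Gaussian log-likelihood that dominates maximum-likelihood geostatistics: the dense n × n covariance matrix costs O(n²) memory and its Cholesky factorisation O(n³) time.

```python
# Minimal sketch of the Gaussian log-likelihood cost in ML geostatistics.
# Covariance model and parameters are illustrative, not the paper's choices.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import cho_factor, cho_solve

def gp_log_likelihood(coords, y, sill=1.0, length_scale=10.0, nugget=1e-6):
    """Log-likelihood of a zero-mean Gaussian process with an
    exponential covariance over spatial coordinates."""
    # Dense covariance matrix: O(n^2) memory.
    K = sill * np.exp(-cdist(coords, coords) / length_scale)
    K[np.diag_indices_from(K)] += nugget
    # Cholesky factorisation: O(n^3) time -- the dominant cost.
    c, low = cho_factor(K, lower=True)
    alpha = cho_solve((c, low), y)
    log_det = 2.0 * np.sum(np.log(np.diag(c)))
    n = y.size
    return -0.5 * (y @ alpha + log_det + n * np.log(2.0 * np.pi))

# Synthetic example: the cost grows rapidly with n, motivating the
# parallel likelihood approximations discussed in the abstract.
rng = np.random.default_rng(0)
coords = rng.uniform(0.0, 100.0, size=(500, 2))
y = rng.standard_normal(500)
print(gp_log_likelihood(coords, y))
```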
Abstract:
The purpose of this thesis is to shed more light on FX market microstructure by examining the determinants of the bid-ask spread for three currency pairs – US dollar/Japanese yen, British pound/US dollar and euro/US dollar – in different time zones. I examine the commonality in liquidity using FX market microstructure variables in financial centres across the world (New York, London, Tokyo), based on the quotes of the three exchange rate pairs over a ten-year period. I use GARCH(1,1) specifications, the ICSS algorithm, and vector autoregression (VAR) analysis to examine the effect of trading activity, exchange rate volatility and inventory holding costs on both quoted and relative spreads. The ICSS algorithm results show that the intraday spread series are much less volatile than the intraday exchange rate series, as the number of change points obtained from the ICSS algorithm is considerably lower. GARCH(1,1) estimation results for daily and intraday bid-ask spreads show that the explanatory variables work better when higher-frequency (intraday) data are used; however, their explanatory power is significantly lower than in the results based on the daily sample. This suggests that although daily spreads and intraday spreads have some common determinants, other factors determine the behaviour of spreads at high frequencies. The VAR results show some differences in the behaviour of the variables at high frequencies compared with the daily sample. A shock to the number of quote revisions has more effect on the spread than the spread's own shocks when short (intraday) trading intervals are considered, whereas when longer (daily) intervals are considered, shocks to the spread have more effect on the future spread. In other words, trading activity is more informative about the future spread at the intraday horizon, while the past spread is more informative about the future spread at the daily horizon.
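For reference, the GARCH(1,1) specification named above models the conditional variance as σ²ₜ = ω + α·ε²ₜ₋₁ + β·σ²ₜ₋₁. The sketch below is a minimal illustration with made-up parameter values, not the thesis's estimates; it filters that variance for a residual series such as de-meaned spread changes.

```python
# Minimal GARCH(1,1) variance filter; omega/alpha/beta are illustrative
# values, not estimates from the thesis.
import numpy as np

def garch11_variance(eps, omega=0.1, alpha=0.05, beta=0.9):
    """Conditional variance: sigma2[t] = omega + alpha*eps[t-1]**2 + beta*sigma2[t-1]."""
    sigma2 = np.empty_like(eps)
    sigma2[0] = np.var(eps)  # a common initialisation choice
    for t in range(1, eps.size):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

eps = np.random.default_rng(1).standard_normal(1000)  # stand-in residuals
print(garch11_variance(eps)[:5])
```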
Abstract:
A systematic analysis is presented of the economic consequences of the abnormally high concentration of Zambia's exports on a commodity whose price is exceptionally unstable. Zambian macro-economic variables in the post-independence years are extensively documented, showing acute instability and decline, particularly after the energy price revolution and the collapse of copper prices. The relevance of stabilization policies designed to correct short-term disequilibrium is questioned. It is, therefore, a pathological case study of externally induced economic instability, complementing other studies in this area which use cross-country analysis of a few selected variables. After a survey of theory and issues pertaining to development, finance and stabilization, the emergence of domestic and foreign financial constraints on the Zambian economy is described. The world copper industry is surveyed and an examination of commodity and world trade prices concludes that copper showed the highest degree of price instability. Specific aspects of Zambia's economy identified for detailed analysis include: its unprofitable mining industry, external payments disequilibrium, a constrained government budget, potentially inflationary monetary growth, and external indebtedness. International comparisons are used extensively, but major copper exporters are subjected to closer scrutiny. An appraisal of policy options concludes the study.
Abstract:
The properties of statistical tests for hypotheses concerning the parameters of the multifractal model of asset returns (MMAR) are investigated, using Monte Carlo techniques. We show that, in the presence of multifractality, conventional tests of long memory tend to over-reject the null hypothesis of no long memory. Our test addresses this issue by jointly estimating long memory and multifractality. The estimation and test procedures are applied to exchange rate data for 12 currencies. Among the nested model specifications that are investigated, in 11 out of 12 cases, daily returns are most appropriately characterized by a variant of the MMAR that applies a multifractal time-deformation process to NIID returns. There is no evidence of long memory.
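For intuition about the compounding construction these tests target, the sketch below simulates MMAR-style returns: NIID Gaussian increments are evaluated in a multifractal trading time built from a random binomial cascade. This is a textbook-style illustration under assumed parameters (multiplier m0, cascade depth k), not the authors' estimation or test code.

```python
# Sketch of the MMAR compounding idea: Brownian increments in a
# multifractal trading time from a binomial cascade. Parameters are
# illustrative assumptions, not the paper's estimates.
import numpy as np

def binomial_cascade(k, m0=0.6, seed=0):
    """Multifractal mass on 2**k dyadic cells via a random binomial cascade."""
    rng = np.random.default_rng(seed)
    mass = np.array([1.0])
    for _ in range(k):
        left = np.where(rng.random(mass.size) < 0.5, m0, 1.0 - m0)
        children = np.empty(mass.size * 2)
        children[0::2] = mass * left          # left-child multiplier
        children[1::2] = mass * (1.0 - left)  # right child gets the rest
        mass = children
    return mass                               # sums to 1 by construction

k = 12                                        # 4096 intervals
theta = binomial_cascade(k)                   # trading-time increments
rng = np.random.default_rng(1)
# Increment variance equals the trading-time increment: B(theta(t)).
returns = np.sqrt(theta) * rng.standard_normal(2 ** k)
print(returns[:5])
```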
Abstract:
Academic researchers have followed closely the interest of companies in establishing industrial networks by studying aspects such as social interaction and contractual relationships. But what patterns underlie the emergence of industrial networks, and what support should research provide for practitioners? First, it appears that manufacturing is becoming a commodity rather than a unique capability, which accounts especially for low-technology approaches in downstream parts of the network, for example in assembly operations. Second, the increased tendency towards specialisation has forced other, upstream, parts of industrial networks to introduce advanced manufacturing technologies for niche markets. Third, the capital market for investments in capacity, and the trade in manufacturing as a commodity, dominate resource allocation to a larger extent than was previously the case. Fourth, there is a continuous move towards more loosely connected entities that make up manufacturing networks. Finally, in these networks, concepts for supply chain management should address collaboration and information technology that supports decentralised decision-making, in particular to address sustainable and green supply chains. More traditional concepts, such as the keiretsu and chaebol networks of some Asian economies, do not sufficiently support the demands now being placed on networks. Research should address these five fundamental challenges to prepare for the industrial networks of 2020 and beyond. © 2010 Springer-Verlag London.
Abstract:
Book review: Marketinggeschichte: die Genese einer modernen Sozialtechnik [Marketing history: The genesis of a modern social technique], edited by Hartmut Berghoff, Frankfurt/Main, Campus Verlag, 2007, 409 pp., illus., €30.00 (paperback), ISBN 978-3-593-38323-1. This edited volume is the result of a workshop at Göttingen University in 2006 and combines a number of different approaches to research into the history of marketing in Germany's economy and society. The majority of contributions loosely centre on the occurrence of a ‘marketing revolution’ in the 1970s, which ties in with interpretations of the Americanisation of German business. This revolution replaced the indigenous German idea of Absatzwirtschaft (the economics of sales) with the American-influenced idea of Marketing, which was less functionally oriented and more strategic, and which aimed to connect processes within the firm in order to allow a greater focus on the consumer. The entire volume is framed by Hartmut Berghoff's substantial and informative introduction, which introduces a number of actors and trends beyond the content of the volume. Throughout the various contributions, authors provide explanations of the timing and nature of marketing revolutions. Alexander Engel identifies an earlier revolution in the marketing of dyes, which underwent major change with the emergence of chemical dyes. While natural dyestuffs had been a commodity, with producers removed from consumers via a global network of traders, chemical dyes were products and were branded at an early stage. This was a fundamental change in the nature of production and sales. As Roman Rossfeld shows in his contribution on the Swiss chocolate industry (which focuses almost exclusively on Suchard), even companies that produced non-essential consumer goods which had always required some measure of labelling grappled for years with the need to develop fewer, higher-impact brands, as well as an efficient sales operation. A good example of the classical ‘marketing revolution’ of the 1970s is the German automobile industry. Ingo Köhler convincingly argues that the crisis facing German car manufacturers – the change from a seller's to a buyer's market, the appreciation of the German mark, which undermined exports, the oil crises coupled with higher inflation and greater frugality of consumers, and the emergence of new competitors – led companies to refocus from production to the demands of the consumer. While he highlights the role of Ford in responding most rapidly to these problems, he does not address whether the multinational was potentially transferring American knowledge to the German market. Similarly, Paul Erker illustrates that a marketing revolution in transport and logistics happened much later, because the market remained highly regulated until the 1980s. In their contributions, both Paul Erker and Uwe Spiekermann present comparisons of two different sectors or companies (the tyre manufacturer Continental and the logistics company Dachser, and agriculture and trade, respectively). In both cases, however, it remains unclear why these examples were chosen for comparison, as the two seem to have little in common and are not always used effectively to demonstrate differences. The weakest section of the book is that on the development of marketing as an academic discipline.
The attempt at sketching the phases in the evolution of marketing as an academic discipline by Ursula Hansen and Matthias Bode opens with an undergraduate-level explanation of the methodology of historical periodisation that seems extraneous. Considerably stronger is the section on the wider societal impact of marketing, in which Anja Kruke shows how the new techniques of opinion research were accepted by politics and business – surprisingly, more readily by politicians than by their commercial counterparts. In terms of contemporary personalities, Hans Domizlaff emerges as one fascinating figure of German marketing history, to whom several contributors refer and whose career at the German cigarette manufacturer Reemtsma is critically analysed by Tino Jacobs. Domizlaff was Germany's own ‘marketing guru’, whose successful campaigns led to the wide-ranging reception of his ideas about the nature of good branding and marketing. These are variously described as intuitive, elitist, and sachlich, a German concept of a sober, fact-based, ‘no frills’ approach. Domizlaff did not believe in market research. Rather, he saw the genius of the individual advertiser as key to intuitively ascertaining the people's moods, wishes, and desires. This seems to have made him peculiarly suited to the tastes of the German middle class, according to Thomas Mergel's contribution on the nature of political marketing in the republic. Especially in politics, any form of hard selling was severely frowned upon and considered to demean the citizen as incapable of making an informed choice, a mentality that he dates back to the traditions of nineteenth-century liberalism. Part of this disdain for ‘selling politics like toothpaste’ was also founded on the highly effective use of branding by the National Socialists, who identified their party through an increasingly standardised image of Adolf Hitler and the swastika. Alexander Schug builds on previous research that criticised the simplistic notion of Hitler's charisma as the only explanation of the party's popular success, and distances his approach from those who see it in terms of propaganda and demagogy. He argues that the NSDAP used the tools of advertising and branding precisely because it had to introduce its new ideology into a political marketplace dominated by more established parties. In this the party was undoubtedly successful, more so than it intended: as bakers sold swastika cookies and butchers formed Führer heads out of lard, the NSDAP sought to regain control over the now effectively iconic images that constituted its brand, which was in danger of being trivialised and devalued. Key to understanding the history of marketing in Germany is, on the one hand, the exchange of ideas with the United States and, on the other, the impact of National Socialist policies, and the question of whether they were a force of modernisation or retardation. The general argument in the volume appears to favour the latter explanation. In the 1930s some of the leading marketing experts emigrated to the USA, leaving German academia and business isolated. The aftermath of the Second World War left a country that needed to increase production to satisfy consumer demand, and there was little interest in advanced sales techniques. Although the Nazis were progressive in applying new marketing methods to their political campaigns, this retarded the adoption of sales techniques in politics for a long time.
Germany saw the development of idiosyncratic approaches by people like Domizlaff in the 1930s and 1940s, when it lost some leading thinkers, and only engaged with American marketing conceptions in the 1960s and 1970s, when consumers eventually became more important than producers.
Abstract:
Academia has followed companies' interest in establishing industrial networks by studying aspects such as social interaction and contractual relationships. But what patterns underlie the emergence of industrial networks, and what support should research provide for practitioners? Firstly, it seems that manufacturing is becoming a commodity rather than a unique capability, which accounts especially for low-technology approaches in downstream parts of the network, for example in assembly operations. Secondly, the increased tendency to specialize forces other parts of industrial networks to introduce advanced manufacturing technologies for niche markets. Thirdly, the capital market for investments in capacity and the trade in manufacturing as a commodity dominate resource allocation to a larger extent than before. Fourthly, there will be a continuous move toward more loosely connected entities forming manufacturing networks. More traditional concepts, like keiretsu and chaebol networks, do not sufficiently support this transition. Research should address these fundamental challenges to prepare for the industrial networks of 2020 and beyond.
Abstract:
Purpose – The purpose of this paper is to examine the effect of firm size and foreign operations on the exchange rate exposure of UK non-financial companies from January 1981 to December 2001. Design/methodology/approach – The impact of unexpected changes in exchange rates on firms’ stock returns is examined. In addition, movements in bilateral, equally weighted (EQW) and trade-weighted exchange rate indices are considered. The sample is classified according to firm size and the extent of firms’ foreign operations. In addition, structural changes in the relationship between exchange rate changes and individual firms’ stock returns are examined over three sub-periods: before the UK joined the exchange rate mechanism (pre-ERM), during ERM membership (in-ERM), and after departure from the ERM (post-ERM). Findings – The findings indicate that a higher percentage of UK firms are exposed to contemporaneous exchange rate changes than reported in previous studies. UK firms’ stock returns are more affected by changes in the EQW and US dollar/European currency unit exchange rates, and respond less significantly to the basket of 20 countries’ currencies relative to the UK pound. Exchange rate exposure is found to have a more significant impact on the stock returns of large firms than on those of small and medium-sized companies. The evidence is consistent across all specifications using different exchange rates. The results provide evidence that the proportion of significant foreign exchange rate exposure is higher for firms that generate a higher percentage of revenues from abroad. The sensitivities of firms’ stock returns to exchange rate fluctuations are most evident in the pre-ERM and post-ERM periods. Practical implications – This study provides important implications for public policymakers, financial managers and investors on how the common stock returns of various sectors react to exchange rate fluctuations. Originality/value – The empirical evidence supports the view that UK firms’ stock returns are affected by foreign exchange rate exposure.
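Findings of this kind typically rest on an augmented market-model regression in which a firm's stock return is regressed on the market return and the (unexpected) exchange rate change, with the coefficient on the latter read as the exposure. Below is a minimal sketch on synthetic data; the variable names and coefficients are purely illustrative, not the paper's dataset or estimates.

```python
# Sketch of a standard FX exposure regression on synthetic data:
# stock return on market return and unexpected exchange rate change.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 252
fx_change = rng.standard_normal(n) * 0.005   # unexpected FX change
market = rng.standard_normal(n) * 0.010      # market return
stock = 0.8 * market + 0.3 * fx_change + rng.standard_normal(n) * 0.010

X = sm.add_constant(np.column_stack([market, fx_change]))
res = sm.OLS(stock, X).fit()
print(res.params)  # the fx_change coefficient is the exposure estimate
```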
Abstract:
The first resonant-cavity time-division-multiplexed (TDM) fiber Bragg grating sensor interrogation system is reported. This novel design uses a pulsed semiconductor optical amplifier in a cyclic manner to function as the optical source, amplifier, and modulator. Compatible with a range of standard wavelength detection techniques, this optically gated TDM system allows interrogation of low-reflectivity "commodity" sensors spaced just 2 m apart, using a single active component. Results demonstrate an exceptional optical signal-to-noise ratio of 36 dB, a peak signal power of over +7 dBm, and no measurable crosstalk between sensors. Temperature tuning shows that the system is fully stable, with a highly linear response. © 2004 IEEE.
Abstract:
The quest for sustainable resources to meet the demands of a rapidly rising global population while mitigating the risks of rising CO2 emissions and associated climate change represents a grand challenge for humanity. Biomass offers the most readily implemented and low-cost solution for sustainable transportation fuels, and the only non-petroleum route to organic molecules for the manufacture of bulk, fine and speciality chemicals and polymers. To be considered truly sustainable, biomass must be derived from resources which do not compete with agricultural land use for food production, or compromise the environment (e.g. via deforestation). Potential feedstocks include waste lignocellulosic or oil-based materials derived from plant or aquatic sources, with the so-called biorefinery concept offering the co-production of biofuels, platform chemicals and energy, analogous to today's petroleum refineries, which deliver both high-volume/low-value (e.g. fuels and commodity chemicals) and low-volume/high-value (e.g. fine/speciality chemicals) products, thereby maximizing biomass valorization. This article addresses the challenges to catalytic biomass processing and highlights recent successes in the rational design of heterogeneous catalysts, facilitated by advances in nanotechnology and the synthesis of templated porous materials, as well as the use of tailored catalyst surfaces to generate bifunctional solid acid/base materials or to tune hydrophobicity.
Abstract:
The properties of statistical tests for hypotheses concerning the parameters of the multifractal model of asset returns (MMAR) are investigated, using Monte Carlo techniques. We show that, in the presence of multifractality, conventional tests of long memory tend to over-reject the null hypothesis of no long memory. Our test addresses this issue by jointly estimating long memory and multifractality. The estimation and test procedures are applied to exchange rate data for 12 currencies. In 11 cases, the exchange rate returns are accurately described by compounding a NIID series with a multifractal time-deformation process. There is no evidence of long memory.
Abstract:
Much has been written about the potential impact of the Lean and Agile paradigms on firms' supply chain performance. However, most existing studies mainly point out that Lean is for cost reduction, whereas Agility is for attaining flexibility. There are few empirical studies in the literature that examine how the Lean and Agile paradigms affect supply chain performance. This paper aims to address this gap by studying the influence of the Lean and Agile paradigms on the delivery performance of a single-commodity supply chain in the aerospace industry. Data were collected from four separate 'Rigid pipes' supply chains to study how manufacturing alignment affects delivery performance. Implications of the study for practitioners and academia are discussed and future research is outlined.
Abstract:
The ‘currency war’, as it has become known, has three aspects: 1) the inflexible pegs of undervalued currencies; 2) recent attempts by floating exchange-rate countries to resist currency appreciation; 3) quantitative easing. Europe should primarily be concerned about the first issue, which relates to the renewed debate about the international monetary system. The attempts of floating exchange-rate countries to resist currency appreciation are generally justified while China retains a peg. Quantitative easing cannot be deemed a ‘beggar-thy-neighbour’ policy as long as the Fed’s policy is geared towards price stability. Current US inflationary expectations are at historically low levels. Central banks should come to an agreement about the definition of price stability at a time of deflationary pressures. The euro’s exchange rate has not been greatly impacted by the recent currency war; the euro continues to be overvalued, but less than before.
Abstract:
The process of economic integration in the EU has been shaped by the well-known theorem of the impossible trinity. Accordingly, the European Monetary System was built upon a mix of a fixed exchange-rate regime and autonomous monetary policy, thereby constraining capital mobility. In launching the EMU project, the EU countries decided to fix national currencies irrevocably and maintain full capital mobility, in exchange for delegating their monetary policy upwards to a supranational level. The introduction of the euro area, however, simultaneously meant the denial of three elements: (1) exit, (2) bail-out, and (3) default. Nevertheless, the 2008–9 financial and economic crisis demonstrated mercilessly that these three pillars are incompatible with each other. The current debates on reshaping economic governance in the EU can thus be modelled by introducing the "impossible trinity of denial", weighing the benefits and the costs of each option.
Abstract:
If asked how to define money, most students of economics would, after some hesitation, start listing the functions of money, while those with a more practical or accounting bent might recall the bank balance sheet and – partly correctly – place money in it as a liability. It seems, however, as if we were still a little embarrassed at not finding the right definition, and it has been so for centuries. In the present study I give a brief overview of earlier theoretical findings on the essence of money and then compare some conclusions in monetary theory of two nineteenth-century economists – Karl Marx and Karl Menger – taking into account the fundamental differences between the economic theories they represented. Starting from the seemingly quite different premises of the now widely accepted subjective theory of value and the somewhat forgotten labour theory of value, the two thinkers came to rather similar results. For them, money is neither a simple means of payment nor a claim or liability, the way we would account for it now, but a commodity. They do not derive its origin from state laws on legal tender, but regard money as a phenomenon created by social consensus that stands above the law; it does not inherently embody a symbol but, as a special good, becomes capable of expressing a token of value.