405 results for Dollar sunfish
Abstract:
The purpose of this research is to propose a procurement system that shares retrieved information across disciplines and with relevant parties so as to achieve better co-ordination between the supply and demand sides. This paper demonstrates how to analyze data with an agent-based procurement system (APS) in order to re-engineer and improve the existing procurement process. Intelligent agents take responsibility for searching for potential suppliers, negotiating with the short-listed suppliers, and evaluating supplier performance against the selection criteria using a mathematical model. Manufacturing firms and trading companies spend more than half of their sales dollar on the purchase of raw materials and components. Efficient, accurate data collection is one of the key success factors for quality procurement, that is, purchasing the right material at the right quality from the right suppliers. In general, enterprises spend a significant amount of resources on data collection and storage, but too little on facilitating data analysis and sharing. To validate the feasibility of the approach, a case study on a manufacturing small and medium-sized enterprise (SME) has been conducted. APS supports data and information analysis techniques that facilitate decision making, so that the agents can improve negotiation and supplier evaluation efficiency, saving time and cost.
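The abstract does not specify the paper's evaluation model; as an illustration only, the sketch below scores suppliers with a simple weighted sum over hypothetical criteria (price, quality, delivery), one common way agent-based supplier evaluation is formulated.

```python
# Illustrative sketch only: a weighted-sum supplier score over hypothetical
# criteria; the paper's actual mathematical model is not specified here.
criteria_weights = {"price": 0.4, "quality": 0.35, "delivery": 0.25}  # assumed weights

suppliers = {  # assumed normalized criterion scores in [0, 1]
    "supplier_a": {"price": 0.8, "quality": 0.7, "delivery": 0.9},
    "supplier_b": {"price": 0.6, "quality": 0.9, "delivery": 0.7},
}

def weighted_score(scores: dict, weights: dict) -> float:
    """Combine normalized criterion scores into a single supplier score."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank short-listed suppliers by their weighted score, best first.
ranking = sorted(suppliers,
                 key=lambda s: weighted_score(suppliers[s], criteria_weights),
                 reverse=True)
print(ranking)
```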
Abstract:
Financial prediction has attracted a lot of interest due to the financial implications that accurate prediction of financial markets can have. A variety of data-driven modelling approaches have been applied, but their performance has produced mixed results. In this study we apply both parametric (neural networks with active neurons) and nonparametric (analog complexing) self-organising modelling methods for daily prediction of the exchange rate market. We also propose a combined approach in which the parametric and nonparametric self-organising methods are applied sequentially, exploiting the advantages of the individual methods with the aim of improving their performance. The combined method is found to produce promising results and to outperform the individual methods when tested with two exchange rates: the American Dollar and the Deutsche Mark against the British Pound.
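As a rough illustration of the nonparametric "analog" idea (not the authors' exact analog-complexing algorithm), the sketch below forecasts the next value of a series by locating the historical window most similar to the most recent pattern and reusing its continuation; the series here is synthetic.

```python
import numpy as np

def analog_forecast(series: np.ndarray, window: int = 5) -> float:
    """Nearest-analog one-step forecast: find the past window most similar to
    the latest `window` observations and return the value that followed it.
    Illustrative sketch only; the paper's analog-complexing method is richer."""
    pattern = series[-window:]
    best_dist, best_next = np.inf, series[-1]
    # Search all complete historical windows that have a following observation.
    for start in range(len(series) - window):
        candidate = series[start:start + window]
        dist = np.linalg.norm(candidate - pattern)
        if dist < best_dist:
            best_dist, best_next = dist, series[start + window]
    return float(best_next)

rng = np.random.default_rng(0)
rates = np.cumsum(rng.normal(0, 0.01, 500)) + 1.5  # synthetic exchange-rate path
print(analog_forecast(rates))
```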
Abstract:
The purpose of this thesis is to shed more light on FX market microstructure by examining the determinants of the bid-ask spread for three currency pairs, the US dollar/Japanese yen, the British pound/US dollar and the Euro/US dollar, in different time zones. I examine commonality in liquidity using FX market microstructure variables in financial centres across the world (New York, London, Tokyo), based on the quotes of the three exchange rate pairs over a ten-year period. I use GARCH(1,1) specifications, the ICSS algorithm, and vector autoregression (VAR) analysis to examine the effect of trading activity, exchange rate volatility and inventory holding costs on both quoted and relative spreads. The ICSS algorithm results show that the intraday spread series are much less volatile than the intraday exchange rate series, as the number of change points obtained from the ICSS algorithm is considerably lower. GARCH(1,1) estimation results for daily and intraday bid-ask spreads show that the explanatory variables work better when higher-frequency (intraday) data are used; however, their explanatory power is significantly lower compared to the results based on the daily sample. This suggests that although daily spreads and intraday spreads have some common determinants, there are other factors that determine the behaviour of spreads at high frequencies. The VAR results show that there are some differences in the behaviour of the variables at high frequencies compared to the results from the daily sample. A shock in the number of quote revisions has more effect on the spread than its own shocks when short-term (intraday) trading intervals are considered. When longer (daily) trading intervals are considered, shocks in the spread have more effect on the future spread. In other words, trading activity is more informative about the future spread at the intraday horizon, while the past spread is more informative about the future spread at the daily horizon.
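For readers unfamiliar with the volatility model named above, the following is a minimal GARCH(1,1) estimation sketch using the Python `arch` package on a synthetic spread series; the thesis's actual specifications (data, exogenous regressors, sampling frequency) are not reproduced here.

```python
import numpy as np
from arch import arch_model  # pip install arch

# Synthetic stand-in for a daily spread-change series; the thesis uses
# actual FX quote data, which is not reproduced here.
rng = np.random.default_rng(42)
spread_changes = rng.normal(0, 1, 1000) * 0.05

# GARCH(1,1) with a constant mean:
#   sigma^2_t = omega + alpha * eps^2_{t-1} + beta * sigma^2_{t-1}
model = arch_model(spread_changes, mean="Constant", vol="GARCH", p=1, q=1)
result = model.fit(disp="off")
print(result.params)  # estimates of mu, omega, alpha[1], beta[1]
```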
Abstract:
Models for the conditional joint distribution of the U.S. Dollar/Japanese Yen and Euro/Japanese Yen exchange rates, from November 2001 until June 2007, are evaluated and compared. The conditional dependency is allowed to vary across time, as a function of either historical returns or a combination of past return data and option-implied dependence estimates. Using prices of currency options that are available in the public domain, risk-neutral dependency expectations are extracted through a copula representation of the bivariate risk-neutral density. For this purpose, we employ either the one-parameter "Normal" or a two-parameter "Gumbel Mixture" specification. The latter provides forward-looking information regarding the overall degree of covariation, as well as the level and direction of asymmetric dependence. Specifications that include option-based measures in their information set are found to outperform, in-sample and out-of-sample, models that rely solely on historical returns.
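As a hedged illustration of the copula idea (not the paper's option-implied procedure), the sketch below fits a one-parameter Gaussian ("Normal") copula to two return series by transforming each margin to uniform ranks and estimating the correlation of the corresponding normal scores; the return data are synthetic.

```python
import numpy as np
from scipy import stats

def fit_gaussian_copula(x: np.ndarray, y: np.ndarray) -> float:
    """Estimate the correlation parameter of a Gaussian copula from two samples.
    Margins are handled nonparametrically via empirical ranks; this is a
    textbook sketch, not the paper's option-implied estimation."""
    n = len(x)
    # Probability-integral transform to (0, 1) via empirical ranks.
    u = stats.rankdata(x) / (n + 1)
    v = stats.rankdata(y) / (n + 1)
    # Map to standard-normal scores and take their Pearson correlation.
    z_u, z_v = stats.norm.ppf(u), stats.norm.ppf(v)
    return float(np.corrcoef(z_u, z_v)[0, 1])

rng = np.random.default_rng(1)
usd_jpy = rng.normal(0, 0.006, 1500)                  # synthetic daily returns
eur_jpy = 0.6 * usd_jpy + rng.normal(0, 0.005, 1500)  # correlated by construction
print(fit_gaussian_copula(usd_jpy, eur_jpy))
```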
Abstract:
Rather than dealing with the immediate policy steps to dampen the crisis, this paper attempts to reveal the worsening savings/consumption pattern of the US economy over the last four decades in general and the last ten years in particular. Based on the closed logic of open-economy GDP accounting, it argues that the current crisis is deeply rooted in shrinking public and private savings trends discernible as early as 1997. The current mortgage-market crisis and deep fall in new residential housing are products of a distorted financial environment that encourages over-borrowing and over-consumption. Expansion of the credit cycle through successive financial innovations has increased, not decreased, output volatility. But the main foreign lenders to the US—Japan, China and Germany—have managed to offset their losses on US securities by buying into US companies. Large US firms have also benefited from rapid dollar depreciation, as USD-denominated yields on their foreign assets experienced strong run-ups. The weak dollar has also helped American firms with large assets on foreign markets. So there were strong benefits for the US, not just on the goods-export side but on the asset side, an aspect rarely emphasized.
Abstract:
Following the development of the first exchange rate target zone model at the end of the eighties, dozens of papers analyzed theoretical and empirical aspects of currency bands. This paper reviews different empirical methods for analyzing the credibility of the band and lays special emphasis on the most widely used method, the so-called drift-adjustment method. Papers applying that method claim that while forecasting a freely floating currency is hopeless, predicting an exchange rate within the future band is successful. This paper shows that the results achieved by applications to EMS and Nordic currencies are not specific to data of target zone currencies: for example, application to the freely floating US dollar, and even to most unit root processes, leads qualitatively to the same results. The paper explores the sources of this apparent contradiction and presents a model of target zones, built on the main observed features of such regimes, in which the exchange rate within the band is not necessarily predictable, since the process might follow chaotic dynamics before devaluation.
Abstract:
Press release from Florida International University's Office of Media Relations announcing North Dade Medical Foundation's $5 million donation to establish scholarships for medical students and endowed chairs at Florida International University's College of Medicine.
Abstract:
With the rapid growth of the Internet, computer attacks are increasing at a fast pace and can easily cause millions of dollars in damage to an organization. Detecting these attacks is an important issue in computer security. There are many types of attacks, and they fall into four main categories: Denial of Service (DoS) attacks, Probe, User to Root (U2R) attacks, and Remote to Local (R2L) attacks. Within these categories, DoS and Probe attacks appear with high frequency over short periods of time when they target a system. They differ from normal traffic data and can be easily separated from normal activities. In contrast, U2R and R2L attacks are embedded in the data portions of packets and normally involve only a single connection, making it difficult to achieve satisfactory detection accuracy for these two attack types. Therefore, we focus on the ambiguity problem between normal activities and U2R/R2L attacks. The goal is to build a detection system that can accurately and quickly detect these two attacks. In this dissertation, we design a two-phase intrusion detection approach. In the first phase, a correlation-based feature selection algorithm is proposed to increase the speed of detection. Features with poor ability to predict attack signatures, and features inter-correlated with one or more other features, are considered redundant. Such features are removed, and only indispensable information about the original feature space remains. In the second phase, we develop an ensemble intrusion detection system to achieve accurate detection performance. The proposed method includes multiple feature-selecting intrusion detectors and a data mining intrusion detector. The former consist of a set of detectors, each of which uses a fuzzy clustering technique and belief theory to resolve the ambiguity problem. The latter applies a data mining technique to automatically extract computer users' normal behavior from training network traffic data. The final decision is a combination of the outputs of the feature-selecting and data mining detectors. The experimental results indicate that our ensemble approach not only significantly reduces the detection time but also effectively detects U2R and R2L attacks that contain degrees of ambiguous information.
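The dissertation's exact algorithm is not given in the abstract; as a rough illustration of correlation-based filtering, the sketch below keeps features that correlate with the label and drops features strongly inter-correlated with an already-kept feature. The thresholds and the synthetic data are assumptions.

```python
import numpy as np

def correlation_filter(X: np.ndarray, y: np.ndarray,
                       min_label_corr: float = 0.1,
                       max_feature_corr: float = 0.9) -> list[int]:
    """Illustrative correlation-based feature selection (thresholds assumed).
    Keep a feature if it correlates with the label and is not highly
    correlated with any feature that has already been kept."""
    kept: list[int] = []
    for j in range(X.shape[1]):
        label_corr = abs(np.corrcoef(X[:, j], y)[0, 1])
        if label_corr < min_label_corr:
            continue  # poor predictor of the attack label: treat as redundant
        redundant = any(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > max_feature_corr
                        for k in kept)
        if not redundant:
            kept.append(j)
    return kept

rng = np.random.default_rng(7)
X = rng.normal(size=(500, 10))
X[:, 3] = X[:, 0] + rng.normal(0, 0.01, 500)   # nearly duplicates feature 0
y = (X[:, 0] + X[:, 5] > 0).astype(float)      # synthetic "attack" label
print(correlation_filter(X, y))                # feature 3 should be dropped
```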
Abstract:
It is projected that by 2020, there will be 138 million Americans over 45, the age at which an increased incidence of heart disease is documented. Many will require stents. This multi-billion dollar industry, with over 2 million patients worldwide, 15% of whom use Nitinol stents, has recently experienced a decline in sales, due in part to thrombosis, a sudden blood clot that forms inside stents. As a result, the Food and Drug Administration and the American Heart Association are calling for a new generation of stents, with new designs and different alloys that are more adaptable to the arteries. The future of Nitinol therefore depends on a better understanding of the mechanisms by which Nitinol surfaces can be rendered stable and inert. In this investigation, binary and ternary Nitinol alloys were prepared and subjected to various surface treatments such as electropolishing (EP), magnetoelectropolishing (MEP) and water boiling & passivation (W&P). In vitro corrosion tests were conducted on the Nitinol alloys in accordance with ASTM F 2129-08. The metal ions released into the electrolyte during corrosion tests were measured by Inductively Coupled Plasma Mass Spectrometry (ICP-MS). Biocompatibility was assessed by observing the growth of human umbilical vein endothelial cells (HUVEC) on the surface of the Nitinol alloys. Static and dynamic immersion tests were performed by immersing the Nitinol alloys in cell culture media and measuring the amount of metal ions released in solution. Sulforhodamine B (SRB) assays were performed to elucidate the effect of metal ions on the growth of HUVEC cells. The surfaces of the alloys were studied using Scanning Electron Microscopy (SEM) and X-ray Photoelectron Spectroscopy (XPS). Finally, wettability and surface energy were measured with a contact angle meter, whereas surface roughness was measured by Atomic Force Microscopy (AFM). All the surface-treated alloys exhibited higher resistance to corrosion than the untreated alloys. SRB assays revealed that Ni and Cu ions exhibited greater toxicity than Cr, Ta and Ti ions on HUVEC cells. EP and MEP alloys possessed relatively smooth surfaces, and some were composed of nickel oxides instead of elemental nickel, as determined by XPS. MEP-treated alloys exhibited the lowest surface energy and the lowest surface roughness.
Abstract:
Over the last century, the Everglades underwent a metaphorical and ecological transition from impenetrable swamp to endangered wetland. At the heart of this transformation lies the Florida sugar industry, which by the 1990s was at the center of the political storm over the multi-billion dollar ecological “restoration” of the Everglades. Raising Cane in the ’Glades is the first study to situate the environmental transformation of the Everglades within the economic and historical geography of global sugar production and trade. Using, among other sources, interviews, government and corporate documents, and recently declassified U.S. State Department memoranda, Gail M. Hollander demonstrates that the development of Florida’s sugar region was the outcome of pitched battles reaching the highest political offices in the U.S. and in countries around the world, especially Cuba—which emerges in her narrative as a model, a competitor, and the regional “other” to Florida’s “self.” Spanning the period from the age of empire to the era of globalization, the book shows how the “sugar question”—a label nineteenth-century economists coined for intense international debates on sugar production and trade—emerges repeatedly in new guises. Hollander uses the sugar question as a thread to stitch together past and present, local and global, in explaining Everglades transformation.
Abstract:
Developing scientifically credible tools for measuring the success of ecological restoration projects is a difficult and non-trivial task. Yet, reliable measures of the general health and ecological integrity of ecosystems are critical for assessing the success of restoration programs. The South Florida Ecosystem Restoration Task Force (Task Force), which helps coordinate a multi-billion dollar, multi-organizational effort between federal, state, local and tribal governments to restore the Florida Everglades, is using a small set of system-wide ecological indicators to assess the restoration efforts. A team of scientists and managers identified eleven ecological indicators from a field of several hundred through a selection process using 12 criteria to determine their applicability as part of a system-wide suite. The 12 criteria are: (1) Is the indicator relevant to the ecosystem? (2) Does it respond to variability at a scale that makes it applicable to the entire system? (3) Is the indicator feasible to implement and is it measurable? (4) Is the indicator sensitive to system drivers and is it predictable? (5) Is the indicator interpretable in a common language? (6) Are there situations where an optimistic trend with regard to an indicator might suggest a pessimistic restoration trend? (7) Are there situations where a pessimistic trend with regard to an indicator may be unrelated to restoration activities? (8) Is the indicator scientifically defensible? (9) Can clear, measurable targets be established for the indicator to allow for assessments of success? (10) Does the indicator have the specificity to be able to result in corrective action? (11) What level of ecosystem process or structure does the indicator address? (12) Does the indicator provide early warning signs of ecological change? In addition, a two-page stoplight report card was developed to assist in communicating the complex science inherent in ecological indicators in a common language for resource managers, policy makers and the public. The report card employs a universally understood stoplight symbol that uses green to indicate that targets are being met, yellow to indicate that targets have not been met and corrective action may be needed, and red to indicate that targets are far from being met and corrective action is required. This paper presents the scientific process and the results of the development and selection of the criteria, the indicators and the stoplight report card format and content. The detailed process and results for the individual indicators are presented in companion papers in this special issue of Ecological Indicators.
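As a purely illustrative sketch of the stoplight scheme described above (the indicator names, progress measure and thresholds are assumptions, not the Task Force's actual targets), the snippet below maps how far an indicator is from its target onto the green/yellow/red statuses.

```python
# Illustrative stoplight mapping; indicator names and thresholds are assumed,
# not taken from the Task Force's actual report card.
def stoplight(progress_toward_target: float) -> str:
    """Map the fraction of the restoration target achieved to a status color."""
    if progress_toward_target >= 0.9:
        return "green"   # targets are being met
    if progress_toward_target >= 0.5:
        return "yellow"  # targets not met; corrective action may be needed
    return "red"         # far from targets; corrective action is required

indicators = {"wading_birds": 0.95, "periphyton": 0.6, "invasive_exotics": 0.2}
for name, progress in indicators.items():
    print(f"{name}: {stoplight(progress)}")
```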
Abstract:
This paper examines the history of U.S. interventions in Latin America and attempts to explain their frequency by highlighting two factors – besides security and economic interests – that have made American interventions in Latin America so common. First, immense differences in size and influence between the United States and the states of Latin America have made interventions appear to be a low-risk solution to crises that threaten American interests in the region. Second, when U.S. government concerns and aspirations for Latin America converge with the general fears and aspirations of American foreign policy, interventions become much more likely. Such a convergence pushes Latin American issues high up the U.S. foreign policy agenda because of the region's proximity to the United States and the perception that the costs of intervening are low. This leads proponents of intervention to ask questions like "if we cannot stop communism/revolutions/drug-trafficking in Latin America, where can we stop it?" The article traces how these factors influenced the decision to intervene in Latin America during the era of Dollar Diplomacy and during the Cold War. It concludes with three possible scenarios that could lead to a reemergence of an American interventionist policy in Latin America, and argues that even though the United States has not intervened in Latin America during the past twenty-two years, it is far from clear that American interventions in Latin America will be consigned to the past.
Abstract:
Restaurant commissaries range the full spectrum from simple storage of food and supplies to multi-million-dollar processing plants. The author discusses the cost effectiveness of commissary units, including their operating costs, quality control, and scope.
Abstract:
The phenomenon of at-destination search activity and the decision processes utilized by visitors to a location is predominantly an academic unknown. As destinations and organizations increasingly compete for their share of the travel dollar, it is evident that more research needs to be done regarding how consumers obtain information once they arrive at a destination. This study examined visitor referral recommendations provided by hotel and non-hotel "locals" in a moderately sized community for lodging, food service, and recreational and entertainment venues.
Abstract:
In his study - File Control: The Heart Of Business Computer Management - William G. O'Brien, Assistant Professor, The School of Hospitality Management at Florida International University, initially informs you: "Even though computers are an everyday part of the hospitality industry, many managers lack the knowledge and experience to control and protect the files in these systems. The author offers guidelines which can minimize or prevent damage to the business as a whole." Our author opens this study with some anecdotal instances illustrating the failure of hospitality managers to exercise due caution with regard to computer-supported information systems inside their restaurants and hotels. "Of the three components that make up any business computer system (data files, programs, and hardware), it is files that are most important, perhaps irreplaceable, to the business," O'Brien informs you. O'Brien breaks down the noun, files, into two distinct categories: files of extrinsic value and their counterpart, files of intrinsic value. An example of extrinsic-value files would be a restaurant's wine inventory. "As sales are made and new shipments are received, the computer updates the file," says O'Brien. "This information might come directly from a point-of-sale terminal or might be entered manually by an employee," he further explains. On the intrinsic side of the equation, O'Brien wants you to know that the information itself is the valuable part of this type of file. Its value is over and above the file's informational purpose as a pragmatic business tool, as it is in inventory control. "The information is money in the legal sense. For instance, figures moved about in banking system computers do not represent dollars; they are dollars," O'Brien explains. "If the record of a dollar amount is erased from all computer files, then that money ceases to exist," he warns. This type of information can also be bought and sold, as with customer lists sold to advertisers. Files must be protected, O'Brien stresses. "File security requires a systematic approach," he discloses. O'Brien goes on to explain important elements to consider when evaluating file information. File back-up is also an important factor to think about, along with file storage and safety concerns. "Sooner or later, every property will have its fire, flood, careless mistake, or disgruntled employee," O'Brien closes. "…good file control can minimize or prevent damage to the business as a whole."