964 results for Data exchange formats
Abstract:
Theoretical and empirical analysis of exchange rate target zones was one of the most popular topics of international economics in the early 1990s, following the development of the first target zone model at the end of the eighties. This paper reviews empirical methods for assessing the credibility of a currency band, with special emphasis on the most widely used one, the so-called drift-adjustment method. Authors applying that method argue that while forecasting a freely floating currency is hopeless, the exchange rate's position within the band can be predicted successfully. The paper shows that the results obtained by applying the method to EMS and Nordic currencies are not specific to target zone data: applying it to the freely floating US dollar, and indeed to most unit root processes, leads to qualitatively the same conclusions. The paper explores the sources of this apparent puzzle and presents a model, built on the main observed features of target zone regimes, in which the exchange rate within the band is not necessarily predictable, because its dynamics may be chaotic in the period preceding a devaluation.
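To make the logic of the drift-adjustment method concrete, a minimal sketch in generic Svensson-style notation follows (the symbols are standard textbook ones and are not taken from the paper itself):

```latex
% Drift-adjustment decomposition (generic notation, illustrative only).
% The log exchange rate s is the log central parity c plus the position within the band x:
\[
  s_t = c_t + x_t , \qquad x_t \in [\underline{x}, \overline{x}] .
\]
% Uncovered interest parity ties total expected depreciation over horizon \Delta
% to the interest rate differential:
\[
  \mathbb{E}_t\!\left[\Delta s_{t+\Delta}\right] \approx \left(i_t - i_t^{*}\right)\Delta .
\]
% The expected rate of realignment is the interest differential "adjusted" for the
% expected drift within the band, the latter estimated from a regression:
\[
  \mathbb{E}_t\!\left[\Delta c_{t+\Delta}\right]
    = \left(i_t - i_t^{*}\right)\Delta - \mathbb{E}_t\!\left[\Delta x_{t+\Delta}\right],
  \qquad
  x_{t+\Delta} - x_t = \alpha + \beta x_t + \varepsilon_{t+\Delta} .
\]
```

The claimed within-band forecastability rests on the fit of the last regression; the paper's point is that qualitatively similar estimates also emerge when the same regression is run on the floating US dollar and on most unit root processes.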
Abstract:
Price impact functions show the relative price change caused by an order of a given value. Knowledge of the price impact function helps market participants to predict the price impact of the orders they intend to submit, to estimate the extra trading cost arising from price changes, and to design optimal trading algorithms. The method developed by the authors allows market participants to determine a virtual price impact function simply and quickly, without knowledge of the full order book: the paper presents the relationship between the price impact function and liquidity measures, and shows how a price impact function can be estimated from the time series of the Budapest Liquidity Measure (BLM). The methodology is illustrated on the OTP share, estimating a virtual price impact function from the share's BLM data for the period 1 January 2007 to 3 June 2011. The empirical analysis examines the evolution of the price impact function over time and its basic statistical properties, giving a picture of the past behaviour of the transaction costs that arise from the lack of liquidity. The information obtained may, for instance, help traders in dynamic portfolio optimization.
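As a rough illustration of what mapping a liquidity measure into a virtual price impact curve can look like (this is not the authors' estimator; the halving of the round-trip cost and the power-law functional form are assumptions of the sketch, and the numbers are invented):

```python
import numpy as np

def virtual_price_impact(order_sizes_eur, blm_bps):
    """Fit a power-law virtual price impact curve I(q) = a * q**b to
    liquidity-measure data.

    order_sizes_eur : order values for which the liquidity measure is quoted
    blm_bps         : corresponding BLM values (round-trip cost, basis points)

    Illustrative assumption: one side of the trade bears roughly half of the
    round-trip cost, so the one-way relative price change is BLM/2 (in bps).
    """
    one_way = np.asarray(blm_bps) / 2.0 / 1e4          # relative price change
    b, log_a = np.polyfit(np.log(order_sizes_eur), np.log(one_way), 1)
    a = np.exp(log_a)
    return lambda q: a * np.asarray(q) ** b

# Hypothetical BLM snapshot: implied cost rises with order size
impact = virtual_price_impact([5_000, 20_000, 100_000, 500_000],
                              [12.0, 18.0, 35.0, 80.0])
print(impact(250_000))   # estimated relative price change of a 250k order
```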
Abstract:
The authors' basic assumption is that trust promotes commitment between partners in a supply chain and improves the chances of the supply chain operating successfully, whereas a lack of trust between the parties often increases transaction costs and so reduces efficiency. Data for the research were collected in several countries: France, Hungary, South Korea, Tunisia and the United States. A total of 729 valid questionnaires were returned, completed by participants of various supply chains. A structural equation model was used to extract information from the survey data. The results reveal that, in business relationships within supply chains, the constructs of transaction cost theory (asset specificity and behavioral uncertainty) and of social exchange theory (replaceability, perceived satisfaction, partner reputation and perceived conflict) are strongly related to the trust and commitment variables. Among the findings, the research indicates that a firm's trust and commitment in its supply chain partnerships are highly associated not only with transaction cost constructs (TCC) but even more so with social exchange constructs (SEC). The study may open a new research avenue by showing that there is another set of constructs, SEC, besides TCC, that influences the degree of trust and commitment.
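The abstract does not name the software used for the structural equation model; a hypothetical sketch of such a specification, written with the third-party semopy package and entirely invented questionnaire item names, might look like this:

```python
# Hypothetical illustration of a trust-commitment structural equation model of
# the kind described above (item names and construct list are invented; the
# study's actual specification and software are not stated in the abstract).
import pandas as pd
import semopy

MODEL_DESC = """
# measurement model: latent constructs from questionnaire items
Trust        =~ tr1 + tr2 + tr3
Commitment   =~ cm1 + cm2 + cm3
AssetSpec    =~ as1 + as2
Uncertainty  =~ un1 + un2
Satisfaction =~ sa1 + sa2
# structural model: TCC and SEC constructs explaining trust, trust driving commitment
Trust      ~ AssetSpec + Uncertainty + Satisfaction
Commitment ~ Trust
"""

def fit_sem(survey: pd.DataFrame) -> pd.DataFrame:
    model = semopy.Model(MODEL_DESC)
    model.fit(survey)          # survey: one row per returned questionnaire
    return model.inspect()     # path coefficients, standard errors, p-values

# fit_sem(pd.read_csv("supply_chain_survey.csv"))  # 729 responses in the study
```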
Abstract:
The global crisis of 2008 caused both a liquidity shortage and increasing insolvency in the banking system. The study focuses on credit default contagion in the Central and Eastern European (CEE) region, which originated in bank runs generated by non-performing loans granted to non-financial clients. Methodologically, the paper relies on a review of the literature on the one hand, and on a data survey with comparative and regression analysis on the other. To uncover credit default contagion, the research focuses on the combined impact of foreign exchange rates and foreign private indebtedness.
Abstract:
The concept of trust, first studied in social psychology and sociology, arises in most examinations of business relationships. In this article the author investigates which factors influence trust in business relationships. To this end, Hungarian organizations (primarily business enterprises) were surveyed using a quantitative method. The empirical research seeks to capture as many determinants of trust as possible: respondents answered the questionnaire with one of their existing relationships, with a customer or a supplier, in mind, covering, beyond trust, the partner's reputation, perceived satisfaction and conflict, information exchange, replaceability, and relationship-specific investments. The results show that no single variable is an exclusive determinant of trust; rather, the variables jointly shape its level.
Abstract:
This dissertation examines the behavior of the exchange rate under two different scenarios. The first is characterized by relatively low inflation, a situation where prices adjust sluggishly. The second is a high-inflation economy where prices respond very rapidly even to unanticipated shocks. In the first, following a monetary expansion, the exchange rate overshoots, i.e. the nominal exchange rate depreciates at a faster pace than the price level rises. Under high levels of inflation, prices change faster than the exchange rate, so the exchange rate undershoots its long-run equilibrium value. The standard work in this area, Dornbusch (1976), explains the overshooting process in the context of perfect capital mobility and sluggish adjustment in the goods market: a monetary expansion makes the exchange rate increase beyond its long-run equilibrium value. This dissertation expands on Dornbusch's model and analyzes the exchange rate under conditions of currency substitution and price flexibility, characteristics of the Peruvian economy during the hyperinflation that took place at the end of the 1980s. The results of the modified Dornbusch model reveal that, given a monetary expansion, the change in the price level will be larger than the change in the exchange rate if prices react more than proportionally to the monetary shock. Such an over-reaction is to be expected in circumstances of high inflation, when the velocity of money is increasing very rapidly. Increasing velocity of money gives rise to higher relative price variability, which in turn contributes to the appearance of new financial (and also non-financial) instruments that offer a higher return than foreign exchange, causing people to switch their demand for foreign exchange to these new assets. In a context of currency substitution, economic agents hoard and use foreign exchange as a store of value. The large decline in output caused by hyperinflation induces people to sell this hoarded money to finance current expenses, increasing the supply of foreign exchange in the market. Both the decrease in demand and the increase in supply reduce the price of foreign exchange, i.e. the real exchange rate. These findings are tested using Peruvian data for the period January 1985 to July 1990; the results of the econometric estimation confirm the findings of the theoretical model.
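For reference, the canonical Dornbusch mechanism that the dissertation builds on can be written compactly (standard textbook notation, all variables in logs; this is the textbook model, not the modified one developed here):

```latex
% Canonical Dornbusch (1976) setup.
\[
  m - p = \phi y - \lambda i \qquad \text{(money-market equilibrium)}
\]
\[
  i = i^{*} + \theta(\bar{s} - s) \qquad \text{(UIP with regressive expectations)}
\]
% With p fixed in the short run and the long-run rate \bar{s} moving one-for-one
% with m, the impact response of s to a monetary expansion is
\[
  \frac{ds}{dm} = 1 + \frac{1}{\lambda\theta} > 1 ,
\]
% i.e. the nominal exchange rate overshoots its long-run value; when prices are
% fully flexible and jump with the shock, the second term shrinks or vanishes
% and the exchange rate can undershoot instead, the case studied here.
```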
Abstract:
A two-phase, three-dimensional computational model of an intermediate-temperature (120–190°C) proton exchange membrane (PEM) fuel cell is presented. This represents the first attempt to model PEM fuel cells employing intermediate-temperature membranes, in this case phosphoric acid doped polybenzimidazole (PBI). To date, mathematical modeling of PEM fuel cells has been restricted to low-temperature operation, especially cells employing Nafion® membranes, while research on PBI as an intermediate-temperature membrane has been solely experimental. This work is therefore an advancement of the state of the art in both fields of research. With a growing trend toward higher-temperature operation of PEM fuel cells, mathematical modeling of such systems is needed to help hasten the development of the technology and to highlight areas where research should be focused. The model accounted for all the major transport and polarization processes occurring inside the fuel cell, including the two-phase phenomenon of gas dissolution in the polymer electrolyte. Results were presented for polarization performance, flux distributions, concentration variations in both the gaseous and aqueous phases, and temperature variations under various heat-management strategies. The model predictions matched published experimental data well and were self-consistent. The major finding of this research was that, owing to the transport limitations imposed by the use of phosphoric acid as a doping agent, namely the low solubility and diffusivity of dissolved gases and anion adsorption onto catalyst sites, catalyst utilization is very low (~1–2%). Significant cost savings were predicted with the use of advanced catalyst deposition techniques that would greatly reduce the thickness of the catalyst layer and thereby improve catalyst utilization. The model also predicted an increase in power output on the order of 50% if alternative doping agents to phosphoric acid can be found that afford better transport properties of dissolved gases and reduced anion adsorption onto catalyst sites, while maintaining stability and conductivity at elevated temperatures.
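Models of this kind are typically assembled from relations such as the following (a generic sketch for orientation only; these are not the specific governing equations of this work):

```latex
% Generic polarization and gas-dissolution relations used in PEM cell models.
\[
  V_{\mathrm{cell}} = E_{\mathrm{Nernst}} - \eta_{\mathrm{act}} - \eta_{\mathrm{ohm}} - \eta_{\mathrm{conc}} ,
\]
\[
  \eta_{\mathrm{act}} \approx \frac{RT}{\alpha F}\,\ln\frac{i}{i_0} , \qquad
  \eta_{\mathrm{ohm}} = i\,R_{\Omega} , \qquad
  \eta_{\mathrm{conc}} = \frac{RT}{nF}\,\ln\frac{i_L}{i_L - i} ,
\]
% with dissolution of reactant gas into the electrolyte described by a
% Henry's-law relation between partial pressure and dissolved concentration:
\[
  c_{\mathrm{dissolved}} = \frac{p_{\mathrm{gas}}}{H} .
\]
```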
Abstract:
Exchange rate economics has achieved substantial development in the past few decades. Despite extensive research, a large number of unresolved problems remain in the exchange rate debate. This dissertation studied three puzzling issues with the aim of improving our understanding of exchange rate behavior. Chapter Two used advanced econometric techniques to model and forecast exchange rate dynamics. Chapters Three and Four studied issues related to exchange rates using the theory of New Open Economy Macroeconomics. Chapter Two empirically examined the short-run forecastability of nominal exchange rates, analyzing important empirical regularities in daily exchange rates. Through a series of hypothesis tests, a best-fitting fractionally integrated GARCH model with a skewed Student-t error distribution was identified. The forecasting performance of the model was compared with that of a random walk model. The results support the contention that nominal exchange rates are unpredictable over the short run, in the sense that the best-fitting model cannot beat the random walk model in forecasting exchange rate movements. Chapter Three assessed the ability of dynamic general-equilibrium sticky-price monetary models to generate volatile foreign exchange risk premia. It developed a tractable two-country model in which agents face a cash-in-advance constraint and set prices to the local market, and the exogenous money supply process exhibits time-varying volatility. The model yielded approximate closed-form solutions for risk premia and real exchange rates. Numerical results provided quantitative evidence that volatile risk premia can arise endogenously in a new open economy macroeconomic model; the model thus has the potential to rationalize the Uncovered Interest Parity puzzle. Chapter Four sought to resolve the consumption-real exchange rate anomaly, which refers to the inability of most international macro models to generate the negative cross-correlations between real exchange rates and relative consumption across two countries that are observed in the data. While maintaining the assumption of complete asset markets, this chapter introduced endogenously segmented asset markets into a dynamic sticky-price monetary model. Simulation results showed that such a model can replicate the stylized fact that real exchange rates tend to move in the opposite direction to relative consumption.
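A hedged sketch of the Chapter Two type of exercise, fitting a FIGARCH model with skewed Student-t errors to daily returns and comparing its one-step-ahead mean forecasts with a random walk, could look like this with the third-party arch package (the dissertation does not state its software, and the data file name below is hypothetical):

```python
import numpy as np
import pandas as pd
from arch import arch_model

# Daily exchange rates (hypothetical file); work with percentage log returns.
rates = pd.read_csv("usd_daily.csv", index_col=0, parse_dates=True)["rate"]
returns = 100 * np.log(rates).diff().dropna()

train, test = returns[:-250], returns[-250:]
model = arch_model(train, mean="Constant", vol="FIGARCH", p=1, q=1, dist="skewt")
fitted = model.fit(disp="off")

# One-step-ahead mean forecast from the model vs. the random-walk benchmark
# (a random walk in the level implies a zero expected return).
model_forecast = fitted.forecast(horizon=1).mean.iloc[-1, 0]
rmse_model = np.sqrt(np.mean((test - model_forecast) ** 2))
rmse_rw = np.sqrt(np.mean(test ** 2))
print(f"model RMSE {rmse_model:.4f} vs random walk RMSE {rmse_rw:.4f}")
```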
Abstract:
The present investigation examined the relationships among personality (as conceptualized by the Big Five factors), leader-member exchange (LMX) quality, action control, organizational citizenship behaviors (OCB), and overall job performance (OJP). Two mediator variables were proposed and tested in this study: LMX and action control. Two hundred seven currently employed regular elementary school classroom teachers provided data during the 2000-2001 academic school year. Teachers provided personality, LMX quality (member or subordinate perspective), action control, job tenure, and demographic data. Nine school administrators (i.e., principals and assistant principals) were the source of supervisor ratings of OCB, OJP, and LMX quality (leader or supervisor perspective). In eight of the nine schools, teachers completed questionnaires during an after-school teacher gathering; at the remaining school, questionnaires were dropped off, distributed to teachers, and collected two weeks later. Results indicated a significant relationship between the OCB scale and overall supervisory ratings of OJP. The relationships between the Big Five factors of personality and OJP did not reach statistical significance, nor did the relationships between personality and OCB. The data indicated that none of the tenure variables (i.e., teaching tenure, school tenure, or time worked with the principal) moderated the personality-OCB or the personality-OJP relationship. Finally, a review of the correlations among the variables of interest precluded tests of the mediation of the personality-performance relationship by OCB, of the personality-OCB relationship by action control, and of the personality-OCB relationship by LMX. In conclusion, personality was not significantly correlated with supervisory ratings of OJP or of overall OCB, and LMX quality and action control did not mediate the personality-OJP or personality-OCB relationships. Significant relationships were found between disengagement and overall LMX quality and between initiative and overall LMX quality (both from the LMX-teacher perspective), as well as between personality variables and both the disengagement and initiative action control variables. Despite the limitations inherent in this study, these latter findings suggest "lessons" for teachers and school administrators alike.
Abstract:
The primary goal of this dissertation is the study of patterns of viral evolution inferred from serially-sampled sequence data, i.e., sequence data obtained from strains isolated at consecutive time points from a single patient or host. RNA viral populations have extremely high genetic variability, largely due to their astronomical population sizes within host systems, high replication rates, and short generation times. It is this aspect of their evolution that demands special attention and a different approach when studying the evolutionary relationships of serially-sampled sequence data. Methods that analyze serially-sampled data were first developed shortly after a groundbreaking HIV-1 study of several patients from whom viruses were isolated at recurring intervals over a period of 10 or more years. These methods assume a tree-like evolutionary model, while many RNA viruses have the capacity to exchange genetic material with one another through a process called recombination. A genealogy involving recombination is best described by a network structure. A more general approach was implemented in a new computational tool, Sliding MinPD, one that is mindful of the sampling times of the input sequences and that reconstructs the viral evolutionary relationships in the form of a network structure with implicit representations of recombination events. The underlying network organization reveals unique patterns of viral evolution and could help explain the emergence of disease-associated mutants and drug-resistant strains, with implications for patient prognosis and treatment strategies. Comprehensive testing of the developed methods and comparison studies with other methods require synthetic data sets; therefore, appropriate sequence generators were also developed to simulate the evolution of serially-sampled recombinant viruses, new and more thorough evaluation criteria for recombination detection methods were established, and three major comparison studies were performed. The newly developed tools were also applied to real HIV-1 sequence data, and it was shown that the results, represented within an evolutionary network structure, can be interpreted in biologically meaningful ways.
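The core minimum-pairwise-distance idea can be sketched in a few lines (a deliberately simplified illustration; the published Sliding MinPD tool uses corrected distance measures, bootstrap support, and breakpoint handling not reproduced here, and the function and variable names are ours):

```python
# Simplified sketch: for each window of the alignment, link a later-sampled
# sequence to its closest sequence from earlier sampling times. A change of
# nearest ancestor between windows hints at a possible recombination breakpoint.
def p_distance(a: str, b: str) -> float:
    """Proportion of differing sites between two aligned, equal-length sequences."""
    diffs = sum(x != y for x, y in zip(a, b))
    return diffs / len(a)

def sliding_ancestors(query: str, earlier: dict, window: int = 200, step: int = 50):
    """Return (window_start, closest_earlier_sequence_name) pairs for `query`.

    earlier: mapping from sequence name to aligned sequence, restricted to
    samples taken at earlier time points than the query.
    """
    hits = []
    for start in range(0, len(query) - window + 1, step):
        dists = {name: p_distance(query[start:start + window],
                                  seq[start:start + window])
                 for name, seq in earlier.items()}
        hits.append((start, min(dists, key=dists.get)))
    return hits

# earlier = {"t1_strainA": "ACGT...", "t1_strainB": "ACGA..."}  # prior time points
# print(sliding_ancestors("ACGT...", earlier))
```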
Abstract:
The two-photon exchange phenomenon is believed to be responsible for the discrepancy between the Rosenbluth and polarization transfer measurements of the ratio of the proton electric and magnetic form factors. This disagreement is about a factor of three at Q² of 5.6 GeV². Precise knowledge of the proton form factors is of critical importance in understanding the structure of this nucleon. The theoretical models that estimate the size of the two-photon exchange (TPE) radiative correction are poorly constrained. This factor can be measured directly by taking the ratio of the electron-proton and positron-proton elastic scattering cross sections, as the TPE effect changes sign with the charge of the incident particle. A test run of a modified beamline was conducted with the CEBAF Large Acceptance Spectrometer (CLAS) at the Thomas Jefferson National Accelerator Facility. This test run demonstrated the feasibility of producing a mixed electron/positron beam of good quality. Extensive simulations performed prior to the run were used to reduce the background rate that limits the production luminosity. A 3.3 GeV primary electron beam was used, resulting in an average secondary lepton beam energy of 1 GeV. As a result, elastic scattering data for both lepton types were obtained at scattering angles up to 40 degrees for Q² up to 1.5 GeV². The cross-section ratio displayed an ε dependence that varied with Q² at the lower Q² values. The magnitude of the average ratio as a function of ε was consistent with previous measurements and with the elastic (Blunden) model to within the experimental uncertainties. Ultimately, higher luminosity is needed to extend the data range to lower ε, where the TPE effect is predicted to be largest.
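The charge asymmetry the measurement exploits is commonly summarized as follows (sign conventions vary between papers; here R is taken as the positron-to-electron ratio and charge-even radiative corrections are assumed to have been applied):

```latex
% Leading-order effect of two-photon exchange on the two lepton charges:
\[
  \sigma_{e^{\pm} p} \simeq \sigma_{1\gamma}\left(1 \mp \delta_{2\gamma}(\varepsilon, Q^{2})\right) ,
\]
\[
  R(\varepsilon, Q^{2}) = \frac{\sigma_{e^{+} p}}{\sigma_{e^{-} p}}
    \simeq 1 - 2\,\delta_{2\gamma}(\varepsilon, Q^{2}) ,
\]
% so the deviation of R from unity isolates the TPE contribution.
```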
Abstract:
Background: Biologists often need to assess whether unfamiliar datasets warrant the time investment required for more detailed exploration. Basing such assessments on brief descriptions provided by data publishers is unwieldy for large datasets that contain insights dependent on specific scientific questions. Alternatively, using complex software systems for a preliminary analysis may be deemed too time consuming in itself, especially for unfamiliar data types and formats. This may lead to wasted analysis time and the discarding of potentially useful data. Results: We present an exploration of design opportunities that the Google Maps interface offers to biomedical data visualization. In particular, we focus on synergies between visualization techniques and Google Maps that facilitate the development of biological visualizations which have both low overhead and sufficient expressivity to support the exploration of data at multiple scales. The methods we explore rely on displaying pre-rendered visualizations of biological data in browsers, with sparse yet powerful interactions, by using the Google Maps API. We structure our discussion around five visualizations: a gene co-regulation visualization, a heatmap viewer, a genome browser, a protein interaction network, and a planar visualization of white matter in the brain. Feedback from collaborative work with domain experts suggests that our Google Maps visualizations offer multiple, scale-dependent perspectives and can be particularly helpful for unfamiliar datasets due to their accessibility. We also find that users, particularly those less experienced with computer use, are attracted by the familiarity of the Google Maps interface. Our five implementations introduce design elements that can benefit visualization developers. Conclusions: We describe a low-overhead approach that lets biologists access readily analyzed views of unfamiliar scientific datasets. We rely on pre-computed visualizations prepared by data experts, accompanied by sparse and intuitive interactions, and distributed via the familiar Google Maps framework. Our contributions are an evaluation demonstrating the validity and opportunities of this approach, a set of design guidelines benefiting those wanting to create such visualizations, and five concrete example visualizations.
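One concrete ingredient of this approach is cutting a large pre-rendered visualization into 256x256 tiles at several zoom levels so that the Google Maps API can display it as a custom map type. A minimal sketch of that preparation step follows (file names and directory layout are illustrative, not taken from the paper):

```python
import os
from PIL import Image

TILE = 256  # Google Maps tile size in pixels

def build_tile_pyramid(image_path: str, out_dir: str, max_zoom: int = 4) -> None:
    """Cut a pre-rendered visualization into z/x/y tiles for a custom map type."""
    base = Image.open(image_path)
    for z in range(max_zoom + 1):
        side = TILE * (2 ** z)                       # "world" size in pixels at zoom z
        level = base.resize((side, side))
        for x in range(2 ** z):
            for y in range(2 ** z):
                tile = level.crop((x * TILE, y * TILE, (x + 1) * TILE, (y + 1) * TILE))
                path = os.path.join(out_dir, f"{z}", f"{x}")
                os.makedirs(path, exist_ok=True)
                tile.save(os.path.join(path, f"{y}.png"))

# build_tile_pyramid("heatmap_render.png", "tiles")
# A Maps ImageMapType can then fetch tiles from tiles/{z}/{x}/{y}.png.
```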
Abstract:
This study evaluated three menu nutrition labeling formats: calorie-only information, a healthy symbol, and a nutrient list. Daily sales data for a table-service restaurant located on a university campus were recorded during a four-week period from January to February 2013 to examine changes in the average nutritional content of the entrées purchased by customers when different nutrition labels were provided. A survey was conducted to assess the customers' use of nutrition labels, their preferences among the three labeling formats, their entrée selections, their cognitive beliefs with regard to healthy eating, and their demographic characteristics. A total of 173 questionnaires were returned and included in the data analysis. Analysis of variance (ANOVA) and regression analyses were performed using SAS. The results showed that favorable attitudes toward healthy eating and the use of nutrition labels were both significantly associated with healthier entrée selections. Age and diet status had some effect on respondents' use of nutrition labels. The calorie-only format was the most effective in reducing the calories contained in the entrées sold, and the nutrient list was the most effective in reducing their fat and saturated fat content. The healthy symbol was the least effective format but, interestingly, the one most preferred by respondents. The findings provide support for future research and offer implications for policy makers, public health professionals, and foodservice operations.
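The study ran its tests in SAS; purely as an illustration of the kind of comparison reported, a one-way ANOVA of calories by label format might be written as follows in Python (file and column names are invented for the sketch):

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# One row per entree sold: label_format, calories, fat_g, sat_fat_g (hypothetical layout)
sales = pd.read_csv("entree_sales.csv")

# Does the mean calorie content of entrees sold differ across the three label formats?
model = ols("calories ~ C(label_format)", data=sales).fit()
print(sm.stats.anova_lm(model, typ=2))
```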
Abstract:
The electromagnetic form factors are the most fundamental observables that encode information about the internal structure of the nucleon. The electric (GE) and magnetic (GM) form factors contain information about the spatial distribution of charge and magnetization inside the nucleon. A significant discrepancy exists between the Rosenbluth and polarization transfer measurements of the electromagnetic form factors of the proton. One possible explanation for the discrepancy is the contribution of two-photon exchange (TPE) effects. Theoretical calculations estimating the magnitude of the TPE effect are highly model dependent, and only limited experimental evidence for such effects exists. Experimentally, the TPE effect can be measured by comparing the positron-proton and electron-proton elastic scattering cross sections [R = σ(e+p)/σ(e−p)]. The ratio R was measured over a wide range of kinematics, utilizing a 5.6 GeV primary electron beam produced by the Continuous Electron Beam Accelerator Facility (CEBAF) at Jefferson Lab. This dissertation explored the dependence of R on kinematic variables such as the squared four-momentum transfer (Q²) and the virtual photon polarization parameter (ε). A mixed electron-positron beam was produced from the primary electron beam in experimental Hall B. The mixed beam was scattered from a liquid hydrogen (LH2) target. Both the scattered lepton and the recoil proton were detected by the CEBAF Large Acceptance Spectrometer (CLAS), and elastic events were identified using elastic scattering kinematics. This work extracted the Q² dependence of R at high ε (ε > 0.8) and the ε dependence of R at ⟨Q²⟩ ≈ 0.85 GeV². In these kinematics, our data confirm the validity of the hadronic calculations of the TPE effect by Blunden, Melnitchouk, and Tjon. This hadronic TPE effect, with additional corrections contributed by higher excitations of the intermediate-state nucleon, largely reconciles the Rosenbluth and polarization transfer measurements of the electromagnetic form factors.
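For orientation, the one-photon-exchange (Rosenbluth) form of the reduced cross section that both extraction methods rest on is (standard notation):

```latex
% One-photon-exchange reduced cross section for elastic ep scattering:
\[
  \sigma_{R}(Q^{2}, \varepsilon) = \tau\,G_{M}^{2}(Q^{2}) + \varepsilon\,G_{E}^{2}(Q^{2}) ,
  \qquad \tau = \frac{Q^{2}}{4 M_{p}^{2}} .
\]
% Rosenbluth separation reads G_E and G_M off the linear epsilon-dependence of
% \sigma_R at fixed Q^2, while polarization transfer measures G_E/G_M directly;
% an unaccounted two-photon-exchange term with its own epsilon-dependence can
% therefore pull the two extractions apart.
```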