940 results for Exchange algorithm


Relevance:

30.00%

Abstract:

We study the equilibrium states of energy functions involving a large set of real variables, defined on the links of sparsely connected networks, and interacting at the network nodes, using the cavity and replica methods. When applied to the representative problem of network resource allocation, an efficient distributed algorithm is devised, with simulations showing full agreement with theory. Scaling properties with the network connectivity and the resource availability are found. © 2006 The American Physical Society.
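
A minimal illustration of distributed resource allocation on a sparse network is sketched below. It uses simple dual decomposition with quadratic link costs, not the cavity/replica message-passing derivation of the paper; the graph, capacities and step size are all assumed.

# Minimal sketch of distributed resource allocation on a sparse graph via
# dual decomposition (NOT the cavity/message-passing algorithm of the paper;
# quadratic link costs and the step size eta are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(0)
n = 20
# Ring plus a few random chords, so the network is sparse but connected.
edges = [(i, (i + 1) % n) for i in range(n)]
edges += [tuple(rng.choice(n, size=2, replace=False)) for _ in range(10)]

capacity = rng.normal(0.0, 1.0, n)
capacity -= capacity.mean() - 0.2          # ensure a small overall surplus (feasibility)

# Incidence matrix: flow y_e on edge (i, j) adds resource to node i, removes it from j.
B = np.zeros((n, len(edges)))
for e, (i, j) in enumerate(edges):
    B[i, e], B[j, e] = 1.0, -1.0

mu, eta = np.zeros(n), 0.05                # node "prices" and step size (assumed)
for _ in range(3000):
    y = B.T @ mu                           # local rule: y_e = mu_i - mu_j
    slack = B @ y + capacity               # resource remaining at each node
    mu = np.maximum(0.0, mu - eta * slack) # raise the price where resource is short

print("worst violation:", slack.min(), " transport cost:", 0.5 * (y ** 2).sum())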

Relevance:

30.00%

Abstract:

In January 2001 Greece joined the eurozone. The aim of this article is to examine whether an intention to join the eurozone had any impact on exchange rate volatility. We apply the Iterated Cumulative Sums of Squares (ICSS) algorithm of Inclan and Tiao (1994) to a set of Greek drachma exchange rate changes. We find evidence to suggest that the unconditional volatility of the drachma exchange rate against the dollar, British pound, yen, German mark and ECU/Euro was nonstationary, exhibiting a large number of volatility changes prior to European Monetary Union (EMU) membership. We then use a news archive service to identify the events that might have caused exchange rate volatility to shift. We find that devaluation of the drachma increased exchange rate volatility, but ERM membership and a commitment to joining the eurozone led to lower volatility. Our findings therefore suggest that a strong commitment to joining the eurozone may be sufficient to reduce some exchange rate volatility, which has implications for countries intending to join the eurozone in the future.
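
The core of the Inclan-Tiao statistic is easy to sketch. The recursion below applies the centred cumulative-sum-of-squares test and splits the series at significant break points; the full published algorithm adds a refinement pass over neighbouring breakpoints, and the simulated return series is purely illustrative.

# Sketch of the core centred cumulative-sum-of-squares statistic behind the
# ICSS algorithm of Inclan and Tiao (1994).  The full procedure iterates this
# test over sub-segments and refines neighbouring breakpoints; 1.358 is the
# asymptotic 95% critical value for sqrt(T/2) * max|D_k|.
import numpy as np

def find_breaks(returns, crit=1.358):
    """Recursively locate variance change points in a (mean-zero) return series."""
    breaks = []

    def split(lo, hi):                       # work on returns[lo:hi]
        a = returns[lo:hi]
        T = len(a)
        if T < 20:                           # too short to test (assumed minimum)
            return
        C = np.cumsum(a ** 2)                # C_k
        k = np.arange(1, T + 1)
        D = C / C[-1] - k / T                # D_k = C_k / C_T - k / T
        stat = np.sqrt(T / 2.0) * np.abs(D)
        kstar = int(stat.argmax())
        if stat[kstar] > crit:
            breaks.append(lo + kstar + 1)    # first observation of the new regime
            split(lo, lo + kstar + 1)        # re-test each side
            split(lo + kstar + 1, hi)

    split(0, len(returns))
    return sorted(breaks)

# Example: variance roughly doubles halfway through a simulated return series.
rng = np.random.default_rng(1)
r = np.concatenate([rng.normal(0, 0.005, 500), rng.normal(0, 0.012, 500)])
print(find_breaks(r))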

Relevance:

30.00%

Abstract:

The purpose of this thesis is to shed more light on FX market microstructure by examining the determinants of the bid-ask spread for three currency pairs, the US dollar/Japanese yen, the British pound/US dollar and the euro/US dollar, in different time zones. I examine the commonality in liquidity using FX market microstructure variables from financial centres across the world (New York, London, Tokyo), based on the quotes of the three exchange rate pairs over a ten-year period. I use GARCH(1,1) specifications, the ICSS algorithm and vector autoregression (VAR) analysis to examine the effect of trading activity, exchange rate volatility and inventory holding costs on both quoted and relative spreads. The ICSS results show that the intraday spread series are much less volatile than the intraday exchange rate series, as the number of change points obtained from the ICSS algorithm is considerably lower. GARCH(1,1) estimates for daily and intraday bid-ask spreads show that the explanatory variables work better with higher-frequency (intraday) data; however, their explanatory power is significantly lower than in the daily sample. This suggests that although daily and intraday spreads share some common determinants, other factors determine the behaviour of spreads at high frequencies. The VAR results show that the variables behave somewhat differently at high frequencies than in the daily sample. A shock to the number of quote revisions has more effect on the spread than the spread's own shocks when short (intraday) trading intervals are considered, whereas over longer (daily) intervals shocks to the spread have more effect on the future spread. In other words, trading activity is more informative about the future spread at the intraday horizon, while the past spread is more informative about the future spread at the daily horizon.
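
As one concrete piece of the toolkit, a GARCH(1,1) fit can be sketched as follows. The variance recursion and Gaussian likelihood are standard; the simulated input series and the optimiser settings are assumptions, not the thesis specification.

# Hedged sketch of a GARCH(1,1) fit of the kind applied to the spread series.
# The variance recursion is standard; the simulated input and optimiser
# settings are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

def garch11_negloglik(params, x):
    omega, alpha, beta = params
    T = len(x)
    var = np.empty(T)
    var[0] = x.var()                          # initialise with the sample variance
    for t in range(1, T):
        var[t] = omega + alpha * x[t - 1] ** 2 + beta * var[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * var) + x ** 2 / var)

rng = np.random.default_rng(2)
x = rng.standard_normal(1500) * 0.01          # stand-in for demeaned spread changes

res = minimize(garch11_negloglik, x0=[1e-6, 0.05, 0.90], args=(x,),
               bounds=[(1e-12, None), (0.0, 1.0), (0.0, 1.0)],
               method="L-BFGS-B")
omega, alpha, beta = res.x
print(f"omega={omega:.2e}  alpha={alpha:.3f}  beta={beta:.3f}  persistence={alpha + beta:.3f}")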

Relevance:

30.00%

Abstract:

The aim of this paper is to examine the short-term dynamics of foreign exchange rate spreads. Using a vector autoregressive (VAR) model we show that most of the variation in the spread comes from long-run dependencies between past and future spreads rather than being caused by changes in inventory, adverse selection, cost of carry or order processing costs. We apply the Iterated Cumulative Sums of Squares (ICSS) algorithm of Inclan and Tiao (1994) to discover how often spread volatility changes. We find that spread volatility shifts are relatively uncommon and that shifts in one currency spread tend not to spill over to other currency spreads. © 2013.
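
A minimal VAR sketch of this kind of decomposition, using statsmodels, is shown below; the column names, lag selection and simulated data are illustrative assumptions rather than the paper's specification.

# Minimal VAR sketch in the spirit of the paper's decomposition of spread
# variation: the variable names, lag choice and simulated data are assumptions.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(3)
T = 500
spread = np.zeros(T)
quotes = rng.poisson(50, T).astype(float)     # stand-in for quote-revision counts
for t in range(1, T):                         # spread depends on its own past and on activity
    spread[t] = 0.6 * spread[t - 1] + 0.002 * quotes[t - 1] + rng.normal(0, 0.05)

data = pd.DataFrame({"spread": spread, "quote_revisions": quotes})
res = VAR(data).fit(maxlags=5, ic="aic")      # lag order chosen by AIC (assumed)

fevd = res.fevd(10)                           # share of spread variance due to each shock
fevd.summary()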

Relevance:

30.00%

Abstract:

Supply chain formation (SCF) is the process of determining the set of participants and exchange relationships within a network with the goal of setting up a supply chain that meets some predefined social objective. Many proposed solutions for the SCF problem rely on centralized computation, which presents a single point of failure and can also lead to problems with scalability. Decentralized techniques that aid supply chain emergence offer a more robust and scalable approach by allowing participants to deliberate between themselves about the structure of the optimal supply chain. Current decentralized supply chain emergence mechanisms are only able to deal with simplistic scenarios in which goods are produced and traded in single units only and without taking into account production capacities or input-output ratios other than 1:1. In this paper, we demonstrate the performance of a graphical inference technique, max-sum loopy belief propagation (LBP), in a complex multiunit supply chain emergence scenario which models additional constraints such as production capacities and input-to-output ratios. We also provide results demonstrating the performance of LBP in dynamic environments, where the properties and composition of participants are altered as the algorithm is running. Our results suggest that max-sum LBP produces consistently strong solutions on a variety of network structures in a multiunit problem scenario, and that performance tends not to be affected by on-the-fly changes to the properties or composition of participants.
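
The max-sum machinery itself can be illustrated on a toy factor graph. The sketch below passes max-sum messages between variables and factors for a two-participant producer/consumer example; the utilities, domains and iteration count are assumptions, and because this tiny graph is a tree the answer is exact rather than approximate.

# Generic max-sum message passing on a discrete factor graph -- a toy version of
# the LBP machinery the paper applies to supply chain emergence.  The tiny
# producer/consumer example and its utilities are illustrative assumptions.
import itertools
import numpy as np

variables = {"units_produced": range(3), "units_bought": range(3)}
factors = {
    "producer_cost": (["units_produced"], lambda p: -2.0 * p),           # unit cost 2
    "consumer_value": (["units_bought"], lambda b: 3.0 * b - 0.5 * b * b),
    "market": (["units_produced", "units_bought"],
               lambda p, b: 0.0 if p == b else -1e9),                    # goods must balance
}

msg = {}                                       # messages indexed by (sender, receiver)
for fname, (scope, _) in factors.items():
    for v in scope:
        msg[(v, fname)] = np.zeros(len(variables[v]))
        msg[(fname, v)] = np.zeros(len(variables[v]))

for _ in range(10):
    # variable -> factor: sum of incoming messages from the other factors
    for fname, (scope, _) in factors.items():
        for v in scope:
            others = [g for g, (sc, _) in factors.items() if v in sc and g != fname]
            msg[(v, fname)] = sum((msg[(g, v)] for g in others),
                                  np.zeros(len(variables[v])))
    # factor -> variable: maximise over the other variables in the factor's scope
    for fname, (scope, table) in factors.items():
        for v in scope:
            out = np.full(len(variables[v]), -np.inf)
            for assign in itertools.product(*(variables[u] for u in scope)):
                val = table(*assign) + sum(msg[(u, fname)][assign[i]]
                                           for i, u in enumerate(scope) if u != v)
                idx = assign[scope.index(v)]
                out[idx] = max(out[idx], val)
            msg[(fname, v)] = out

for v, dom in variables.items():
    belief = sum(msg[(f, v)] for f, (sc, _) in factors.items() if v in sc)
    print(v, "=", list(dom)[int(np.argmax(belief))])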

Relevance:

30.00%

Abstract:

Price impact functions show the relative price change caused by an order of a given value. Knowledge of the price impact function helps market participants to predict the impact of orders they intend to submit, to estimate the extra trading cost arising from the price changes they themselves induce, and to design an optimal trading algorithm. The method developed here allows market participants to construct a virtual price impact function simply and quickly, without knowledge of the full order book: we present the relationship between the price impact function and liquidity measures, and show how a price impact function can be estimated from the time series of the Budapest Liquidity Measure (BLM). The methodology is illustrated on OTP shares, with a virtual price impact function estimated from the share's BLM series for the period from 1 January 2007 to 3 June 2011. In the empirical analysis we examine the evolution of the price impact function over time and its basic statistical properties, giving a picture of the past behaviour of the transaction costs that arise from the lack of liquidity. This information can assist traders, for example, in dynamic portfolio optimization.
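
One heavily hedged way to picture the idea: take liquidity-measure quotes at a few order sizes, treat half of the round-trip measure as a one-way cost, and fit a power-law virtual price impact curve. Both the halving and the power-law form, as well as the numbers below, are illustrative assumptions and not necessarily the mapping used by the authors.

# Heavily hedged sketch: turning liquidity-measure quotes at a few order sizes
# into a virtual price impact curve.  The halving of the round-trip measure and
# the power-law form I(q) = a * q**b are assumptions for illustration only.
import numpy as np

order_value = np.array([20e3, 100e3, 500e3])   # order sizes at which the measure is quoted (assumed)
blm_bps = np.array([25.0, 40.0, 90.0])         # round-trip cost in basis points (made-up values)

one_way_impact = blm_bps / 2.0                 # assume a symmetric book: one-way cost ~ half

b, log_a = np.polyfit(np.log(order_value), np.log(one_way_impact), 1)
a = np.exp(log_a)
print(f"virtual impact ~ {a:.4f} * q^{b:.2f} (bps)")
print("predicted impact of a 250k order:", a * 250e3 ** b, "bps")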

Relevance:

20.00%

Abstract:

Lipidic mixtures present a particular phase-change profile highly affected by their unique crystalline structure. However, classical solid-liquid equilibrium (SLE) thermodynamic modeling approaches, which assume the solid phase to be a pure component, sometimes fail to describe the phase behavior correctly, and these failures become more pronounced as the complexity of the system increases. To overcome some of these problems, this study describes a new procedure, the Crystal-T algorithm, to depict the SLE of fatty binary mixtures presenting solid solutions. Considering the non-ideality of both liquid and solid phases, the algorithm determines the temperatures at which the first and the last crystal of the mixture melt. The evaluation is focused on experimental data measured and reported in this work for systems composed of triacylglycerols and fatty alcohols. The liquidus and solidus lines of the SLE phase diagrams were described using excess Gibbs energy based equations, with the group contribution UNIFAC model used to calculate the activity coefficients of both liquid and solid phases. Very low deviations between theoretical and experimental data evidence the strength of the algorithm, contributing to the enlargement of the scope of SLE modeling.
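
For orientation, the standard isofugacity relation underlying such solidus/liquidus calculations (neglecting heat-capacity terms) is, for each component i,

\ln\frac{x_i^{L}\,\gamma_i^{L}}{x_i^{S}\,\gamma_i^{S}} = \frac{\Delta h_{\mathrm{fus},i}}{R}\left(\frac{1}{T_{\mathrm{fus},i}} - \frac{1}{T}\right),

with the liquid- and solid-phase activity coefficients supplied by UNIFAC and an excess Gibbs energy model. This is a common simplification; the exact formulation used inside the Crystal-T algorithm is not reproduced here.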

Relevance:

20.00%

Abstract:

PURPOSE: To compare the Full Threshold (FT) and SITA Standard (SS) strategies in glaucomatous patients undergoing automated perimetry for the first time. METHODS: Thirty-one glaucomatous patients who had never undergone perimetry underwent automated perimetry (Humphrey, program 30-2) with both FT and SS on the same day, with an interval of at least 15 minutes. The order of the examinations was randomized, and only one eye per patient was analyzed. Three analyses were performed: a) all the examinations, regardless of the order of application; b) only the first examinations; c) only the second examinations. In order to calculate the sensitivity of both strategies, the following criteria were used to define abnormality: glaucoma hemifield test (GHT) outside normal limits, pattern standard deviation (PSD) with p<5%, or a cluster of 3 adjacent points with p<5% on the pattern deviation probability plot. RESULTS: When the results of all examinations were analyzed regardless of the order in which they were performed, the number of depressed points with p<0.5% in the pattern deviation probability map was significantly greater with SS (p=0.037), and the sensitivities were 87.1% for SS and 77.4% for FT (p=0.506). When only the first examinations were compared, there were no statistically significant differences regarding the number of depressed points, but the sensitivity of SS (100%) was significantly greater than that obtained with FT (70.6%) (p=0.048). When only the second examinations were compared, there were no statistically significant differences regarding either the number of depressed points or the sensitivities of SS (76.5%) and FT (85.7%) (p=0.664). CONCLUSION: SS may have a higher sensitivity than FT in glaucomatous patients undergoing automated perimetry for the first time. However, this difference tends to disappear in subsequent examinations.

Relevance:

20.00%

Abstract:

The transmetalation between boron and zinc is of great importance for application in organic synthesis, since it allows the formation of new carbon-carbon bonds between organometallic units and electrophiles. The catalytic, enantioselective direct arylation of aldehydes, or more rarely ketones, using chiral catalysts has been described recently. The enantiomerically enriched diarylmethanols obtained in these reactions are valuable precursors for important bioactive molecules. This review provides a synopsis of this ever-growing field and highlights some of the challenges that still remain.

Relevance:

20.00%

Abstract:

A homoenolate generated by tellurium/lithium exchange reaction was employed in a straightforward enantioselective synthesis of (+)-endo-brevicomin in 70% yield and 84.4% e.e.

Relevance:

20.00%

Abstract:

The network of HIV counseling and testing centers in São Paulo, Brazil is a major source of data used to build epidemiological profiles of the client population. We examined HIV-1 incidence from November 2000 to April 2001, comparing epidemiological and socio-behavioral data of recently-infected individuals with those with long-standing infection. A less sensitive ELISA was employed to identify recent infection. The overall incidence of HIV-1 infection was 0.53/100/year (95% CI: 0.31-0.85/100/year): 0.77/100/year for males (95% CI: 0.42-1.27/100/year) and 0.22/100/year (95% CI: 0.05-0.59/100/year) for females. Overall HIV-1 prevalence was 3.2% (95% CI: 2.8-3.7%), being 4.0% among males (95% CI: 3.3-4.7%) and 2.1% among females (95% CI: 1.6-2.8%). Recent infections accounted for 15% of the total (95% CI: 10.2-20.8%). Recent infection correlated with being younger and male (p = 0.019). Therefore, recent infection was more common among younger males and older females.
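
For context, detuned-assay studies of this kind typically estimate incidence as recent infections divided by person-time at risk over the assay's mean window period. The counts and the 129-day window below are assumed for illustration only, not the study's actual figures.

# Hedged illustration of the serologic (detuned-assay) incidence estimator often
# used with less-sensitive ELISAs: I = recent / [(HIV-negative + recent) * window].
# The 129-day mean window period and the counts are illustrative assumptions.
window_years = 129 / 365.0        # assumed mean window period of the less-sensitive assay
n_negative = 7000                 # illustrative number of HIV-negative clients
n_recent = 10                     # illustrative number testing "recent" on the detuned assay

person_years_at_risk = (n_negative + n_recent) * window_years
incidence_per_100_py = 100.0 * n_recent / person_years_at_risk
print(f"estimated incidence: {incidence_per_100_py:.2f} per 100 person-years")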

Relevance:

20.00%

Abstract:

This work develops a method for solving ordinary differential equations, that is, initial-value problems, with solutions approximated using Legendre polynomials. An iterative procedure for adjusting the polynomial coefficients, based on a genetic algorithm, is developed. The procedure is applied to several examples, comparing its results with the best polynomial fit whenever numerical solutions by the traditional Runge-Kutta or Adams methods are available. The resulting algorithm provides reliable solutions even when numerical solutions are not available, that is, when the mass matrix is singular or the equation produces unstable running processes.
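
A compact sketch of the idea, with scipy's differential evolution standing in for the genetic algorithm: expand the candidate solution in Legendre polynomials and penalise the ODE residual plus the initial-condition error. The test problem y' = -y, y(0) = 1 is an illustrative choice, not one of the paper's examples.

# Sketch: approximate the IVP solution by a truncated Legendre series and let an
# evolutionary optimiser adjust the coefficients so that the ODE residual and the
# initial condition are satisfied.  differential_evolution stands in for the GA;
# the test problem and all settings below are illustrative assumptions.
import numpy as np
from numpy.polynomial import legendre as L
from scipy.optimize import differential_evolution

f = lambda t, y: -y                      # right-hand side of y' = f(t, y)
y0, degree = 1.0, 6
t = np.linspace(0.0, 1.0, 50)            # collocation points on [0, 1]
s = 2.0 * t - 1.0                        # map to [-1, 1], the natural Legendre domain

def fitness(coeffs):
    y = L.legval(s, coeffs)
    dyds = L.legval(s, L.legder(coeffs))
    dydt = 2.0 * dyds                    # chain rule for the change of variable
    residual = np.mean((dydt - f(t, y)) ** 2)
    return residual + 10.0 * (L.legval(-1.0, coeffs) - y0) ** 2   # penalise y(0) != y0

result = differential_evolution(fitness, bounds=[(-2.0, 2.0)] * (degree + 1),
                                seed=0, tol=1e-10, maxiter=500)
approx = L.legval(s, result.x)
print("max error vs exp(-t):", np.max(np.abs(approx - np.exp(-t))))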

Relevance:

20.00%

Abstract:

This paper presents a new statistical algorithm to estimate rainfall over the Amazon Basin region using the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI). The algorithm relies on empirical relationships derived for different raining-type systems between coincident measurements of surface rainfall rate and 85-GHz polarization-corrected brightness temperature as observed by the precipitation radar (PR) and TMI on board the TRMM satellite. The scheme includes rain/no-rain area delineation (screening) and system-type classification routines for rain retrieval. The algorithm is validated against independent measurements of the TRMM-PR and S-band dual-polarization Doppler radar (S-Pol) surface rainfall data for two different periods. Moreover, the performance of this rainfall estimation technique is evaluated against well-known methods, namely, the TRMM-2A12 [the Goddard profiling algorithm (GPROF)], the Goddard scattering algorithm (GSCAT), and the National Environmental Satellite, Data, and Information Service (NESDIS) algorithms. The proposed algorithm shows a normalized bias of approximately 23% for both PR and S-Pol ground truth datasets and a mean error of 0.244 mm h^-1 (PR) and -0.157 mm h^-1 (S-Pol). For rain volume estimates using PR as reference, a correlation coefficient of 0.939 and a normalized bias of 0.039 were found. With respect to rainfall distributions and rain area comparisons, the results showed that the formulation proposed is efficient and compatible with the physics and dynamics of the observed systems over the area of interest. The performance of the other algorithms showed that GSCAT presented low normalized bias for rain areas and rain volume [0.346 (PR) and 0.361 (S-Pol)], and GPROF showed a rainfall distribution similar to that of the PR and S-Pol but with a bimodal distribution. Last, the five algorithms were evaluated during the TRMM Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) 1999 field campaign to verify the precipitation characteristics observed during the easterly and westerly Amazon wind flow regimes. The proposed algorithm presented a cumulative rainfall distribution similar to the observations during the easterly regime, but it underestimated for the westerly period for rainfall rates above 5 mm h^-1. NESDIS(1) overestimated for both wind regimes but presented the best westerly representation. NESDIS(2), GSCAT, and GPROF underestimated in both regimes, but GPROF was closer to the observations during the easterly flow.
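
A toy version of a scattering-based retrieval of this kind is sketched below: screen non-raining pixels with an 85-GHz polarization-corrected temperature (PCT) threshold, then map the PCT depression to rain rate with an empirical power law. The 250-K screen and the coefficients are placeholders, not the paper's fitted, system-type-dependent relationships.

# Toy scattering-based retrieval: screen out non-raining pixels with an 85-GHz
# PCT threshold, then apply an empirical power law.  The threshold and the
# coefficients a, b are placeholders, not the paper's fitted values.
import numpy as np

def pct85(tb_v, tb_h):
    """85-GHz polarization-corrected temperature (standard linear combination)."""
    return 1.818 * tb_v - 0.818 * tb_h

def rain_rate(tb_v, tb_h, screen_k=250.0, a=0.015, b=1.6):
    pct = pct85(tb_v, tb_h)
    depression = np.clip(screen_k - pct, 0.0, None)   # colder PCT => more ice scattering
    return a * depression ** b                         # mm/h; zero where PCT >= screen

tb_v = np.array([270.0, 245.0, 210.0])
tb_h = np.array([265.0, 240.0, 205.0])
print(rain_rate(tb_v, tb_h))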

Relevance:

20.00%

Abstract:

Context. B[e] supergiants are luminous, massive post-main-sequence stars exhibiting non-spherical winds, forbidden lines, and hot dust in a disc-like structure. The physical properties of their rich and complex circumstellar environment (CSE) are not well understood, partly because these CSEs cannot be easily resolved at the large distances found for B[e] supergiants (typically ≳ 1 kpc). Aims. From mid-IR spectro-interferometric observations obtained with VLTI/MIDI we seek to resolve and study the CSE of the Galactic B[e] supergiant CPD-57° 2874. Methods. For a physical interpretation of the observables (visibilities and spectrum) we use our ray-tracing radiative transfer code (FRACS), which is optimised for thermal spectro-interferometric observations. Results. Thanks to the short computing time required by FRACS (<10 s per monochromatic model), best-fit parameters and uncertainties for several physical quantities of CPD-57° 2874 were obtained, such as the inner dust radius, the relative flux contributions of the central source and of the dusty CSE, the dust temperature profile, and the disc inclination. Conclusions. The analysis of VLTI/MIDI data with FRACS allowed one of the first direct determinations of the physical parameters of the dusty CSE of a B[e] supergiant based on interferometric data and a full model-fitting approach. In a larger context, the study of B[e] supergiants is important for a deeper understanding of the complex structure and evolution of hot, massive stars.
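
The parameter-estimation step amounts to a chi-square comparison of observed mid-IR visibilities with model visibilities. The sketch below uses a uniform-disc visibility as a stand-in for the FRACS radiative-transfer model, and every data value in it is invented.

# Hedged sketch of the model-fitting step: chi-square comparison of observed
# visibilities with a model.  A uniform disc stands in for FRACS here, and all
# baselines, visibilities and errors are invented.
import numpy as np
from scipy.special import j1
from scipy.optimize import minimize_scalar

def uniform_disc_visibility(baseline_m, wavelength_m, diameter_rad):
    x = np.pi * baseline_m * diameter_rad / wavelength_m
    return np.where(x > 0, 2.0 * j1(x) / x, 1.0)

baseline = np.array([40.0, 60.0, 90.0])        # projected baselines in metres (assumed)
wavelength = 10e-6                             # 10-micron MIDI band
v_obs = np.array([0.80, 0.62, 0.38])           # invented visibilities
v_err = np.array([0.05, 0.05, 0.05])

def chi2(diameter_mas):
    d_rad = diameter_mas * np.pi / (180 * 3600 * 1000)
    v_mod = uniform_disc_visibility(baseline, wavelength, d_rad)
    return np.sum(((v_obs - v_mod) / v_err) ** 2)

best = minimize_scalar(chi2, bounds=(1.0, 60.0), method="bounded")
print(f"best-fit angular diameter: {best.x:.1f} mas, chi2 = {best.fun:.2f}")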

Relevance:

20.00%

Abstract:

There exist uniquely ergodic affine interval exchange transformations of [0,1] with flips which have wandering intervals and are such that the support of the invariant measure is a Cantor set.