953 results for Information exchange
Abstract:
In this article we develop a theoretical microstructure model of coordinated central bank intervention based on asymmetric information. We study the economic implications of coordination on some measures of market quality and show that the model predicts higher volatility and more significant exchange rate changes when central banks coordinate compared to when they intervene unilaterally. Both these predictions are in line with empirical evidence. Keywords: coordinated foreign exchange intervention, market microstructure. JEL Classification: D82, E58, F31, G14
Abstract:
This paper examines the effect that heterogeneous customer order flows have on exchange rates by using a new, and the largest, proprietary dataset of weekly net order flow segmented by customer type across nine of the most liquid currency pairs. We make several contributions. Firstly, we investigate the extent to which customer order flow can help to explain exchange rate movements over and above the influence of macroeconomic variables. Secondly, we address the issue of whether order flows contain (private) information which explains exchange rate changes. Thirdly, we look at the usefulness of order flow in forecasting exchange rate movements at longer horizons than those generally considered in the microstructure literature. Finally, we address the question of whether the out-of-sample exchange rate forecasts generated by order flows can be employed profitably in the foreign exchange markets.
Abstract:
This paper presents a theoretical framework analysing the signalling channel of exchange rate interventions as an informational trigger. We develop an implicit target zone framework with learning in order to model the signalling channel. The theoretical premise of the model is that interventions convey signals that communicate information about the exchange rate objectives of the central bank. The model is used to analyse the impact of Japanese FX interventions during the period 1999-2011 on yen/US dollar dynamics.
Abstract:
The HUPO Proteomics Standards Initiative has developed several standardized data formats to facilitate data sharing in mass spectrometry (MS)-based proteomics. These allow researchers to report their complete results in a unified way. However, at present, there is no format to describe the final qualitative and quantitative results for proteomics and metabolomics experiments in a simple tabular format. Many downstream analysis use cases are only concerned with the final results of an experiment and require an easily accessible format, compatible with tools such as Microsoft Excel or R. We developed the mzTab file format for MS-based proteomics and metabolomics results to meet this need. mzTab is intended as a lightweight supplement to the existing standard XML-based file formats (mzML, mzIdentML, mzQuantML), providing a comprehensive summary, similar in concept to the supplemental material of a scientific publication. mzTab files can contain protein, peptide, and small molecule identifications together with experimental metadata and basic quantitative information. The format is not intended to store the complete experimental evidence but provides mechanisms to report results at different levels of detail. These range from a simple summary of the final results to a representation of the results including the experimental design. This format is ideally suited to make MS-based proteomics and metabolomics results available to a wider biological community outside the field of MS. Several software tools for proteomics and metabolomics have already adopted the format as an output format. The comprehensive mzTab specification document and extensive additional documentation can be found online.
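The prefix-keyed tabular layout described above can be illustrated with a minimal sketch in Python. This is a toy reader for a drastically simplified mzTab-like file, not an implementation of the full specification (which defines many mandatory metadata fields and typed columns omitted here); the line prefixes MTD, PRH/PRT, PEH/PEP and SMH/SML follow the published format, but the sample content is invented.

```python
def parse_mztab_subset(text):
    """Toy parser for a simplified mzTab-like file.

    Metadata lines (MTD) become key/value pairs; header lines
    (PRH/PEH/SMH) define column names for the protein (PRT),
    peptide (PEP) and small-molecule (SML) rows that follow.
    Real mzTab files are far richer than this sketch.
    """
    metadata = {}
    columns = {}                      # data prefix -> column names
    rows = {"PRT": [], "PEP": [], "SML": []}
    header_for = {"PRH": "PRT", "PEH": "PEP", "SMH": "SML"}

    for line in text.splitlines():
        if not line.strip():
            continue
        fields = line.split("\t")
        prefix = fields[0]
        if prefix == "MTD":
            metadata[fields[1]] = fields[2]
        elif prefix in header_for:
            columns[header_for[prefix]] = fields[1:]
        elif prefix in rows:
            rows[prefix].append(dict(zip(columns[prefix], fields[1:])))
    return metadata, rows

# Invented three-line sample, just to exercise the parser
sample = "\n".join([
    "MTD\tmzTab-version\t1.0.0",
    "PRH\taccession\tdescription",
    "PRT\tP12345\texample protein",
])
meta, rows = parse_mztab_subset(sample)
```

The same dictionary-of-rows result could be loaded directly into R or a spreadsheet, which is the accessibility argument the abstract makes for a tabular format.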
Abstract:
In our project, we consider an urban or interurban scenario in which people with mobile devices (smartphones) or vehicles equipped with communication interfaces are interested in sharing files with each other or downloading them when passing roadside Access Points (APs). We study the possibility of using cooperation in chance encounters between nodes to increase the overall download speed. To this end, we propose algorithms for selecting which packets, for which destinations, and which carriers are chosen at each moment. Through extensive simulations, we show how carry&forward cooperation between nodes significantly increases users' download speed, and how this result holds across different mobility patterns, AP placements, and network loads. On the other hand, devices such as smartphones, whose WiFi card is continuously on, drain their battery in a few hours. In many scenarios, an always-on WiFi card is of little use, because often there is no need to transmit or receive. This is aggravated in Delay Tolerant Networks (DTNs), where nodes exchange data when they meet and have the opportunity. Power-saving management techniques make it possible to extend battery life. Our project analyses the advantages and drawbacks that arise when nodes periodically switch off their wireless card to save energy in DTN scenarios. Our results show the conditions under which a node can power down without affecting the probability of contact with other nodes, and the conditions under which that probability decreases. For example, we show that a node's lifetime can be doubled while keeping the contact probability at 1, and that the probability drops rapidly when attempting to extend the lifetime further.
Abstract:
Objectives: The aim of this study was to compare the specificity and sensitivity of different biological markers that can be used in the forensic field to identify potentially dangerous drivers because of their alcohol habits. Methods: We studied 280 Swiss drivers caught driving under the influence of alcohol. 33 were excluded for not having CDT N results; 247 were included (218 men (88%) and 29 women (12%)). Mean age was 42.4 (SD: 12, min: 20, max: 76). The evaluation of alcohol consumption concerned the month before the CDT test and was categorized after the interview as: heavy drinkers (>3 drinks per day): 60 (32.7%), moderate (<3 drinks per day): 127 (51.4%) 114 (46.5%), abstinent: 60 (24.3%) 51 (21%). Alcohol intake was monitored by structured interviews, self-reported drinking habits and the C-Audit questionnaire, as well as information provided by the family and general practitioner. Consumption was quantified in standard drinks, each containing approximately 10 grams of pure alcohol (Ref. WHO). Results (comparison between moderate drinkers, i.e. 3 or fewer drinks per day, and excessive drinkers, more than 3 drinks per day):

Marker             ROC area  95% CI       Cut-off  Sensitivity      Specificity
CDT TIA            0.852     0.786-0.917  2.6*     0.93 (LR+ 1.43)  0.35 (LR- 0.192)
CDT N latex        0.875     0.821-0.930  2.5*     0.66 (LR+ 6.93)  0.90 (LR- 0.369)
Asialo+disialo-tf  0.881     0.826-0.936  1.2*     0.78 (LR+ 4.07)  0.80 (LR- 0.268)
                                          1.7°     0.66 (LR+ 8.9)   0.93 (LR- 0.360)
GGT                0.659     0.580-0.737  85*      0.37 (LR+ 2.14)  0.83 (LR- 0.764)

* cut-off point suggested by the manufacturer
° cut-off point suggested by our laboratory

Conclusion: With the cut-off point established by the manufacturer, CDT TIA performed poorly in terms of specificity. N Latex CDT and CZE CDT performed better, especially if a 1.7 cut-off is used with CZE.
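The likelihood ratios reported above follow directly from sensitivity and specificity: LR+ = sensitivity / (1 - specificity) and LR- = (1 - sensitivity) / specificity. A minimal sketch of that arithmetic, using the CDT N latex figures from the abstract as input (small discrepancies with the printed ratios are expected, since those were presumably computed from unrounded estimates):

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a binary test."""
    lr_positive = sensitivity / (1.0 - specificity)
    lr_negative = (1.0 - sensitivity) / specificity
    return lr_positive, lr_negative

# CDT N latex at the manufacturer's 2.5 cut-off (figures from the abstract)
lr_pos, lr_neg = likelihood_ratios(sensitivity=0.66, specificity=0.90)
```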
Abstract:
This paper shows that the presence of private information in an economy can be a source of market incompleteness even when it is feasible to issue a set of securities that completely eliminates the informational asymmetries in equilibrium. We analyze a simple security design model in which a volume-maximizing futures exchange chooses not only the characteristics of each individual contract but also the number of contracts. Agents have rational expectations and differ in information, endowments and, possibly, attitudes toward risk. The emergence of complete or incomplete markets in equilibrium depends on whether the adverse selection effect is stronger or weaker than the Hirshleifer effect, as new securities are issued and prices reveal more information. When the Hirshleifer effect dominates, the exchange chooses an incomplete set of financial contracts, and the equilibrium price is partially revealing.
Abstract:
BACKGROUND: Increasingly, patients receiving methadone treatment are found in low threshold facilities (LTF), which provide needle exchange programmes in Switzerland. This paper identifies the characteristics of LTF attendees receiving methadone treatment (MT) compared with other LTF attendees (non-MT). METHODS: A national cross-sectional survey was conducted in 2006 over five consecutive days in all LTF (n=25). Attendees were given an anonymous questionnaire, collecting information on socio-demographic indicators, drug consumption, injection, methadone treatment, and self-reported HIV and HCV status. Univariate analysis and logistic regression were performed to compare MT to non-MT. The response rate was 66% (n=1128). RESULTS: MT comprised 57.6% of the sample. In multivariate analysis, factors associated with being on MT were older age (OR: 1.38), being female (OR: 1.60), having one's own accommodation (OR: 1.56), receiving public assistance (OR: 2.29), lifetime injecting (OR: 2.26), HIV-positive status (OR: 2.00), and having consumed cocaine during the past month (OR: 1.37); MT were less likely to have consumed heroin in the past month (OR: 0.76, not significant) and visited LTF less often on a daily basis (OR: 0.59). The number of injections during the past week was not associated with MT. CONCLUSIONS: More LTF attendees were in the MT group, bringing to light an underappreciated LTF clientele with specific needs. The MT group consumption profile may reflect therapeutic failure or deficits in treatment quality and it is necessary to acknowledge this and to strengthen the awareness of LTF personnel about potential needs of MT attendees to meet their therapeutic goals.
Abstract:
Exchange matrices represent spatial weights as symmetric probability distributions on pairs of regions, whose margins yield regional weights, generally well-specified and known in most contexts. This contribution proposes a mechanism for constructing exchange matrices, derived from quite general symmetric proximity matrices, in such a way that the margin of the exchange matrix coincides with the regional weights. Exchange matrices generate in turn diffusive squared Euclidean dissimilarities, measuring spatial remoteness between pairs of regions. Unweighted and weighted spatial frameworks are reviewed and compared, regarding in particular their impact on permutation and normal tests of spatial autocorrelation. Applications include tests of spatial autocorrelation with diagonal weights, factorial visualization of the network of regions, multivariate generalizations of Moran's I, as well as "landscape clustering", aimed at creating regional aggregates both spatially contiguous and endowed with similar features.
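One simple way to realize the construction described above, i.e. turning a symmetric proximity matrix into an exchange matrix whose margins equal prescribed regional weights, is iterative proportional (Sinkhorn-type) scaling. The sketch below is an illustrative assumption on my part, not necessarily the specific mechanism proposed in the paper: it finds a positive diagonal scaling d such that E = diag(d) P diag(d) is symmetric, non-negative, and has row sums equal to the weights f.

```python
import numpy as np

def exchange_matrix(proximity, weights, n_iter=1000):
    """Scale a symmetric positive proximity matrix P into an
    exchange matrix E: symmetric, non-negative, row sums = weights.

    Iterates a damped fixed point d <- sqrt(d * f / (P d)), whose
    limit satisfies d = f / (P d), so E = diag(d) P diag(d) has
    margins f (symmetric Sinkhorn-type scaling).
    """
    P = np.asarray(proximity, dtype=float)
    f = np.asarray(weights, dtype=float)
    d = np.ones(len(f))
    for _ in range(n_iter):
        d = np.sqrt(d * f / (P @ d))   # geometric-mean damping avoids 2-cycles
    return d[:, None] * P * d[None, :]

# Toy example: three regions with unequal weights summing to 1
P = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.4],
              [0.2, 0.4, 1.0]])
f = np.array([0.2, 0.3, 0.5])
E = exchange_matrix(P, f)
```

Because P is symmetric and a single scaling vector is used on both sides, E is automatically symmetric, and its margins reproduce the regional weights, which is exactly the well-specification property the abstract emphasizes.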
Abstract:
The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes is required to exchange digital items in a fair manner, which means that either each process obtains the item it was expecting or no process obtains any information on the inputs of others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully connected topology. On one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamperproof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange.
This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem to contradict those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
Abstract:
The patent system was created for the purpose of promoting innovation by granting inventors a legally defined right to exclude others in return for public disclosure. Today, patents are being applied for and granted in greater numbers than ever, particularly in new areas such as biotechnology and information and communications technology (ICT), in which research and development (R&D) investments are also high. At the same time, the patent system has been heavily criticized. It has been claimed that it discourages rather than encourages the introduction of new products and processes, particularly in areas that develop quickly, lack a one-product-one-patent correlation, and in which the emergence of patent thickets is characteristic. A further concern, which is particularly acute in the U.S., is the granting of so-called 'bad patents', i.e. patents that do not factually fulfil the patentability criteria. From the perspective of technology-intensive companies, patents could, irrespective of the above, be described as the most significant intellectual property right (IPR), having the potential to be used to protect products and processes from imitation, to limit competitors' freedom-to-operate, to provide such freedom to the company in question, and to exchange ideas with others. In fact, patents define the boundaries of ownership in relation to certain technologies. They may be sold or licensed on their own, or they may be components of all sorts of technology acquisition and licensing arrangements. Moreover, with the possibility of patenting business-method inventions in the U.S., patents are becoming increasingly important for companies basing their businesses on services. The value of patents depends on the value of the inventions they claim, and on how those inventions are commercialized. Thus, most patents are worth very little, and most inventions are not worth patenting: it may be possible to protect them in other ways, and the costs of protection may exceed the benefits.
Moreover, instead of making all inventions proprietary and seeking to appropriate as high returns on investments as possible through patent enforcement, it is sometimes better to allow some of them to be disseminated freely in order to maximize market penetration. In fact, the ideology of openness is well established in the software sector, which has been the breeding ground for the open-source movement, for instance. Furthermore, industries, such as ICT, that benefit from network effects do not shun the idea of setting open standards or opening up their proprietary interfaces to allow everyone to design products and services that are interoperable with theirs. The problem is that even though patents do not, strictly speaking, prevent access to protected technologies, they have the potential of doing so, and conflicts of interest are not rare. The primary aim of this dissertation is to increase understanding of the dynamics and controversies of the U.S. and European patent systems, with the focus on the ICT sector. The study consists of three parts. The first part introduces the research topic and the overall results of the dissertation. The second part comprises a publication in which academic, political, legal and business developments that concern software and business-method patents are investigated, and contentious areas are identified. The third part examines the problems with patents and open standards, both of which carry significant economic weight in the ICT sector. Here, the focus is on so-called submarine patents, i.e. patents that remain unnoticed during the standardization process and then emerge after the standard has been set. The factors that contribute to the problems are documented, and the practical and juridical options for alleviating them are assessed. In total, the dissertation provides a good overview of the challenges and pressures for change the patent system is facing, and of how these challenges are reflected in standard setting.
Abstract:
TeliaSonera's development prototype of an intelligent messaging system (SME) pilots prototype services that allow customers to exchange messages using mobile phones and computers. SME's basic services can be used with SIP-standard client applications as well as with SME's own WAP and WWW user interfaces. Users can see each other's presence information, change their own presence status, and send SIP instant messages, e-mail messages, and text messages. Users can also maintain a contact list, receive instant messages, and browse received messages. This Master's thesis describes the overall architecture of the SME system and focuses on the implementation of the SME WWW client developed in this work. The thesis reviews the standards, recommendations, implementation technologies, and services related to the project. In addition, it examines the programming interfaces used in the work, as well as current smartphones and their Internet browsers, which constrain the implementation technology choices available for the WWW client service. Finally, the internal structure and operation of the implemented software is presented.
Abstract:
The Catalan Research Portal (Portal de la Recerca de Catalunya or PRC) is an initiative carried out by the Consortium for University Services in Catalonia (CSUC) in coordination with nearly all universities in Catalonia. The Portal will provide an online CERIF-compliant collection of all research outputs produced by Catalan HEIs, together with appropriate contextual information describing the specific environment where each output was generated (such as researchers, research group, research project, etc.). The initial emphasis of the Catalan Research Portal approach to research outputs will be on publications, but other outputs such as patents and, eventually, research data will be addressed as well. These guidelines provide information for PRC data providers to expose and exchange their research information metadata in a CERIF-XML compatible structure, allowing them not just to exchange validated CERIF-XML data with the PRC platform, but to improve their general interoperability by being able to deliver CERIF-compatible outputs.