873 results for Optimum Frequency


Relevance:

20.00%

Publisher:

Abstract:

The purpose of this article is to contribute to the discussion of the financial aspects of dollarization and optimum currency areas. Based on the model of self-fulfilling debt crises developed by Cole and Kehoe [4], it is possible to evaluate the comparative welfare of economies that either keep their local currency and an independent monetary policy, join a monetary union, or adopt dollarization. In the two former monetary regimes, governments can issue debt denominated, respectively, in the local and common currencies, which is purchased entirely by national consumers. Given this ability, governments may decide to impose an inflation tax on these assets and use the revenues so collected to avoid an external debt crisis. While a country that issues its own currency takes this decision independently, a country belonging to a monetary union depends on the joint decision of all member countries about the common monetary policy. In this way, an external debt crisis may be avoided under the local and common currency regimes if, respectively, the national and the union central banks have the ability to conduct monetary policy, represented by a reduction in the real return on the bonds denominated in these currencies. This resource is not available under dollarization. In a dollarized economy, the loss of control over national monetary policy does not allow adjustments for exogenous shocks that asymmetrically affect the client and the anchor countries, but credibility is strengthened. On the other hand, given the ability to inflate the local currency, the central bank may be subject to the political influence of a government not strongly concerned with fiscal discipline, which reduces the welfare of the economy. In a similar fashion, under a common currency regime, the union central bank may also be under the influence of a group of countries that want to inflate the common currency even though they do not face external restrictions. Therefore, the local and common currencies can be viewed as a way to provide welfare-enhancing bankruptcy, provided this power is not abused. With these peculiarities of monetary regimes in mind, we simulate the levels of economic welfare for each regime, employing recent data for the Brazilian economy.
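
The inflation-tax channel invoked here can be made concrete with a standard identity (a generic illustration, not a formula taken from the Cole-Kehoe model itself): with nominal interest rate i and inflation rate \pi, the realized real return on nominal bonds is

    1 + r = \frac{1 + i}{1 + \pi},

so inflating by \pi on nominal debt with real face value B/P yields the government roughly (B/P)\,\frac{\pi}{1+\pi} in resources, which is the kind of revenue the abstract says is available for avoiding an external default under the local and common currency regimes but not under dollarization.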

Relevance:

20.00%

Publisher:

Abstract:

This paper develops a framework to test whether discrete-valued, irregularly spaced financial transactions data follow a subordinated Markov process. For that purpose, we consider a specific optional sampling scheme in which a continuous-time Markov process is observed only when it crosses some discrete level. This framework is convenient because it accommodates not only the irregular spacing of transactions data but also price discreteness. Further, it turns out that, under such an observation rule, the current price duration is independent of previous price durations given the current price realization. A simple nonparametric test then follows by examining whether this conditional independence property holds. Finally, we investigate whether or not bid-ask spreads follow Markov processes using transactions data from the New York Stock Exchange. The motivation lies in the fact that asymmetric information models of market microstructure predict that the Markov property does not hold for the bid-ask spread. The results are mixed in the sense that the Markov assumption is rejected for three out of the five stocks analyzed.
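
A minimal sketch of the kind of check the abstract describes, assuming transaction prices and the corresponding price durations have already been extracted; the grouping by current price level and the Spearman statistic are illustrative choices, not the paper's exact test:

    import numpy as np
    from scipy.stats import spearmanr

    def conditional_independence_check(prices, durations, min_pairs=30):
        """For each observed price level, test whether consecutive price
        durations are correlated given that level; zero correlation within
        every level is consistent with the subordinated-Markov implication."""
        prices = np.asarray(prices)
        durations = np.asarray(durations)
        results = {}
        for level in np.unique(prices[1:]):
            mask = prices[1:] == level       # pairs whose current price is `level`
            prev_d = durations[:-1][mask]
            curr_d = durations[1:][mask]
            if len(curr_d) >= min_pairs:     # need enough pairs for a stable test
                rho, pval = spearmanr(prev_d, curr_d)
                results[level] = (rho, pval)
        return results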

Relevance:

20.00%

Publisher:

Abstract:

High-frequency trading (HFT) is growing steadily on BOVESPA (the São Paulo Stock Exchange), yet its volume still lags far behind that of similar operations on other internationally relevant exchanges. This work intends to create opportunities for future applications and research in this area. With practical applications in mind, it focuses on fitting a model of order book dynamics to Brazilian market data. The model is built from information contained in the order book alone. Once built, the model is used to simulate a high-frequency statistical arbitrage strategy. The dataset used in this work consists of the orders placed on BOVESPA for the stock PETR4.
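
The abstract does not spell out the order book model, so the following is only a hedged illustration of the kind of book-only quantity such a model can be built on; the imbalance measure and the toy trading rule are assumptions made for the sketch:

    import numpy as np

    def book_imbalance(bid_sizes, ask_sizes, depth=5):
        """Order book imbalance over the top `depth` levels: +1 means all
        resting volume sits on the bid side, -1 all on the ask side."""
        b = np.sum(bid_sizes[:depth])
        a = np.sum(ask_sizes[:depth])
        return (b - a) / (b + a)

    def signal(bid_sizes, ask_sizes, threshold=0.3):
        """Toy high-frequency rule driven only by book information."""
        imb = book_imbalance(bid_sizes, ask_sizes)
        if imb > threshold:
            return "buy"    # heavy bid side suggests short-term upward pressure
        if imb < -threshold:
            return "sell"
        return "hold"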

Relevance:

20.00%

Publisher:

Abstract:

Aiming at empirical findings, this work applies the HEAVY model of daily volatility to financial data from the Brazilian market. Closely related to GARCH, this model seeks to harness high-frequency data in order to achieve its objectives. Four variations of it were implemented and their fit compared with GARCH equivalents, using metrics from the literature. The results suggest that, in this market, HEAVY does seem to specify daily volatility better, but does not necessarily produce better forecasts of it, which is normally the ultimate goal. The dataset used in this work consists of intraday trades of U.S. Dollar and Ibovespa futures contracts from BM&FBovespa.
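
For reference, the core HEAVY conditional-variance recursion in Shephard and Sheppard's standard notation (the abstract does not state which of the four variations modify it):

    h_t = \omega + \alpha\, RM_{t-1} + \beta\, h_{t-1},

where RM_{t-1} is a realized measure built from intraday data. A GARCH(1,1) has the same form with the squared daily return r_{t-1}^2 in place of RM_{t-1}, which is exactly where the high-frequency information enters.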

Relevance:

20.00%

Publisher:

Abstract:

The real exchange rate is an important macroeconomic price and affects economic activity, interest rates, domestic prices, and trade and investment flows, among other variables. Methodologies have been developed in empirical exchange rate misalignment studies to evaluate whether a real effective exchange rate is overvalued or undervalued. There is a vast body of literature on the determinants of long-term real exchange rates and on empirical strategies to implement the equilibrium norms obtained from theoretical models. This study seeks to contribute to this literature by showing that it is possible to calculate the misalignment from a mixed-frequency cointegrated vector error correction framework. An empirical exercise using United States real exchange rate data is performed. The results suggest that the model with mixed-frequency data is preferred to the models with same-frequency variables.
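
The baseline object behind this exercise is the standard vector error correction model (the mixed-frequency treatment is the paper's contribution; the form below is the common-frequency benchmark):

    \Delta y_t = \alpha \beta' y_{t-1} + \sum_{i=1}^{p-1} \Gamma_i \Delta y_{t-i} + \varepsilon_t,

where \beta' y_t collects the cointegrating (long-run equilibrium) relations, so the misalignment at time t can be read off as the deviation of \beta' y_t from its equilibrium value.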

Relevance:

20.00%

Publisher:

Abstract:

The American equities market has evolved rapidly over the last decade. It has become an open architecture in which participants with innovative technology can compete effectively. Several regulatory changes and technological innovations enabled profound changes in market structure. These changes, together with the development of high-speed networks, acted as a catalyst, giving rise to a new form of trading known as high-frequency trading (HFT). HFT firms emerged and took over, on a large scale, the market-making business of providing liquidity. Although HFT has grown massively, over the last four years it has lost significant profitability as more firms entered the sector and margins shrank. Against this background, this thesis presents a brief review of HFT activity, followed by an analysis of the boundaries of this sector and of the characteristics of the HFT macro-environment. To that end, the thesis carried out an extensive review of the literature and of qualitative public documents such as newspapers, meeting minutes, and official reports. It employed the analytical tools of Entry and Mobility Barriers (Porter, 1980), Models of Industry Evolution (McGahan, 2004), and the Structure of the Information-Intensive Sector (Sampler, 1998) to analyze the boundaries of the HFT sector. Additionally, it employed Models of Industry Evolution (McGahan, 2004) and PESTEL (Johnson, Scholes, and Whittington, 2011) to analyze the sector and the context surrounding the HFT business. The analysis concluded that firms that employ HFT to act and compete in the equities market constitute an independent sector.

Relevance:

20.00%

Publisher:

Abstract:

TORT, A. B. L.; SCHEFFER-TEIXEIRA, R.; SOUZA, B. C.; DRAGUHN, A.; BRANKACK, J. Theta-associated high-frequency oscillations (110-160 Hz) in the hippocampus and neocortex. Progress in Neurobiology, v. 100, p. 1-14, 2013.

Relevance:

20.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

20.00%

Publisher:

Abstract:

Skin cancer is the most common of all cancers, and the increase in its incidence is due, in part, to people's behavior regarding sun exposure. In Brazil, non-melanoma skin cancer is the most frequent type in most regions. Dermatoscopy and videodermatoscopy are the main examinations for diagnosing dermatological diseases of the skin. The field involving the use of computational tools to support or follow up the medical diagnosis of dermatological lesions is very recent, and several methods have been proposed for the automatic classification of skin pathologies from images. The present work presents a new intelligent methodology for the analysis and classification of skin cancer images, based on digital image processing techniques for the extraction of color, shape, and texture features, using the Wavelet Packet Transform (WPT) and the learning technique known as the Support Vector Machine (SVM). The Wavelet Packet Transform is applied to extract texture features from the images. The WPT consists of a set of basis functions that represents the image in different frequency bands, each with a distinct resolution corresponding to each scale. In addition, color features of the lesion are computed; these depend on the visual context and are influenced by the colors surrounding the lesion. Shape attributes are extracted through Fourier descriptors. The Support Vector Machine, which is based on the structural risk minimization principle from statistical learning theory, is used for the classification task. The SVM constructs optimum hyperplanes that represent the separation between classes; the generated hyperplane is determined by a subset of the training examples, called support vectors. For the database used in this work, the results revealed good performance, with an overall accuracy of 92.73% for melanoma and 86% for non-melanoma and benign lesions. The extracted descriptors and the SVM classifier thus constitute a method capable of recognizing and classifying the analyzed skin lesions.
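
A minimal sketch of the texture branch of such a pipeline, assuming grayscale lesion images as NumPy arrays; the wavelet, decomposition level, and energy feature are illustrative choices, and the full method also uses color features and Fourier shape descriptors:

    import numpy as np
    import pywt
    from sklearn.svm import SVC

    def wpt_texture_features(gray_image, wavelet="db4", level=2):
        """Energy of every wavelet-packet node at `level`: one texture
        descriptor per frequency band of the lesion image."""
        wp = pywt.WaveletPacket2D(data=gray_image, wavelet=wavelet,
                                  mode="symmetric", maxlevel=level)
        return np.array([np.mean(node.data ** 2)
                         for node in wp.get_level(level)])

    # Hypothetical training step: X stacks one feature row per image,
    # y holds the melanoma / non-melanoma labels.
    # X = np.vstack([wpt_texture_features(img) for img in images])
    # clf = SVC(kernel="rbf").fit(X, y)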

Relevance:

20.00%

Publisher:

Abstract:

A program based on the differential evolution technique was developed to define the optimum genetic contribution in the selection of breeding candidates. The objective function to be optimized combined the expected genetic merit of the future progeny with the average coancestry of the breeding animals. Real and simulated datasets from populations with overlapping generations were used to validate and test the performance of the program. The program proved computationally efficient and viable for practical application, and the expected consequences of its use, compared with empirical procedures for inbreeding control and/or with selection based only on expected genetic value, would be improved future genetic response and more effective limitation of the inbreeding rate.
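
A generic differential evolution loop of the kind such a program builds on (a sketch assuming a real-valued contribution vector per candidate; the actual objective combining expected genetic merit and average coancestry is not specified here):

    import numpy as np

    def differential_evolution(fitness, bounds, pop_size=40, F=0.8, CR=0.9,
                               generations=200, seed=0):
        """Minimal DE/rand/1/bin minimizer; `fitness` would penalize low
        genetic merit and high average coancestry."""
        rng = np.random.default_rng(seed)
        lo, hi = np.asarray(bounds, dtype=float).T
        pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
        scores = np.array([fitness(x) for x in pop])
        for _ in range(generations):
            for i in range(pop_size):
                others = [j for j in range(pop_size) if j != i]
                a, b, c = pop[rng.choice(others, 3, replace=False)]
                mutant = np.clip(a + F * (b - c), lo, hi)   # mutation
                cross = rng.random(len(lo)) < CR            # binomial crossover
                trial = np.where(cross, mutant, pop[i])
                s = fitness(trial)
                if s < scores[i]:                           # greedy selection
                    pop[i], scores[i] = trial, s
        return pop[scores.argmin()], scores.min()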

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an evaluative study of the effects of using a machine learning technique on the main features of a self-organizing, multiobjective genetic algorithm (GA). A typical GA is a search technique usually applied to problems of non-polynomial complexity. Originally, these algorithms were designed to provide methods that seek acceptable solutions to problems where the global optimum is inaccessible or difficult to obtain. At first, GAs considered only one evaluation function and single-objective optimization. Today, however, implementations that consider several optimization objectives simultaneously (multiobjective algorithms) are common, as are those that allow many components of the algorithm to change dynamically (self-organizing algorithms). Combinations of GAs with machine learning techniques to improve aspects of their performance and usability are likewise common. In this work, a GA combined with a machine learning technique was analyzed and applied to an antenna design problem. We used a variant of bicubic interpolation, called a 2D spline, as the machine learning technique to estimate the behavior of a dynamic fitness function, based on knowledge obtained from a set of laboratory experiments. This fitness function, also called the evaluation function, is responsible for determining the degree of fitness of a candidate solution (individual) relative to the others in the same population. The algorithm can be applied in many areas, including telecommunications, such as the design of antennas and frequency selective surfaces. In this particular work, the algorithm was developed to optimize the design of a microstrip antenna, commonly used in wireless communication systems, for Ultra-Wideband (UWB) applications. The algorithm optimized two variables of the antenna geometry, the length (Ls) and width (Ws) of a slit in the ground plane, with respect to three objectives: radiated signal bandwidth, return loss, and central frequency deviation. These two dimensions (Ws and Ls) are used as variables in three different interpolation functions, one spline for each optimization objective, to compose an aggregate multiobjective fitness function. The final result proposed by the algorithm was compared with the result of the simulation program and with the measured result of a physical prototype of the antenna built in the laboratory. The algorithm was analyzed with respect to its degree of success regarding four important characteristics of a self-organizing multiobjective GA: performance, flexibility, scalability, and accuracy. At the end of the study, an increase in execution time was observed compared with a common GA, due to the time required for the machine learning process. On the plus side, we noticed an appreciable gain in the flexibility and accuracy of the results, and a promising path toward extending the algorithm to optimization problems with η variables.
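
A hedged sketch of the spline-surrogate idea, using SciPy's bicubic spline over a hypothetical grid of laboratory responses (the grid values below are placeholders, not measured data):

    import numpy as np
    from scipy.interpolate import RectBivariateSpline

    # Hypothetical design grid over the slit dimensions (in mm).
    Ws_grid = np.linspace(2.0, 12.0, 6)
    Ls_grid = np.linspace(4.0, 20.0, 6)
    bandwidth = np.random.default_rng(1).uniform(1.0, 8.0, (6, 6))  # placeholder

    # One surrogate per objective; a real setup would also fit splines
    # for return loss and central frequency deviation.
    bw_spline = RectBivariateSpline(Ws_grid, Ls_grid, bandwidth)

    def aggregate_fitness(ws, ls, w_bw=1.0):
        """Weighted-sum fitness evaluated on the spline surrogate instead
        of an expensive electromagnetic simulation."""
        return w_bw * bw_spline(ws, ls)[0, 0]   # extend with the other terms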

Relevance:

20.00%

Publisher:

Abstract:

In recent years there has been significant growth in technologies that modify implant surfaces, reducing healing time and allowing their successful use in areas of low bone density. One of the most widely used techniques is plasma nitriding, applied with excellent results to titanium and its alloys, most frequently in the manufacture of hip, ankle and shoulder implants. However, its use in dental implants is very limited because of the high process temperatures (between 700 °C and 800 °C), which cause distortions in these geometrically complex and highly precise components. The aim of the present study is to assess the osseointegration and mechanical strength of grade II titanium samples nitrided in a hollow cathode discharge configuration. Moreover, new formulations are proposed to determine the optimum structural topology of the dental implant under study, in order to improve its shape and make it efficient, competitive and well defined. In the nitriding process, the samples were treated at a temperature of 450 °C and a pressure of 150 Pa for 1 hour. This condition was selected because it produced the best wettability results in previous studies in which different pressure, temperature and time conditions were systematically varied. The samples were characterized by X-ray diffraction, scanning electron microscopy, roughness, microhardness and wettability. Biomechanical fatigue tests were then conducted. Finally, a formulation using the three-dimensional structural topology optimization method was proposed, in conjunction with an h-adaptive refinement process. The results showed that plasma nitriding with the hollow cathode discharge technique changed the surface texture of the test specimens, increasing surface roughness, wettability and microhardness compared with the untreated sample. In the biomechanical fatigue test, the treated implant showed no failures after five million cycles at a maximum fatigue load of 84.46 N. The topology optimization process produced well-defined optimized layouts of the dental implant, with a clear distribution of material and defined edges.
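
For context, the standard compliance-based statement of structural topology optimization (the baseline form on which such formulations build; the thesis's own formulations are not reproduced here):

    \min_{\rho} \; c(\rho) = u^{T} K(\rho)\, u
    \quad \text{s.t.} \quad K(\rho)\, u = f, \qquad \sum_e \rho_e v_e \le V^{*}, \qquad 0 < \rho_{\min} \le \rho_e \le 1,

where \rho_e is the material density of element e, K the stiffness matrix, and V^{*} the allowed material volume; the optimizer removes material where it contributes least to stiffness.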

Relevance:

20.00%

Publisher:

Abstract:

The ionospheric effect is one of the major errors in GPS data processing over long baselines. Because the ionosphere is a dispersive medium, its influence on the GPS signal can be computed with the ionosphere-free linear combination of the L1 and L2 observables, which requires dual-frequency receivers. In the case of single-frequency receivers, ionospheric effects are either neglected or reduced by using a model. In this paper, an alternative for single-frequency users is proposed. It involves multiresolution analysis (MRA) using a wavelet analysis of the double-difference observations to remove the short- and medium-scale ionospheric variations and disturbances, as well as some minor tropospheric effects. Experiments were carried out over three baseline lengths from 50 to 450 km, and the results provided by the proposed method were better than those from dual-frequency receivers. The horizontal root mean square error was about 0.28 m (1 sigma).
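
The dual-frequency benchmark referred to here is the standard ionosphere-free carrier phase combination (standard GNSS notation, not specific to this paper):

    \Phi_{IF} = \frac{f_1^2\, \Phi_1 - f_2^2\, \Phi_2}{f_1^2 - f_2^2},

which cancels the first-order ionospheric delay because that delay scales with 1/f^2 at each carrier frequency f.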

Relevance:

20.00%

Publisher:

Abstract:

The wavelet transform is used to reduce the high-frequency multipath of pseudorange and carrier phase GPS double differences (DDs). The transform decomposes the DD signal, thus separating the high frequencies due to multipath effects. After the decomposition, wavelet shrinkage is performed by thresholding, and the signal is then reconstructed without the high-frequency component. We show how to choose the best threshold. Although high-frequency multipath is not the main multipath error component, its correction provides improvements of about 30% in pseudorange average residuals and 24% in carrier phase residuals. The results also show that the ambiguity solutions become more reliable after correcting the high-frequency multipath.
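
A minimal sketch of this decompose, threshold, reconstruct cycle, assuming the double differences arrive as a one-dimensional NumPy array; the wavelet, level, and universal threshold are illustrative choices, whereas the paper studies the threshold choice itself:

    import numpy as np
    import pywt

    def remove_hf_multipath(dd_series, wavelet="db8", level=4):
        """Wavelet shrinkage of a GPS double-difference series: decompose,
        soft-threshold the detail coefficients (the high frequencies, where
        multipath shows up), and reconstruct."""
        coeffs = pywt.wavedec(dd_series, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate
        thresh = sigma * np.sqrt(2 * np.log(len(dd_series)))  # universal threshold
        coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(dd_series)]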

Relevance:

20.00%

Publisher:

Abstract:

The aim of the present study was to compare heart rate variability (HRV) at rest and during exercise using the RR time series obtained with a Polar S810i monitor and the signal from a LYNX® signal conditioner (BIO EMG 1000 model) with a channel configured for the acquisition of ECG signals. Fifteen healthy subjects aged 20.9 ± 1.4 years were analyzed. The subjects remained at rest for 20 min and performed exercise for another 20 min, with the workload selected to achieve 60% of submaximal heart rate. RR series were obtained for each individual with the Polar S810i instrument and from the ECG processed by the biological signal conditioner. The HRV indices (rMSSD, pNN50, LFnu, HFnu, and LF/HF) were calculated after signal processing and analysis. The unpaired Student t-test and the intraclass correlation coefficient were used for data analysis. No statistically significant differences were observed when comparing the values obtained with the two devices for HRV at rest or during exercise. The intraclass correlation coefficient demonstrated satisfactory agreement between the values obtained by the devices at rest (pNN50 = 0.994; rMSSD = 0.995; LFnu = 0.978; HFnu = 0.978; LF/HF = 0.982) and during exercise (pNN50 = 0.869; rMSSD = 0.929; LFnu = 0.973; HFnu = 0.973; LF/HF = 0.942). The calculation of HRV from the RR time series obtained with the Polar S810i instrument appears to be as reliable as that obtained by processing the ECG signal captured with the signal conditioner.
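
The time-domain indices compared above have simple standard definitions; a brief sketch, assuming RR intervals in milliseconds (the frequency-domain indices LFnu, HFnu and LF/HF additionally require a spectral estimate of the resampled RR series):

    import numpy as np

    def time_domain_hrv(rr_ms):
        """rMSSD and pNN50 from an RR-interval series in milliseconds."""
        rr = np.asarray(rr_ms, dtype=float)
        diffs = np.diff(rr)                            # successive differences
        rmssd = np.sqrt(np.mean(diffs ** 2))           # root mean square of diffs
        pnn50 = 100.0 * np.mean(np.abs(diffs) > 50.0)  # % of diffs exceeding 50 ms
        return rmssd, pnn50

    # Example: time_domain_hrv([812, 830, 795, 805, 871, 860])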