927 results for Error correction coding
Abstract:
The presence of high phase noise in addition to additive white Gaussian noise in coherent optical systems affects the performance of forward error correction (FEC) schemes. In this paper, we propose a simple scheme for such systems, using block interleavers and binary Bose–Chaudhuri–Hocquenghem (BCH) codes. The block interleavers are specifically optimized for differential quadrature phase shift keying modulation. We propose a method for selecting BCH codes that, together with the interleavers, achieve a target post-FEC bit error rate (BER). This combination of interleavers and BCH codes has very low implementation complexity. In addition, our approach is straightforward, requiring only short pre-FEC simulations to parameterize a model, based on which we select codes analytically. We aim to correct a pre-FEC BER of around (Formula presented.). We evaluate the accuracy of our approach using numerical simulations. For a target post-FEC BER of (Formula presented.), codes selected using our method result in BERs around 3(Formula presented.) target and achieve the target with around 0.2 dB extra signal-to-noise ratio.
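The code-selection model itself is not given in the abstract; as a rough, hedged illustration of the kind of analytical calculation such a selection could rest on, the sketch below estimates the post-FEC BER of a hypothetical t-error-correcting BCH-type code under an independent-error assumption, which is what the block interleaver is meant to approximate. The code length, correction capability and pre-FEC BER below are placeholder values, not those of the paper.

```python
# Sketch: post-FEC BER of a t-error-correcting block code of length n under
# bounded-distance hard-decision decoding, assuming independent bit errors at
# pre-FEC BER p (the interleaver's job is to make this assumption roughly hold).
from math import exp, lgamma, log

def log_binom_pmf(n: int, i: int, p: float) -> float:
    """log of C(n, i) * p**i * (1 - p)**(n - i), computed in log space for stability."""
    return (lgamma(n + 1) - lgamma(i + 1) - lgamma(n - i + 1)
            + i * log(p) + (n - i) * log(1.0 - p))

def post_fec_ber(n: int, t: int, p: float) -> float:
    """Approximate output BER: decoding fails when more than t errors occur,
    leaving roughly i erroneous bits out of n."""
    return sum(i * exp(log_binom_pmf(n, i, p)) for i in range(t + 1, n + 1)) / n

# Hypothetical numbers: a BCH-like code of length 1023 correcting t = 20 errors,
# operating at a pre-FEC BER of 1e-2.
print(post_fec_ber(1023, 20, 1e-2))
```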
Abstract:
We develop a framework for estimating the quality of transmission (QoT) of a new lightpath before it is established, as well as for calculating the expected degradation it will cause to existing lightpaths. The framework correlates the QoT metrics of established lightpaths, which are readily available from coherent optical receivers that can be extended to serve as optical performance monitors. Past similar studies used only space (routing) information and thus neglected spectrum, while they focused on old-generation noncoherent networks. The proposed framework accounts for correlation in both the space and spectrum domains and can be applied to both fixed-grid wavelength division multiplexing (WDM) and elastic optical networks. It is based on a graph transformation that exposes and models the interference between spectrum-neighboring channels. Our results indicate that our QoT estimates are very close to the actual performance data, that is, to having perfect knowledge of the physical layer. The proposed estimation framework is shown to provide up to 4 × 10⁻² lower pre-forward error correction bit error ratio (BER) compared to the worst-case interference scenario, which overestimates the BER. The higher accuracy can be harvested when lightpaths are provisioned with low margins; our results showed up to 47% reduction in required regenerators, a substantial saving in equipment cost.
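The graph transformation is not specified in detail in the abstract; purely as an illustration of exposing interference between spectrum-neighbouring channels, the sketch below builds a hypothetical interference graph in which lightpaths are nodes and an edge connects two lightpaths that share a fibre link and occupy adjacent spectrum. The data model and field names are assumptions, not the paper's.

```python
# Sketch (assumed data model): lightpaths as nodes, edges between lightpaths
# that share at least one fibre link and are adjacent in spectrum, so that
# QoT correlation can be propagated between spectral neighbours.
from dataclasses import dataclass
from itertools import combinations

@dataclass
class Lightpath:
    name: str
    links: set        # fibre links traversed, e.g. {("A", "B"), ("B", "C")}
    slots: range      # occupied spectrum slots

def spectrum_adjacent(a: Lightpath, b: Lightpath, guard: int = 1) -> bool:
    # Adjacent (or overlapping) spectrum within a one-slot guard band.
    return (a.slots.stop + guard >= b.slots.start and
            b.slots.stop + guard >= a.slots.start)

def interference_graph(lightpaths):
    edges = []
    for a, b in combinations(lightpaths, 2):
        if a.links & b.links and spectrum_adjacent(a, b):
            edges.append((a.name, b.name))
    return edges

lps = [
    Lightpath("lp1", {("A", "B"), ("B", "C")}, range(0, 4)),
    Lightpath("lp2", {("B", "C")}, range(4, 8)),   # spectrum neighbour of lp1
    Lightpath("lp3", {("C", "D")}, range(0, 4)),   # no shared link
]
print(interference_graph(lps))   # [('lp1', 'lp2')]
```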
Abstract:
Brain-computer interfaces (BCI) have the potential to restore communication or control abilities in individuals with severe neuromuscular limitations, such as those with amyotrophic lateral sclerosis (ALS). The role of a BCI is to extract and decode relevant information that conveys a user's intent directly from brain electro-physiological signals and translate this information into executable commands to control external devices. However, the BCI decision-making process is error-prone due to noisy electro-physiological data, representing the classic problem of efficiently transmitting and receiving information via a noisy communication channel.
This research focuses on P300-based BCIs which rely predominantly on event-related potentials (ERP) that are elicited as a function of a user's uncertainty regarding stimulus events, in either an acoustic or a visual oddball recognition task. The P300-based BCI system enables users to communicate messages from a set of choices by selecting a target character or icon that conveys a desired intent or action. P300-based BCIs have been widely researched as a communication alternative, especially in individuals with ALS who represent a target BCI user population. For the P300-based BCI, repeated data measurements are required to enhance the low signal-to-noise ratio of the elicited ERPs embedded in electroencephalography (EEG) data, in order to improve the accuracy of the target character estimation process. As a result, BCIs have relatively slower speeds when compared to other commercial assistive communication devices, and this limits BCI adoption by their target user population. The goal of this research is to develop algorithms that take into account the physical limitations of the target BCI population to improve the efficiency of ERP-based spellers for real-world communication.
In this work, it is hypothesised that building adaptive capabilities into the BCI framework can potentially give the BCI system the flexibility to improve performance by adjusting system parameters in response to changing user inputs. The research in this work addresses three potential areas for improvement within the P300 speller framework: information optimisation, target character estimation and error correction. The visual interface and its operation control the method by which the ERPs are elicited through the presentation of stimulus events. The parameters of the stimulus presentation paradigm can be modified to modulate and enhance the elicited ERPs. A new stimulus presentation paradigm is developed in order to maximise the information content that is presented to the user by tuning stimulus paradigm parameters to positively affect performance. Internally, the BCI system determines the amount of data to collect and the method by which these data are processed to estimate the user's target character. Algorithms that exploit language information are developed to enhance the target character estimation process and to correct erroneous BCI selections. In addition, a new model-based method to predict BCI performance is developed, an approach which is independent of stimulus presentation paradigm and accounts for dynamic data collection. The studies presented in this work provide evidence that the proposed methods for incorporating adaptive strategies in the three areas have the potential to significantly improve BCI communication rates, and the proposed method for predicting BCI performance provides a reliable means to pre-assess BCI performance without extensive online testing.
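The speller algorithms themselves cannot be reconstructed from the abstract; as a minimal sketch of why repeated measurements are collected, the code below averages simulated ERP epochs and reports the SNR gain, which grows roughly as the square root of the number of epochs under an independent-noise assumption. The toy waveform, noise level and epoch counts are invented for illustration.

```python
# Sketch: averaging repeated noisy ERP epochs improves SNR roughly by sqrt(N)
# when the noise is independent across repetitions (an idealising assumption).
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 0.8, 200)                        # 0-800 ms epoch
erp = 2.0 * np.exp(-((t - 0.3) ** 2) / 0.002)        # toy P300-like bump near 300 ms

def snr_db(signal, noisy):
    noise = noisy - signal
    return 10 * np.log10(np.mean(signal ** 2) / np.mean(noise ** 2))

for n_epochs in (1, 4, 16, 64):
    epochs = erp + rng.normal(0.0, 5.0, size=(n_epochs, t.size))  # single-trial noise
    avg = epochs.mean(axis=0)                                     # averaged ERP estimate
    print(n_epochs, "epochs:", round(snr_db(erp, avg), 1), "dB")
```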
Abstract:
Atomic ions trapped in micro-fabricated surface traps can be utilized as a physical platform with which to build a quantum computer. They possess many of the desirable qualities of such a device, including high fidelity state preparation and readout, universal logic gates, long coherence times, and can be readily entangled with each other through photonic interconnects. The use of optical cavities integrated with trapped ion qubits as a photonic interface presents the possibility for order-of-magnitude improvements in performance in several key areas of their use in quantum computation. The first part of this thesis describes the design and fabrication of a novel surface trap for integration with an optical cavity. The trap is custom made on a highly reflective mirror surface and includes the capability of moving the ion trap location along all three trap axes with nanometer scale precision. The second part of this thesis demonstrates the suitability of small micro-cavities formed from laser ablated fused silica substrates with radii of curvature in the 300-500 micron range for use with the mirror trap as part of an integrated ion trap cavity system. Quantum computing applications for such a system include dramatic improvements in the photonic entanglement rate (up to 10 kHz), the qubit measurement time (down to 1 microsecond), and the measurement error rate (down to the 10⁻⁵ range). The final part of this thesis details a performance simulator for exploring the physical resource requirements and performance demands to scale such a quantum computer to sizes capable of performing quantum algorithms beyond the limits of classical computation.
Abstract:
The study examines the short-run and long-run causality running from real economic growth to real foreign direct investment (RFDI) inflows. Other variables, namely education (a combination of primary, secondary and tertiary enrolment as a proxy for education), real development finance and unskilled labour, are also included as determinants of real FDI inflows. Time series data covering the period 1983-2013 are examined. First, I applied the Augmented Dickey-Fuller (ADF) test to check for unit roots in the variables; the findings show that all variables are integrated of order one [I(1)]. Thereafter, the Johansen Cointegration Test (JCT) was conducted to establish the relationship among the variables; both the trace and maximum eigenvalue statistics at the 5% level of significance indicate three cointegrating equations. A vector error correction model (VECM) was applied to capture the short-run and long-run causality running from education, economic growth, real development finance and unskilled labour to real foreign direct investment inflows in the Republic of Rwanda. The findings show no short-run causality running from education, real development finance, real GDP and unskilled labour to real FDI inflows, but long-run causality does exist. This can be interpreted to mean that, in the short run, education, development finance and economic growth do not influence inflows of foreign direct investment in Rwanda, but they do in the long run. From a policy perspective, the Republic of Rwanda should focus more on the long-term goal of investing in education to improve human capital, undertake policy reforms that promote economic growth, and promote good governance to attract development finance, especially from the Nordic countries (particularly Norway and Denmark).
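The underlying data are not available here; as a sketch of the testing pipeline described (ADF unit-root tests, the Johansen cointegration test, then a VECM), the code below applies statsmodels to placeholder series. Variable names, lag orders and the cointegration rank are assumptions for illustration only.

```python
# Sketch of the ADF -> Johansen -> VECM pipeline on placeholder annual data.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

rng = np.random.default_rng(1)
n = 31                                   # 1983-2013, as in the study
common = rng.normal(size=n).cumsum()     # shared stochastic trend (placeholder)
df = pd.DataFrame({
    "rfdi":      common + rng.normal(scale=0.3, size=n),
    "gdp":       0.8 * common + rng.normal(scale=0.3, size=n),
    "education": 0.5 * common + rng.normal(scale=0.3, size=n),
})

# 1) ADF unit-root test on each series (H0: unit root).
for col in df:
    stat, pvalue, *_ = adfuller(df[col])
    print(f"ADF {col}: stat={stat:.2f}, p={pvalue:.2f}")

# 2) Johansen test: compare trace statistics with 5% critical values.
jres = coint_johansen(df, det_order=0, k_ar_diff=1)
print("trace stats:", jres.lr1, "5% critical values:", jres.cvt[:, 1])

# 3) VECM with the chosen cointegration rank (rank 1 here, for illustration).
vecm = VECM(df, k_ar_diff=1, coint_rank=1, deterministic="ci").fit()
print(vecm.alpha)     # short-run adjustment (error-correction) coefficients
```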
Abstract:
In this paper, we show how the polarisation state of a linearly polarised antenna can be recovered through the use of a three-term error correction model. The approach adopted is shown to be robust in situations where some multipath exists and where the sampling channels are imperfect with regard to both their amplitude and phase tracking. In particular, it is shown that the error of the measured polarisation tilt angle can be reduced from 33% to 3% or below by applying the proposed calibration method. It is described how one can use a rotating dipole antenna as both the calibration standard and the polarisation encoder, thus simplifying the physical arrangement of the transmitter. Experimental results are provided to show the utility of the approach, which could have a variety of applications, including bandwidth-conservative polarisation sub-modulation in advanced wireless communications systems.
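The paper's exact polarisation model is not reproduced in the abstract; as a hedged sketch of a generic three-term error correction model (the one-port formulation familiar from network-analyser calibration, which this approach resembles), the code below solves for three complex error terms from three known calibration states and then corrects a raw measurement. Whether the paper uses precisely this formulation, and the choice of calibration states, are assumptions.

```python
# Sketch: generic three-term error correction.
# Model: m = e00 + e10e01*a / (1 - e11*a), which is linear in (e00, e11, Delta)
# with Delta = e00*e11 - e10e01:   m = e00 + e11*(a*m) - Delta*a
# Three known standards a1..a3 and their raw measurements m1..m3 give a 3x3 system.
import numpy as np

def solve_error_terms(a, m):
    a, m = np.asarray(a, dtype=complex), np.asarray(m, dtype=complex)
    A = np.column_stack([np.ones(3), a * m, -a])
    e00, e11, delta = np.linalg.solve(A, m)
    return e00, e11, delta

def correct(m, e00, e11, delta):
    # Invert the model: a = (m - e00) / (e11*m - delta)
    return (m - e00) / (e11 * m - delta)

# Hypothetical check: pick error terms, simulate raw measurements, then recover.
e00, e11, e10e01 = 0.05 + 0.02j, 0.10 - 0.03j, 0.9 + 0.1j
model = lambda a: e00 + e10e01 * a / (1 - e11 * a)
standards = [1.0, -1.0, 0.0]                       # assumed known calibration states
terms = solve_error_terms(standards, [model(a) for a in standards])
print(abs(correct(model(0.3 + 0.4j), *terms) - (0.3 + 0.4j)))   # ~0 if recovered
```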
Abstract:
Data from the World Federation of Exchanges show that Brazil's Sao Paulo stock exchange is one of the largest worldwide in terms of market value. The objective of this study is therefore to obtain univariate and bivariate forecasting models based on intraday data from the futures and spot markets of the BOVESPA index. The interest is to verify whether arbitrage opportunities exist in the Brazilian financial market. To this end, three econometric forecasting models were built: ARFIMA, vector autoregressive (VAR) and vector error correction (VEC). Furthermore, the results of a Granger causality test for the aforementioned series are presented. This type of study is important for identifying arbitrage opportunities in financial markets and, in particular, for the application of these models to data of this nature. In terms of the forecasts made with these models, the VEC model showed the best results. The causality test shows that the futures BOVESPA index Granger-causes the spot BOVESPA index. This result may indicate arbitrage opportunities in Brazil.
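The BOVESPA data themselves are not available here; as a small sketch of the Granger causality step, the code below uses statsmodels to test whether a futures return series Granger-causes a spot return series on placeholder data. The series construction, column names and lag order are assumptions.

```python
# Sketch: Granger causality test "futures -> spot" on placeholder intraday returns.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(2)
n = 500
fut = rng.normal(size=n)
spot = np.roll(fut, 1) * 0.5 + rng.normal(scale=0.5, size=n)   # spot lags futures

# grangercausalitytests checks whether the SECOND column Granger-causes the first.
data = pd.DataFrame({"spot": spot, "futures": fut})
res = grangercausalitytests(data[["spot", "futures"]], maxlag=2)
print(res[1][0]["ssr_ftest"])   # (F statistic, p-value, df_denom, df_num) at lag 1
```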
Abstract:
This thesis examines the effect of price risk on the decisions of Quebec farmers and processors. It is divided into three chapters. The first chapter reviews the literature. The second chapter examines the effect of price risk on the production of three products, namely grain corn, pork and lamb, in the province of Quebec. The last chapter focuses on the analysis of changes in Quebec pork processors' preferences with respect to market choice. The first chapter aims to show the importance of the effect of price risk on the quantity produced by farmers, as highlighted in the literature. Indeed, the literature reveals the importance of export price risk for international trade. The second chapter is devoted to the study of risk factors (price expectations and price volatility) in the supply function. A generalized autoregressive conditional heteroskedasticity (GARCH) model is used to model these risk factors, and the model parameters are estimated by full information maximum likelihood (FIML). The empirical results show a negative effect of price volatility on production, whereas price predictability has a positive effect on the quantity produced. As expected, we find that the application of the farm income stabilization insurance program (ASRA) in Quebec makes supply more responsive to the effective price (the price including ASRA compensation) than to the market price. Moreover, supply is less sensitive to input prices than to the output price. A reduction in producers' risk aversion is another consequence of this program. In addition, estimation of the marginal risk premium reveals that corn producers are the least risk-averse (compared with pork or lamb producers). The third chapter analyses the change in Quebec pork processors' preferences regarding market choice. We assume that the processor can supply products to two markets: foreign and domestic. The theoretical model explains relative supply as a function of both relative price expectations and relative price volatility, and it shows that the sensitivity of relative supply to relative price volatility depends on two factors: the share of exports in total production and the elasticity of substitution between the two markets. An error correction model is used to estimate the model parameters. The results show a positive and significant effect of relative price expectations on relative supply in the short run. They also show that an increase in price volatility on the foreign market relative to the domestic market leads to a decrease in relative supply on the foreign market in the long run. Moreover, according to the results, the foreign and domestic markets are more substitutable in the long run than in the short run.
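The full FIML supply system is beyond what the abstract allows to reconstruct; as a sketch of one ingredient, the conditional price volatility that enters the supply equation as a risk factor, the code below fits a GARCH(1,1) to placeholder price returns with the arch package and extracts the conditional volatility series. All data and settings are hypothetical.

```python
# Sketch: conditional price volatility from a GARCH(1,1), the kind of risk
# regressor that would enter a supply equation alongside expected price.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(3)
returns = pd.Series(rng.normal(scale=1.0, size=400))   # placeholder price returns (%)

res = arch_model(returns, vol="GARCH", p=1, q=1).fit(disp="off")
conditional_volatility = res.conditional_volatility    # price-risk proxy
print(res.params)
print(conditional_volatility.tail())
```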
Abstract:
This research investigates hedge effectiveness and the optimal hedge ratio for the futures markets of cattle, coffee, ethanol, corn and soybean. The paper estimates the optimal hedge ratio and hedge effectiveness using multivariate GARCH models with error correction, paying attention to a possible differential in the optimal hedge ratio between the crop and intercrop periods. The optimal hedge ratio is expected to be larger in the intercrop period because of the uncertainty related to a possible supply shock (LAZZARINI, 2010). Among the futures contracts studied here, the coffee, ethanol and soybean contracts had not yet been examined for this phenomenon, and the corn and ethanol contracts had not been the object of research dealing with dynamic hedging strategies. This paper also distinguishes itself by including a GARCH model with error correction, which had not previously been considered in investigations of a possible optimal hedge ratio differential between the crop and intercrop periods. Commodity quotations on BM&FBOVESPA were used as futures prices and the CEPEA index as the spot market, with daily frequency, in the period from May 2010 to June 2013 for cattle, coffee, ethanol and corn, and to August 2012 for soybean. Similar results were obtained for all commodities: there is a long-term relationship between the spot and futures markets, bicausality between the spot and futures markets for cattle, coffee, ethanol and corn, and unidirectional causality from the futures price of soybean to the spot price. The optimal hedge ratio was estimated using three different strategies: linear regression by OLS, a diagonal BEKK-GARCH model, and a diagonal BEKK-GARCH model with an intercrop dummy. The OLS regression pointed to hedge inefficiency, given that the estimated optimal hedge ratio was too low. The second model represents a dynamic hedging strategy, capturing time variation in the optimal hedge. The last strategy did not detect an optimal hedge ratio differential between the crop and intercrop periods; therefore, contrary to expectations, investors do not need to increase their position in the futures market during the intercrop period.
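The BEKK-GARCH estimation is not reproduced here; as a simpler sketch of the two baseline ideas mentioned (a static OLS-type hedge ratio versus a time-varying one), the code below computes h* = Cov(Δspot, Δfutures) / Var(Δfutures) over the full sample and on a rolling window, the latter as a crude stand-in for the dynamic hedge. Data and window length are placeholders.

```python
# Sketch: static vs. rolling optimal hedge ratio h* = Cov(ds, df) / Var(df).
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n = 750                                           # roughly three years of daily data
d_fut = rng.normal(scale=1.0, size=n)             # futures price changes (placeholder)
d_spot = 0.9 * d_fut + rng.normal(scale=0.4, size=n)
df = pd.DataFrame({"d_spot": d_spot, "d_fut": d_fut})

# Static (OLS-equivalent) hedge ratio over the full sample.
static_h = df["d_spot"].cov(df["d_fut"]) / df["d_fut"].var()

# Dynamic hedge ratio from a 60-day rolling window (stand-in for BEKK-GARCH).
rolling_h = df["d_spot"].rolling(60).cov(df["d_fut"]) / df["d_fut"].rolling(60).var()

print(round(static_h, 3))
print(rolling_h.dropna().describe()[["mean", "std"]])
```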
Abstract:
This paper analyzes the dynamics of the American Depositary Receipt (ADR) of a Colombian bank (Bancolombia) in relation to its pricing factors (the underlying (preferred) share price, the exchange rate and the US market index). The aim is to test whether there is a long-term relation among these variables that would imply predictability. One cointegrating relation is found, allowing the use of a vector error correction model to examine the transmission of shocks to the underlying price, the exchange rate and the US market index. The main finding is that, in the short run, the underlying share price seems to adjust after changes in the ADR price, pointing to the fact that the NYSE (the trading market for the ADR) leads the Colombian market. In the long run, however, the underlying share price and the ADR price adjust to changes in one another.
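As a minimal sketch of the first step of such an analysis, testing for a cointegrating relation between the ADR and the underlying share price, the code below runs an Engle-Granger test with statsmodels on placeholder log-price series; the series construction is an assumption, not the paper's data.

```python
# Sketch: Engle-Granger cointegration test between the ADR price and the
# underlying share price converted to USD (placeholder series, log levels).
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(5)
n = 1000
log_underlying_usd = rng.normal(scale=0.01, size=n).cumsum()       # random walk
log_adr = log_underlying_usd + rng.normal(scale=0.005, size=n)     # tied to it

t_stat, p_value, crit = coint(log_adr, log_underlying_usd)
print(f"Engle-Granger t={t_stat:.2f}, p={p_value:.3f}")   # small p -> cointegrated
```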
Abstract:
Since Long's Interaction Hypothesis (Long, 1983), multiple studies have suggested that oral interaction is needed for successful second language learning. Within this perspective, a great deal of research has been carried out to investigate the role of corrective feedback in the process of acquiring a second language, but several debates about this issue remain open. This comparative study seeks to contribute to the existing literature on corrective feedback in oral interaction by exploring teachers' corrective techniques and students' responses to these corrections. Two learning contexts were observed and compared: a traditional English as a foreign language (EFL) classroom and a Content and Language Integrated Learning (CLIL) classroom. The main aim was to see whether our data conform to the Counterbalance Hypothesis proposed by Lyster and Mori (2006). Although the results did not show significant differences between the two contexts, a qualitative analysis of the data shed some light on the differences between these two language teaching settings. The findings point to the need for further research on error correction in EFL and CLIL contexts in order to overcome the limitations of the present study.
Abstract:
Master's dissertation, Oncobiology - Molecular Mechanisms of Cancer, Department of Biomedical Sciences and Medicine, Universidade do Algarve, 2016
Abstract:
The first chapter provides evidence that aggregate Research and Development (R&D) investment drives a persistent component in productivity growth and that this embodies a risk priced in financial markets. In a semi-endogenous growth model, this component is identified by the R&D in excess of equilibrium levels and can be approximated by the Error Correction Term in the cointegration between R&D and Total Factor Productivity. Empirically, the component is well defined and satisfies all key theoretical predictions: it exhibits the appropriate persistence, it forecasts productivity growth, and it is associated with a cross-sectional risk premium. The CAPM is the most foundational model in financial economics, but it is known to empirically underestimate the expected returns of low-risk assets and overestimate those of high-risk assets. The second chapter studies how risk omission and funding tightness jointly contribute to explaining this anomaly, with the former affecting the definition of assets' riskiness and the latter affecting how risk is remunerated. Theoretically, the two effects are shown to counteract each other. Empirically, the spread related to binding leverage constraints is found to be significant at 2% yearly. Nonetheless, the average returns of portfolios that exploit this anomaly are found mostly to reflect omitted risks, in contrast to their use in previous literature. The third chapter studies how the 'sustainability' of assets affects discount rates, a relationship intrinsically mediated by the risk profile of the assets themselves. This has implications for the assessment of the sustainability-related spread and for hedging changes in sustainability concerns. The mechanism is tested on the ESG-score dimension using US data, with inconclusive evidence regarding the existence of an ESG-related premium in the first place. Moreover, for the time being, the risk profile of the long-short ESG portfolio is unlikely to affect the sign of its average returns relative to the sustainability spread.
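As a hedged sketch of how such an Error Correction Term could be constructed, the code below runs an Engle-Granger-style first-stage regression of log TFP on log R&D on placeholder series, takes the residual as the ECT, and checks whether the lagged ECT helps explain next-period productivity growth. All series, the sample length and the variable names are assumptions.

```python
# Sketch: Engle-Granger-style Error Correction Term between log R&D and log TFP,
# then a check of whether the lagged ECT forecasts next-period TFP growth.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 240                                        # placeholder quarterly sample
log_rd = rng.normal(scale=0.02, size=n).cumsum()
log_tfp = 0.3 * log_rd + rng.normal(scale=0.01, size=n)

# First stage: cointegrating regression; the residual is the Error Correction Term.
first = sm.OLS(log_tfp, sm.add_constant(log_rd)).fit()
ect = first.resid

# Second stage: does the lagged ECT forecast TFP growth?
d_tfp = np.diff(log_tfp)
second = sm.OLS(d_tfp, sm.add_constant(ect[:-1])).fit()
print(second.params, second.tvalues)
```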