865 results for DNA Error Correction
Abstract:
Electro-optical transceivers can be implemented with all-analog signal processing to achieve low power consumption and latency. This paper shows that the spectral efficiency of such solutions can be increased by combining orthogonal multicarrier techniques and off-the-shelf microwave components. A real-time 108-Gbit/s experiment was performed emulating a wavelength division multiplexing (WDM) system composed of five optical channels. The optical carriers were provided by an externally injected gain-switched optical frequency comb. Each optical channel transmitted a 21.6-Gbit/s orthogonal subcarrier multiplexing (SCM) signal that was modulated and demodulated in the electrical domain without the need for digital signal processing. The net data rate remained higher than 100 Gbit/s after accounting for forward error correction overheads. The use of orthogonally overlapping subchannels achieves an unprecedented spectral efficiency in all-analog real-time broadband WDM/SCM links.
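The property this abstract relies on, that subcarriers spaced by exactly the symbol rate overlap spectrally yet separate cleanly, can be demonstrated numerically. The following is an illustrative baseband sketch, not the paper's all-analog microwave implementation; all frequencies and symbol values are assumed for the demo.

```python
import numpy as np

# Two subcarriers spaced by exactly 1/T (T = one symbol period) are
# orthogonal: the average of their product over T is zero, so a simple
# correlator separates them despite full spectral overlap.
fs = 1000                      # samples per symbol period
t = np.arange(fs) / fs         # one symbol period, T = 1
f1, f2 = 5.0, 6.0              # subcarrier frequencies spaced by 1/T

s1 = np.cos(2 * np.pi * f1 * t)
s2 = np.cos(2 * np.pi * f2 * t)

a1, a2 = +1.0, -1.0            # BPSK symbols carried on each subcarrier
rx = a1 * s1 + a2 * s2         # overlapping multiplexed signal

# Matched-filter / correlator outputs recover the symbols.
est1 = 2 * np.mean(rx * s1)
est2 = 2 * np.mean(rx * s2)
print(f"{est1:+.3f} {est2:+.3f}")   # -> +1.000 -1.000
```

The factor of 2 compensates for the mean of cos² being 1/2 over a full period; with non-integer spacing the cross term would no longer vanish and the subchannels would interfere.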
Abstract:
This dissertation examines monetary models of exchange rate determination for Brazil, Canada, and two Caribbean countries, namely the Dominican Republic and Jamaica. With the exception of Canada, these countries adopted a floating regime during the past ten years. The empirical validity of four seminal models in exchange rate economics was determined. Three of these models were entirely classical (Bilson and Frenkel) or Keynesian (Dornbusch) in nature. The fourth model (the Real Interest Differential Model) was a mixture of the two schools of economic theory. There is no clear empirical evidence of the validity of the monetary models. However, the signs of the coefficients of the nominal interest differential variable were as predicted by the Keynesian hypothesis in the case of Canada and as predicted by the Chicago theorists in the remaining countries. Moreover, in the case of Brazil, due to hyperinflation, the exchange rate is heavily influenced by the domestic money supply. I also tested purchasing power parity (PPP) for the same set of countries. For both the monetary and the PPP hypotheses, I tested for co-integration and applied an ordinary least squares estimation procedure. The error correction model was also used for the PPP model to determine convergence to equilibrium. The validity of PPP is also questionable for my set of countries. Endogeneity among the regressors as well as the lack of proper price indices are contributing factors. More importantly, Central Bank intervention negates rapid adjustment of prices and exchange rates to their equilibrium values.
However, the PPP model's forecasting capability for the period 1993-1994 is superior to that of the monetary models in two of the four cases. I conclude that, in spite of the questionable validity of these models, the monetary models give better results in the case of "smaller" economies like the Dominican Republic and Jamaica, where monetary influences swamp the other determinants of the exchange rate.
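The error correction mechanism invoked above can be sketched with the two-step Engle-Granger procedure on simulated data. This is a minimal numpy-only illustration under assumed parameters; the dissertation's actual series, variables, and estimators are not reproduced here.

```python
import numpy as np

# Two-step Engle-Granger error correction model on simulated cointegrated
# series (illustrative stand-ins, e.g. log exchange rate vs. a monetary
# fundamental). A negative adjustment coefficient on the lagged residual
# indicates convergence back to the long-run equilibrium.
rng = np.random.default_rng(0)
n = 500
x = np.cumsum(rng.normal(size=n))        # I(1) fundamental (random walk)
y = 2.0 + 1.5 * x + rng.normal(size=n)   # cointegrated with x

# Step 1: estimate the long-run relation by OLS; keep the residual.
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
ect = y - X @ beta                        # error correction term

# Step 2: regress the first difference of y on the lagged residual
# (plus the contemporaneous difference of x).
dy = np.diff(y)
Z = np.column_stack([np.ones(n - 1), ect[:-1], np.diff(x)])
gamma = np.linalg.lstsq(Z, dy, rcond=None)[0]
print(gamma[1])   # adjustment coefficient: negative => convergence
```

A value of `gamma[1]` near zero would indicate the slow adjustment to equilibrium that the dissertation attributes to Central Bank intervention.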
Abstract:
This dissertation investigated the effects of a peer coaching relationship between a special education teacher and two general education teachers. More specifically, a two-tier multiple baseline design across subjects was used to evaluate the effects of peer coaching on the general education teachers' use of effective instructional practices (EIPs) and the subsequent effects on the engagement rate and academic performance of students with and without disabilities. The peer coaching process included modeling, direct support, and feedback on the use of effective instructional practices, including getting student attention, giving specific directions, asking specific questions with wait time, contingent positive reinforcement, positive error correction, precorrection, prompting, and proximity control. A 30-second partial interval recording procedure was used to observe the general education teachers' use of effective instructional practices and student engagement rates. Student participants' academic performance was measured using weekly quizzes. Peer coaching resulted in an overall increase in the teachers' use of EIPs. One general education teacher increased average EIP use by 30 percentage points, from 46% during the baseline phase to 76% during intervention. Student engagement for her two student participants with and without disabilities increased from 54% to 69% and from 47% to 65%, respectively. Results for the second general education teacher indicated a 34-percentage-point increase in average EIP use, from 55% during baseline to 89% during intervention. Student engagement for the two student participants with and without disabilities in her class increased from 48% to 83% and from 29% to 71%, respectively. Student academic performance showed a small increase. In follow-up observations, the effects of peer coaching on teacher use of EIPs and student engagement and academic performance were maintained.
The results of this study suggest that using peer coaching to support general education teachers can be an effective method to improve the educational outcomes of students with and without disabilities in general education. Further research is needed to investigate the effects of peer coaching with other special and general educator partnerships and other student participants.
Abstract:
This study investigated the effects of repeated readings on the reading abilities of four third-, fourth-, and fifth-grade English language learners (ELLs) with specific learning disabilities (SLD). A multiple baseline probe design across subjects was used to explore the effects of repeated readings on four dependent variables: reading fluency (words read correctly per minute; wpm), number of errors per minute (epm), types of errors per minute, and answers to literal comprehension questions. Data were collected and analyzed during baseline, intervention, generalization probes, and maintenance probes. Throughout the baseline and intervention phases, participants read a passage aloud and received error correction feedback. During baseline, this was followed by fluency and literal comprehension question assessments. During intervention, this was followed by two oral repeated readings of the passage. Then the fluency and literal comprehension question assessments were administered. Generalization probes followed approximately 25% of all sessions and consisted of a single reading of a new passage at the same readability level. Maintenance sessions occurred 2, 4, and 6 weeks after the intervention ended. The results of this study indicated that repeated readings had a positive effect on the reading abilities of ELLs with SLD. Participants read more wpm, made fewer epm, and answered more literal comprehension questions correctly. Additionally, on average, generalization scores were higher in intervention than in baseline. Maintenance scores varied when compared to the last day of intervention; however, with the exception of the number of hesitations committed per minute, maintenance scores were higher than baseline means. This study demonstrated that repeated readings improved the reading abilities of ELLs with SLD and that gains were generalized to untaught passages.
Maintenance probes 2, 4, and 6 weeks following intervention indicated that mean reading fluency, errors per minute, and correct answers to literal comprehension questions remained above baseline levels. Future research should investigate the use of repeated readings with ELLs with SLD at various stages of reading acquisition. Further, future investigations may examine how repeated readings can be integrated into classroom instruction and assessments.
Abstract:
The Laurentide Ice Sheet (LIS) was a large, dynamic ice sheet in the early Holocene. The glacial events through Hudson Strait leading to its eventual demise are recorded in the well-dated Labrador shelf core MD99-2236 from the Cartwright Saddle. We develop a detailed history of the timing of ice-sheet discharge events from the Hudson Strait outlet of the LIS during the Holocene using high-resolution detrital carbonate, ice-rafted detritus (IRD), δ18O, and sediment color data. Eight detrital carbonate peaks (DCPs) associated with IRD peaks and light oxygen isotope events punctuate the MD99-2236 record between 11.5 and 8.0 ka. We use the stratigraphy of the DCPs developed from MD99-2236 to select the appropriate ΔR to calibrate the ages of recorded glacial events in Hudson Bay and Hudson Strait such that they match the DCPs in MD99-2236. We associate the eight DCPs with H0, the Gold Cove advance, the Noble Inlet advance, the initial retreat of the Hudson Strait ice stream (HSIS) from Hudson Strait, the opening of the Tyrrell Sea, and the drainage of glacial lakes Agassiz and Ojibway. The opening of Foxe Channel and retreat of glacial ice from Foxe Basin are represented by a shoulder in the carbonate data. A ΔR of 350 years applied to the radiocarbon ages constraining glacial events H0 through the opening of the Tyrrell Sea provided the best match with the MD99-2236 DCPs; ΔR values and ages from the literature are used for the younger events. A very close age match was achieved between the 8.2 ka cold event in the Greenland ice cores, DCP7 (8.15 ka BP), and the drainage of glacial lakes Agassiz and Ojibway.
Our stratigraphic comparison between the DCPs in MD99-2236 and the calibrated ages of Hudson Strait/Bay deglacial events shows that the retreat of the HSIS, the opening of the Tyrrell Sea, and the catastrophic drainage of glacial lakes Agassiz and Ojibway at 8.2 ka are separate events that have been combined in previous estimates of the timing of the 8.2 ka event from marine records. SW Iceland shelf core MD99-2256 documents freshwater entrainment into the subpolar gyre from the Hudson Strait outlet via the Labrador, North Atlantic, and Irminger currents. The timing of freshwater release from the LIS Hudson Strait outlet in MD99-2236 matches evidence for freshwater forcing and LIS icebergs carrying foreign minerals to the SW Iceland shelf between 11.5 and 8.2 ka. The congruency of these records supports the conclusion that freshwater from the retreat of the LIS was entrained through Hudson Strait into the subpolar gyre, and it identifies specific time periods when pulses of LIS freshwater were present to influence climate.
Abstract:
The presence of high phase noise in addition to additive white Gaussian noise in coherent optical systems affects the performance of forward error correction (FEC) schemes. In this paper, we propose a simple scheme for such systems, using block interleavers and binary Bose–Chaudhuri–Hocquenghem (BCH) codes. The block interleavers are specifically optimized for differential quadrature phase shift keying modulation. We propose a method for selecting BCH codes that, together with the interleavers, achieve a target post-FEC bit error rate (BER). This combination of interleavers and BCH codes has very low implementation complexity. In addition, our approach is straightforward, requiring only short pre-FEC simulations to parameterize a model, based on which we select codes analytically. We aim to correct a pre-FEC BER of around (Formula presented.). We evaluate the accuracy of our approach using numerical simulations. For a target post-FEC BER of (Formula presented.), codes selected using our method result in BERs around 3(Formula presented.) target and achieve the target with around 0.2 dB extra signal-to-noise ratio.
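The role of the block interleaver described above, spreading the burst errors caused by phase noise across multiple codewords so each stays within the BCH code's correction capability, can be sketched as follows. The dimensions are illustrative, not the optimized interleaver parameters of the paper, and no actual BCH decoding is performed.

```python
import numpy as np

# Row-write / column-read block interleaver: a burst of B consecutive
# channel errors lands on at most ceil(B / cols) symbols of any one
# codeword (one codeword per row), which a BCH code can then correct.
rows, cols = 4, 8                    # 4 codewords of 8 symbols each

def interleave(symbols):
    # Write row by row, read out column by column.
    return symbols.reshape(rows, cols).T.reshape(-1)

def deinterleave(symbols):
    # Inverse permutation: write column by column, read row by row.
    return symbols.reshape(cols, rows).T.reshape(-1)

data = np.arange(rows * cols)
tx = interleave(data)

rx = tx.copy()
rx[8:12] = -1                        # burst of 4 consecutive channel errors
codewords = deinterleave(rx).reshape(rows, cols)
errors_per_codeword = (codewords == -1).sum(axis=1)
print(errors_per_codeword)           # each codeword sees only one error
```

Without the interleaver, the same burst would put all four errors into a single codeword, likely exceeding the code's error-correcting capability.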
Abstract:
This paper looks at the benefits and limitations of content distribution using Forward Error Correction (FEC) in conjunction with the Transmission Control Protocol (TCP). FEC can be used to reduce the number of retransmissions that would usually result from a lost packet. The requirement for TCP to deal with any losses is then greatly reduced. There is, however, a side-effect to using FEC as a countermeasure to packet loss: an additional bandwidth requirement. For applications such as real-time video conferencing, delay must be kept to a minimum, and retransmissions are certainly not desirable. A balance must therefore be struck between additional bandwidth and delay due to retransmissions. Our results show that, when packet loss occurs, the throughput of data can be significantly improved by using a combination of FEC and TCP, compared to relying solely on TCP for retransmissions. Furthermore, a case study applies the result to demonstrate the achievable improvements in the quality of streaming video perceived by end users.
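The simplest form of the packet-level FEC idea described above is a single XOR parity packet per block, which recovers any one lost packet without a retransmission. This is a minimal sketch of that mechanism, not the specific FEC code evaluated in the paper; the packet contents are made up for the demo.

```python
from functools import reduce

def xor_bytes(a, b):
    # Byte-wise XOR of two equal-length packets.
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(packets):
    # One parity packet per block: the XOR of all data packets.
    return reduce(xor_bytes, packets)

def recover(received, parity):
    # With exactly one packet lost (None), XOR-ing the parity with all
    # survivors reconstructs it -- no TCP retransmission needed.
    survivors = [p for p in received if p is not None]
    return reduce(xor_bytes, survivors + [parity])

block = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
parity = make_parity(block)                        # 25% bandwidth overhead
received = [block[0], None, block[2], block[3]]    # packet 1 lost in transit
restored = recover(received, parity)
print(restored)   # -> b'pkt1'
```

The parity packet is the bandwidth cost the paper weighs against retransmission delay: here one extra packet per four protects against any single loss, but two losses in the same block still force TCP to retransmit.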
Abstract:
In this work, we present an adaptive unequal loss protection (ULP) scheme for H.264/AVC video transmission over lossy networks. This scheme combines erasure coding, H.264/AVC error resilience techniques, and importance measures in video coding. The unequal importance of the video packets is identified at the group of pictures (GOP) and H.264/AVC data partitioning levels. The presented method can adaptively assign unequal amounts of forward error correction (FEC) parity across the video packets according to network conditions such as the available network bandwidth, packet loss rate, and average packet burst loss length. A near-optimal algorithm is developed to optimize the FEC assignment. The simulation results show that our scheme can effectively utilize network resources such as bandwidth while improving the quality of the video transmission. In addition, the proposed ULP strategy ensures graceful degradation of the received video quality as the packet loss rate increases. © 2010 IEEE.
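The unequal-allocation step can be illustrated with a deliberately simplified, hypothetical proportional scheme. The paper develops a near-optimal algorithm driven by network conditions; that algorithm is not reproduced here, and the importance weights below are invented for the example.

```python
def allocate_parity(importance, budget):
    """Hypothetical proportional ULP allocation: split a fixed parity
    budget across packet classes in proportion to their importance,
    giving rounding leftovers to the largest fractional shares first."""
    total = sum(importance)
    shares = [budget * w / total for w in importance]
    alloc = [int(s) for s in shares]                  # floor of each share
    leftovers = budget - sum(alloc)
    order = sorted(range(len(importance)),
                   key=lambda i: shares[i] - alloc[i], reverse=True)
    for i in order[:leftovers]:
        alloc[i] += 1
    return alloc

# Invented GOP example: I-frame packets weighted far above P- and B-frame
# packets, so they receive most of the parity and degrade last.
print(allocate_parity([8, 3, 1], budget=12))   # -> [8, 3, 1]
```

A real ULP optimizer would additionally weigh the loss rate and burst length against each class's contribution to decoded quality, which is what makes the paper's assignment problem nontrivial.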
Abstract:
We develop a framework for estimating the quality of transmission (QoT) of a new lightpath before it is established, as well as for calculating the expected degradation it will cause to existing lightpaths. The framework correlates the QoT metrics of established lightpaths, which are readily available from coherent optical receivers that can be extended to serve as optical performance monitors. Past similar studies used only space (routing) information and thus neglected spectrum, while they focused on old-generation noncoherent networks. The proposed framework accounts for correlation in both the space and spectrum domains and can be applied to both fixed-grid wavelength division multiplexing (WDM) and elastic optical networks. It is based on a graph transformation that exposes and models the interference between spectrum-neighboring channels. Our results indicate that our QoT estimates are very close to the actual performance data, that is, to having perfect knowledge of the physical layer. The proposed estimation framework is shown to provide up to 4 × 10⁻² lower pre-forward error correction bit error ratio (BER) compared to the worst-case interference scenario, which overestimates the BER. The higher accuracy can be harvested when lightpaths are provisioned with low margins; our results showed up to a 47% reduction in required regenerators, a substantial savings in equipment cost.
Abstract:
We quantify the error statistics and patterning effects in a 5 × 40 Gbit/s WDM RZ-DBPSK SMF/DCF fibre link using hybrid Raman/EDFA amplification. We propose an adaptive constrained coding for the suppression of errors due to patterning effects. It is established that this coding technique can greatly reduce the bit error rate (BER) even for large BER values (BER > 10⁻¹). The proposed approach can be used in combination with forward error correction (FEC) schemes to correct errors even when the real channel BER is outside the FEC workspace.
Abstract:
Brain-computer interfaces (BCI) have the potential to restore communication or control abilities in individuals with severe neuromuscular limitations, such as those with amyotrophic lateral sclerosis (ALS). The role of a BCI is to extract and decode relevant information that conveys a user's intent directly from brain electro-physiological signals and translate this information into executable commands to control external devices. However, the BCI decision-making process is error-prone due to noisy electro-physiological data, representing the classic problem of efficiently transmitting and receiving information via a noisy communication channel.
This research focuses on P300-based BCIs which rely predominantly on event-related potentials (ERP) that are elicited as a function of a user's uncertainty regarding stimulus events, in either an acoustic or a visual oddball recognition task. The P300-based BCI system enables users to communicate messages from a set of choices by selecting a target character or icon that conveys a desired intent or action. P300-based BCIs have been widely researched as a communication alternative, especially in individuals with ALS who represent a target BCI user population. For the P300-based BCI, repeated data measurements are required to enhance the low signal-to-noise ratio of the elicited ERPs embedded in electroencephalography (EEG) data, in order to improve the accuracy of the target character estimation process. As a result, BCIs have relatively slower speeds when compared to other commercial assistive communication devices, and this limits BCI adoption by their target user population. The goal of this research is to develop algorithms that take into account the physical limitations of the target BCI population to improve the efficiency of ERP-based spellers for real-world communication.
In this work, it is hypothesised that building adaptive capabilities into the BCI framework can potentially give the BCI system the flexibility to improve performance by adjusting system parameters in response to changing user inputs. The research in this work addresses three potential areas for improvement within the P300 speller framework: information optimisation, target character estimation and error correction. The visual interface and its operation control the method by which the ERPs are elicited through the presentation of stimulus events. The parameters of the stimulus presentation paradigm can be modified to modulate and enhance the elicited ERPs. A new stimulus presentation paradigm is developed in order to maximise the information content that is presented to the user by tuning stimulus paradigm parameters to positively affect performance. Internally, the BCI system determines the amount of data to collect and the method by which these data are processed to estimate the user's target character. Algorithms that exploit language information are developed to enhance the target character estimation process and to correct erroneous BCI selections. In addition, a new model-based method to predict BCI performance is developed, an approach which is independent of stimulus presentation paradigm and accounts for dynamic data collection. The studies presented in this work provide evidence that the proposed methods for incorporating adaptive strategies in the three areas have the potential to significantly improve BCI communication rates, and the proposed method for predicting BCI performance provides a reliable means to pre-assess BCI performance without extensive online testing.
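The reason repeated stimulus presentations are needed, and why the speller is slow, is the averaging argument in the abstract: averaging N trials of a fixed ERP template in independent noise shrinks the residual noise roughly as 1/√N. The sketch below illustrates this with a synthetic template and Gaussian noise; the waveform, epoch length, and noise level are all assumed, not real EEG.

```python
import numpy as np

# Averaging repeated trials of a fixed ERP template in independent noise:
# the residual noise in the trial average falls roughly as 1/sqrt(N),
# which is why P300 spellers trade selection speed for accuracy.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 0.8, 200)                 # 800 ms epoch
erp = np.exp(-((t - 0.3) ** 2) / 0.002)        # template peaking near 300 ms

def residual_noise(n_trials, noise_sd=2.0):
    """Std of (trial average - true template) for n_trials repetitions."""
    trials = erp + rng.normal(scale=noise_sd, size=(n_trials, t.size))
    return float((trials.mean(axis=0) - erp).std())

# More repetitions -> a cleaner average but a longer selection time;
# adaptive data collection aims to stop as soon as the estimate is reliable.
print(residual_noise(1), residual_noise(100))
```

With 100 repetitions the residual noise drops by about an order of magnitude relative to a single trial, which is the gain that dynamic stopping rules try to obtain with as few repetitions as possible.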
Abstract:
Atomic ions trapped in micro-fabricated surface traps can be utilized as a physical platform with which to build a quantum computer. They possess many of the desirable qualities of such a device, including high-fidelity state preparation and readout, universal logic gates, and long coherence times, and they can be readily entangled with each other through photonic interconnects. The use of optical cavities integrated with trapped-ion qubits as a photonic interface presents the possibility of order-of-magnitude improvements in performance in several key areas of their use in quantum computation. The first part of this thesis describes the design and fabrication of a novel surface trap for integration with an optical cavity. The trap is custom made on a highly reflective mirror surface and includes the capability of moving the ion trap location along all three trap axes with nanometer-scale precision. The second part of this thesis demonstrates the suitability of small micro-cavities formed from laser-ablated fused silica substrates, with radii of curvature in the 300-500 micron range, for use with the mirror trap as part of an integrated ion trap cavity system. Quantum computing applications for such a system include dramatic improvements in the photonic entanglement rate up to 10 kHz, the qubit measurement time down to 1 microsecond, and the measurement error rates down to the 10⁻⁵ range. The final part of this thesis details a performance simulator for exploring the physical resource requirements and performance demands of scaling such a quantum computer to sizes capable of performing quantum algorithms beyond the limits of classical computation.
Abstract:
The study examines the short-run and long-run causality running from real economic growth to real foreign direct investment (RFDI) inflows. Other variables are also included in the study: education (a combination of primary, secondary, and tertiary enrolment serves as a proxy), real development finance, and unskilled labour. Time series data covering the period 1983-2013 are examined. First, I applied the Augmented Dickey-Fuller (ADF) technique to test for unit roots in the variables. Findings show all variables are integrated of order one, I(1). Thereafter, the Johansen Co-integration Test (JCT) was conducted to establish the relationship among the variables. Both the trace and maximum eigenvalue statistics at the 5% level of significance indicate three co-integrated equations. A vector error correction model (VECM) was applied to capture short- and long-run causality running from education, economic growth, real development finance, and unskilled labour to real foreign direct investment inflows in the Republic of Rwanda. Findings show no short-run causality running from education, real development finance, real GDP, and unskilled labour to real FDI inflows; however, there is evidence of long-run causality. This can be interpreted to mean that, in the short run, education, development finance, and economic growth do not influence inflows of foreign direct investment in Rwanda, but they do in the long run. From the policy perspective, the Republic of Rwanda should focus more on the long-term goal of investing in education to improve human capital, undertake policy reforms that promote economic growth, and promote good governance to attract development finance, especially from Nordic countries (particularly Norway and Denmark).