921 results for Decoding complexity
Abstract:
Corresponding to $C_{0}[n,n-r]$, a binary cyclic code generated by a primitive irreducible polynomial $p(X)\in \mathbb{F}_{2}[X]$ of degree $r=2b$, where $b\in \mathbb{Z}^{+}$, we can construct a binary cyclic code $C[(n+1)^{3^{k}}-1,(n+1)^{3^{k}}-1-3^{k}r]$, which is generated by the primitive irreducible generalized polynomial $p(X^{\frac{1}{3^{k}}})\in \mathbb{F}_{2}[X;\frac{1}{3^{k}}\mathbb{Z}_{0}]$ of degree $3^{k}r$, where $k\in \mathbb{Z}^{+}$. This new code $C$ improves the code rate and has a higher error-correction capability than $C_{0}$. The purpose of this study is to establish a decoding procedure for $C_{0}$ by using $C$, in such a way that one obtains an improved code rate and error-correction capability for $C_{0}$.
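The parameter arithmetic in this construction is easy to check numerically. A minimal sketch, where `code_params` and the sample values (b = 1, so r = 2, n = 2^r - 1 = 3, k = 1) are illustrative assumptions rather than values from the paper:

```python
# Illustrative arithmetic for the construction above: from C0[n, n-r] with a
# degree-r generator, the enlarged code has length (n+1)^(3^k) - 1 and
# dimension (n+1)^(3^k) - 1 - 3^k * r. All concrete values are hypothetical.

def code_params(n, r, k):
    """Length, dimension, and rate of the enlarged cyclic code C."""
    length = (n + 1) ** (3 ** k) - 1
    dim = length - (3 ** k) * r
    return length, dim, dim / length

# Example: b = 1, so r = 2b = 2; n = 2**r - 1 = 3; k = 1.
n, r, k = 3, 2, 1
length, dim, rate = code_params(n, r, k)
base_rate = (n - r) / n  # rate of the original code C0[n, n-r]
print(length, dim)       # 63 57
```

With these sample values the enlarged code's rate (57/63 ≈ 0.905) indeed exceeds the original rate (1/3), consistent with the rate-improvement claim.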
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
AIM: The purpose of this study was to examine the effect of intensive table-tennis practice on perceptual, decision-making, and motor systems. Groups of elite (HL=11), intermediate (LL=6), and control (CC=11) subjects performed tasks at different levels. METHODS: All subjects underwent a reaction-time test and a response-time test consisting of a pointing task to targets placed at distinct distances (15 and 25 cm) on the right and left sides. A ball-speed test in forehand and backhand conditions was administered only to the HL and LL groups. RESULTS: Reaction time in the CC group was higher than in the HL group (P < 0.05). In the response-time test, there were significant main effects of distance (P < 0.0001) and table-tennis expertise (P = 0.011). In the ball-speed test, the HL group was consistently faster than the LL group in both the forehand stroke (P < 0.0001) and the backhand stroke (P < 0.0001). Overall, the forehand stroke was significantly faster than the backhand stroke. CONCLUSION: We conclude that table-tennis players have shorter response times than non-athletes, and that the reaction-time and response-time tasks cannot distinguish well-trained table-tennis players from intermediate players, whereas the ball-speed test seems able to do so.
Abstract:
Doreen Barrie should have subtitled this book "Advocating a Different Identity" because this is its basic thrust. In Barrie's view, today's wealthy, modern, and expansive Alberta should abandon its historic grievances and hostility towards Ottawa. Instead, it should embrace a new narrative emphasizing "the positive qualities Albertans possess . . . the contributions the province has made to the country . . . and that Albertans share fundamental Canadian values with people in other parts of Canada and are eager to play a larger role on the national stage."
Abstract:
In this action research study of my 7th grade mathematics classroom, I investigated whether the use of decoding would increase the students' ability to problem solve. I discovered that knowing how to decode a word problem is only one facet of being a successful problem solver. I also discovered that confidence, effective instruction, and practice have an impact on improving problem-solving skills. Because of this research, I plan to alter my problem-solving guide so that it can be used by any classroom teacher. I also plan to keep adding to my list of math problem-solving clue words and to share it with others. My hope is that I will be able to explain my project to math teachers in my district to make them aware of the importance of knowing the steps to solve a word problem.
Abstract:
Stage-structured population models predict transient population dynamics if the population deviates from the stable stage distribution. Ecologists' interest in transient dynamics is growing because populations regularly deviate from the stable stage distribution, which can lead to transient dynamics that differ significantly from the stable stage dynamics. Because the structure of a population matrix (i.e., the number of life-history stages) can influence the predicted scale of the deviation, we explored the effect of matrix size on predicted transient dynamics and the resulting amplification of population size. First, we experimentally measured the transition rates between the different life-history stages and the adult fecundity and survival of the aphid Acyrthosiphon pisum. Second, we used these data to parameterize models with different numbers of stages. Third, we compared model predictions with empirically measured transient population growth following the introduction of a single adult aphid. We found that the models with the largest number of life-history stages predicted the largest transient population growth rates, but in all models there was a considerable discrepancy between predicted and empirically measured transient peaks and a dramatic underestimation of final population sizes. For instance, the mean population size after 20 days was 2394 aphids, compared to the highest predicted population size of 531 aphids; the predicted asymptotic growth rate (λmax) was nevertheless consistent with the experiments. Possible explanations for this discrepancy are discussed. Includes 4 supplemental files.
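The projection mechanics behind such models can be sketched as follows. This is a toy sketch only: the 3-stage matrix, its rates, and the `project` helper are illustrative assumptions, not the paper's fitted aphid parameters.

```python
# Minimal stage-structured projection: multiply the stage vector by the
# projection matrix once per time step. Matrix entries are hypothetical.

def project(A, x, t):
    """Project stage vector x forward t steps under projection matrix A."""
    for _ in range(t):
        x = [sum(a * v for a, v in zip(row, x)) for row in A]
    return x

# Hypothetical 3-stage matrix: juveniles, subadults, adults.
A = [
    [0.0, 0.0, 5.0],   # top row: adult fecundity
    [0.5, 0.0, 0.0],   # juvenile -> subadult transition rate
    [0.0, 0.8, 0.9],   # subadult -> adult transition, adult survival
]
# Start from a single adult, mirroring the introduction protocol described.
x0 = [0.0, 0.0, 1.0]
pop = project(A, x0, 5)
total = sum(pop)
```

Starting away from the stable stage distribution (all mass in the adult stage) produces the oscillatory transient growth the abstract discusses before the dynamics settle toward the asymptotic rate.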
Abstract:
This paper addresses the functional reliability and the complexity of reconfigurable antennas using graph models. The correlation between complexity and reliability for any given reconfigurable antenna is defined. Two methods are proposed to reduce failures and improve the reliability of reconfigurable antennas; the failures are caused by the reconfiguration technique or by the surrounding environment. The proposed failure-reduction methods are tested, and examples are given that verify them.
Abstract:
Methods from statistical physics, such as those involving complex networks, have been increasingly used in the quantitative analysis of linguistic phenomena. In this paper, we represented pieces of text with different levels of simplification as co-occurrence networks and found that topological regularity correlated negatively with textual complexity. Furthermore, in less complex texts the distance between concepts, represented as nodes, tended to decrease. The complex network metrics were treated with multivariate pattern recognition techniques, which allowed us to distinguish between original texts and their simplified versions. For each original text, two simplified versions were generated manually with an increasing number of simplification operations. As expected, distinction was easier for the strongly simplified versions, where the most relevant metrics were node strength, shortest paths, and diversity. Also, the discrimination of complex texts improved with higher hierarchical network metrics, thus pointing to the usefulness of considering wider contexts around the concepts. Though the accuracy in the distinction was not as high as in methods using deep linguistic knowledge, the complex network approach is still useful for rapid screening of texts whenever assessing complexity is essential to guarantee accessibility to readers with limited reading ability. Copyright (c) EPLA, 2012
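One of the metrics named above, node strength, can be illustrated on a toy co-occurrence network. The adjacent-word windowing and unit edge weighting below are illustrative assumptions, not the paper's exact pipeline:

```python
from collections import defaultdict

def cooccurrence_strengths(tokens):
    """Build a weighted co-occurrence network over adjacent tokens and
    return each node's strength (the sum of its incident edge weights)."""
    weights = defaultdict(int)
    for a, b in zip(tokens, tokens[1:]):
        if a != b:  # skip self-loops from immediately repeated words
            weights[frozenset((a, b))] += 1
    strength = defaultdict(int)
    for edge, w in weights.items():
        for node in edge:
            strength[node] += w
    return dict(strength)

text = "the cat sat on the mat the cat ran"
s = cooccurrence_strengths(text.split())
print(s["the"], s["cat"])  # 5 4
```

Frequent, well-connected words accumulate high strength; in simplified texts, where vocabulary is smaller and more repetitive, such concentration tends to shift the metric distributions the classifier exploits.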
Abstract:
The intention of this paper is to present some Aristotelian arguments regarding motion in the local terrestrial region. Because the explanation involved is highly sophisticated and complex, we deal briefly with the principles and causes that ground the theoretical sciences in general, and physics in particular. The article is subdivided into eight topics in order to ease the understanding of these concepts for readers not familiar with the Aristotelian texts. To avoid a naive, anachronistic, and linear view, the citations come from primary sources or from commentators on Aristotle's works.
Abstract:
Complexity in time series is an intriguing feature of living dynamical systems, with potential use for identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex-system analysis based on a generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by the Tsallis generalized entropy, as a function of the q parameter (qSampEn). qSDiff curves were calculated, which consist of the differences between the qSampEn of the original and surrogate series. We evaluated qSDiff for 125 real heart rate variability (HRV) dynamics, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series of stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiff(max)) for q not equal to 1. The values of q at which the maximum point occurs and at which qSDiff is zero were also evaluated. Only qSDiff(max) values were capable of distinguishing the HRV groups (p-values 5.10 x 10(-3), 1.11 x 10(-7), and 5.50 x 10(-7) for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistent with the concept of physiologic complexity, which suggests a potential use for chaotic-system analysis. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4758815]
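The building block being generalized here, plain SampEn, can be sketched in a few lines. This is the standard (non-generalized) definition only; the Tsallis q-generalization and the surrogate-differencing step of qSDiff are not reproduced, and the defaults m = 2, r = 0.2 are common conventions, not the paper's settings:

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """Plain SampEn: -ln(A/B), where B counts pairs of length-m templates
    and A pairs of length-(m+1) templates matching within tolerance r
    under the Chebyshev distance. Self-matches are excluded (i < j)."""
    n = len(series)

    def match_pairs(length):
        tpl = [series[i:i + length] for i in range(n - m)]
        count = 0
        for i in range(len(tpl)):
            for j in range(i + 1, len(tpl)):
                if max(abs(a - b) for a, b in zip(tpl[i], tpl[j])) <= r:
                    count += 1
        return count

    b = match_pairs(m)
    a = match_pairs(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A perfectly periodic signal is maximally regular, so SampEn should be 0.
periodic = [0.0, 1.0] * 10
```

For the periodic series every length-m match extends to a length-(m+1) match, so A = B and the entropy vanishes; irregular signals make extensions rarer, and SampEn grows.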
Abstract:
In past decades, efforts at quantifying systems' complexity with a general tool have usually relied on Shannon's classical information framework, addressing the disorder of the system through the Boltzmann-Gibbs-Shannon entropy or one of its extensions. In recent years, however, there have been attempts to tackle the quantification of algorithmic complexity in quantum systems based on the Kolmogorov algorithmic complexity, obtaining results discrepant with the classical approach. Therefore, a complexity measure is proposed here using the quantum information formalism, taking advantage of the generality of the classical-based complexities and capable of expressing these systems' complexity in a framework other than the algorithmic one. To do so, the Shiner-Davison-Landsberg (SDL) complexity framework is considered jointly with the linear entropy of the density operators representing the analyzed systems, along with the tangle as the entanglement measure. The proposed measure is then applied to a family of maximally entangled mixed states.
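The SDL-with-linear-entropy combination can be sketched in its simplest form. This is a toy sketch under stated assumptions: the simplest exponent choice (disorder times order), real 2x2 density matrices, and no tangle/entanglement part, so it is not the paper's full measure:

```python
def linear_entropy(rho):
    """Normalized linear entropy d/(d-1) * (1 - Tr(rho^2)) for a density
    matrix given as a nested list: 0 for pure states, 1 for maximally mixed."""
    d = len(rho)
    tr_rho_sq = sum(rho[i][j] * rho[j][i] for i in range(d) for j in range(d))
    return (d / (d - 1)) * (1 - tr_rho_sq)

def sdl_complexity(rho):
    """Simplest SDL form: disorder Delta times order (1 - Delta).
    Vanishes at both full order (pure) and full disorder (maximally mixed)."""
    delta = linear_entropy(rho)
    return delta * (1 - delta)

pure = [[1.0, 0.0], [0.0, 0.0]]        # pure state: Delta = 0
mixed = [[0.5, 0.0], [0.0, 0.5]]       # maximally mixed: Delta = 1
partial = [[0.75, 0.0], [0.0, 0.25]]   # intermediate mixedness
```

The characteristic SDL behavior is visible even in this toy version: complexity is zero at both extremes and peaks for intermediate disorder.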
Abstract:
With financial market globalization, foreign investments became vital for economies, mainly in emerging countries. In the last decades, Brazilian exchange rates have been a good indicator of either investors' confidence or risk aversion. Here, some events of global or national financial crisis are analyzed, trying to understand how they influenced the evolution of the "dollar-real" rate. The theoretical tool used is the Lopez-Mancini-Calbet (LMC) complexity measure, which, applied to real exchange-rate data, has shown good agreement between critical events and the measured patterns. (C) 2011 Elsevier B.V. All rights reserved.
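The LMC measure itself is simple to state: the product of a normalized Shannon entropy and a "disequilibrium" term. A minimal sketch on a probability histogram; binning exchange-rate returns into such a histogram is an assumed preprocessing step, not the paper's exact procedure:

```python
import math

def lmc_complexity(probs):
    """LMC complexity C = H * D: normalized Shannon entropy H times the
    disequilibrium D (squared distance from the uniform distribution)."""
    n = len(probs)
    h = -sum(p * math.log(p) for p in probs if p > 0) / math.log(n)
    d = sum((p - 1 / n) ** 2 for p in probs)
    return h * d

# Both extremes count as "simple": a fully ordered distribution (H = 0)
# and a fully random, uniform one (D = 0) each give C = 0.
ordered = [1.0, 0.0, 0.0, 0.0]
uniform = [0.25, 0.25, 0.25, 0.25]
partial = [0.5, 0.25, 0.125, 0.125]
```

Only distributions between perfect order and perfect randomness score nonzero, which is what makes the measure useful for spotting structured regimes such as crisis periods in the rate series.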