909 results for coding complexity


Relevance: 20.00%

Abstract:

Cooperative transmission can be seen as a "virtual" MIMO system in which the multiple transmit antennas are implemented in a distributed fashion by the antennas at both the source and the relay terminal. Depending on the system design, diversity and/or multiplexing gains are achievable. This design involves the definition of the type of retransmission (incremental redundancy, repetition coding), the design of the distributed space-time codes, the error-correcting scheme, the operation of the relay (decode-and-forward or amplify-and-forward) and the number of antennas at each terminal. Proposed schemes are evaluated in different conditions in combination with forward error-correcting codes (FEC), both for linear and near-optimum (sphere decoder) receivers, for their possible implementation in downlink high-speed packet services of cellular networks. Results show the benefits of coded cooperation over direct transmission in terms of increased throughput. It is shown that multiplexing gains are observed even if the mobile station features a single antenna, provided that cell-wide reuse of the relay radio resource is possible.
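As a rough illustration of the kind of relaying gains discussed above, the following Python/NumPy sketch compares the bit error rate of a direct link with that of a single amplify-and-forward relay combined by maximum-ratio combining at the destination. The modulation (BPSK), Rayleigh fading, and single-relay topology are assumptions for the sketch; the schemes evaluated in the work (distributed space-time codes, FEC, sphere decoding) are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def bit_error_rate(snr_db, n_bits=200_000, use_relay=True):
    """BPSK over Rayleigh fading: direct link only, or direct link plus a
    single amplify-and-forward relay with maximum-ratio combining."""
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_bits)
    x = 2.0 * bits - 1.0                                  # BPSK symbols

    def fading(size):
        return (rng.normal(size=size) + 1j * rng.normal(size=size)) / np.sqrt(2)

    def noise(size):
        return fading(size) / np.sqrt(snr)                # noise variance 1/snr

    h_sd = fading(n_bits)
    y_sd = h_sd * x + noise(n_bits)                       # source -> destination
    llr = np.real(np.conj(h_sd) * y_sd) * snr             # matched filter / noise variance

    if use_relay:
        h_sr, h_rd = fading(n_bits), fading(n_bits)
        y_sr = h_sr * x + noise(n_bits)                   # source -> relay
        g = 1.0 / np.sqrt(np.abs(h_sr) ** 2 + 1.0 / snr)  # relay power normalisation
        y_rd = h_rd * g * y_sr + noise(n_bits)            # amplified and forwarded
        h_eq = h_rd * g * h_sr                            # equivalent two-hop channel
        var_rd = (np.abs(h_rd * g) ** 2 + 1.0) / snr      # equivalent noise variance
        llr = llr + np.real(np.conj(h_eq) * y_rd) / var_rd

    return np.mean((llr > 0).astype(int) != bits)

for snr_db in (0, 5, 10, 15):
    print(f"{snr_db:2d} dB  direct: {bit_error_rate(snr_db, use_relay=False):.4f}"
          f"  with relay: {bit_error_rate(snr_db, use_relay=True):.4f}")
```

With two independently faded copies of each symbol, the relayed scheme shows the steeper error-rate decay (diversity order two) that motivates cooperative transmission.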

Relevance: 20.00%

Abstract:

This paper presents a programming environment for supporting learning in STEM, particularly mobile robotic learning. It was designed to support progressive learning for people with and without previous knowledge of programming and/or robotics. The environment is multi-platform and built with open-source tools. Perception, mobility, communication, navigation and collaborative behaviour functionalities can be programmed for different mobile robots. A learner is able to programme robots using different programming languages and editor interfaces: a graphical programming interface (basic level), an XML-based meta-language (intermediate level) or the ANSI C language (advanced level). The environment translates programmes between these languages either transparently or explicitly on the learner's demand. Learners can access proposed challenges and example-based learning interfaces. The environment was designed to provide extensibility, adaptive interfaces, persistence and low software/hardware coupling. Functionality tests were performed to verify the programming environment's specifications; UV BOT mobile robots were used in these tests.
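A minimal sketch of the kind of cross-language translation described above, assuming a hypothetical XML meta-language and hypothetical C runtime calls (robot_move, robot_turn); the abstract does not specify the environment's actual element names or robot API.

```python
import xml.etree.ElementTree as ET

# Hypothetical intermediate-level program: element and attribute names are
# illustrative only, not the environment's actual XML meta-language.
XML_PROGRAM = """
<program>
  <move distance="30"/>
  <turn angle="90"/>
  <move distance="15"/>
</program>
"""

# Mapping from XML elements to (assumed) C runtime calls for the robot.
TEMPLATES = {
    "move": "    robot_move({distance});",
    "turn": "    robot_turn({angle});",
}

def xml_to_c(xml_text: str) -> str:
    """Translate the toy XML meta-language into an ANSI C main()."""
    root = ET.fromstring(xml_text)
    body = [TEMPLATES[child.tag].format(**child.attrib) for child in root]
    return "\n".join(
        ['#include "robot.h"', "", "int main(void)", "{"] + body + ["    return 0;", "}"]
    )

print(xml_to_c(XML_PROGRAM))
```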

Relevance: 20.00%

Abstract:

The main objective of this work is to implement and present a theoretical description of different Physical Layer Network Coding schemes. Using a basic scheme as a starting point, the project presents the construction and analysis of different communication schemes whose complexity increases as the project progresses. The work is structured in several parts: first, an introduction to Physical Layer Network Coding and Lattice Network Codes is presented. Next, the mathematical tools needed to understand the CF System are introduced. Then, the first basic scheme is analysed and implemented. Building on it, we implement a vector version of the CF System and a version coded with a q-ary Hamming code. Finally, different strategies for improving the coefficient matrix A are studied and implemented.
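For the strategies aimed at improving the coefficient matrix A, one standard criterion (assuming the CF System refers to compute-and-forward) is the Nazer-Gastpar computation rate. The sketch below brute-forces the integer coefficient vector that maximizes that rate for a toy two-user real-valued channel; it is illustrative and not necessarily the selection strategy used in the work.

```python
import itertools
import numpy as np

def computation_rate(h, a, snr):
    """Nazer-Gastpar compute-and-forward computation rate for channel h and
    integer coefficient vector a (rate in bits per real channel use)."""
    h = np.asarray(h, dtype=float)
    a = np.asarray(a, dtype=float)
    # Effective noise after MMSE scaling at the relay
    denom = np.dot(a, a) - snr * np.dot(h, a) ** 2 / (1 + snr * np.dot(h, h))
    denom = max(denom, 1e-12)          # numerical guard; denom > 0 for finite SNR
    return max(0.0, 0.5 * np.log2(1.0 / denom))

def best_coefficients(h, snr, amax=4):
    """Brute-force search for the integer vector a maximizing the rate."""
    best_rate, best_a = 0.0, None
    for a in itertools.product(range(-amax, amax + 1), repeat=len(h)):
        if any(a):                     # skip the all-zero vector
            r = computation_rate(h, a, snr)
            if r > best_rate:
                best_rate, best_a = r, a
    return best_rate, best_a

h = [1.0, 0.62]                          # example real channel gains
rate, a = best_coefficients(h, snr=100)  # SNR = 20 dB
print(f"best a = {a}, computation rate = {rate:.2f} bits/channel use")
```

In practice the exhaustive search is replaced by lattice-reduction methods for more than a few users, but the objective being maximized is the same.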

Relevance: 20.00%

Abstract:

This article draws on empirical material to examine what drives rapid change in flood risk management practice, reflecting wider interest in the way that scientific practices make risk landscapes and a specific focus on extreme events as drivers of rapid change. Such events are commonly referred to as a form of creative destruction: they both reveal the composition of socioenvironmental assemblages and provide a creative opportunity to remake those assemblages in alternative ways, thereby rapidly changing policy and practice. Drawing on wider thinking in complexity theory, we argue that what happens between events may be as important as, if not more important than, the events themselves. We use two empirical examples concerned with flood risk management practice: a rapid shift in the dominant technologies used to map flood risk in the United Kingdom, and an experimental approach to public participation tested in two different locations with dramatically different consequences. Both show that the state of the socioenvironmental assemblage in which the events take place matters as much as the magnitude of the events themselves. The periods between rapid changes are not simply periods of discursive consolidation but involve the ongoing mutation of such assemblages, which may either sensitize or desensitize them to rapid change. Understanding these intervening periods therefore matters as much as understanding the events themselves. If events matter, it is because of the ways in which they bring into sharp focus the coding or framing of a socioenvironmental assemblage in policy or scientific practice, irrespective of whether those events evolve the assemblage in subtle or more radical ways.

Relevance: 20.00%

Abstract:

AIM: Heart disease is recognized as a consequence of dysregulation of cardiac gene regulatory networks. Previously unappreciated components of such networks are the long non-coding RNAs (lncRNAs). Their roles in the heart remain to be elucidated. Thus, this study aimed to systematically characterize the cardiac long non-coding transcriptome post-myocardial infarction and to elucidate the potential roles of lncRNAs in cardiac homoeostasis. METHODS AND RESULTS: We annotated the mouse transcriptome after myocardial infarction via RNA sequencing and ab initio transcript reconstruction, and integrated genome-wide approaches to associate specific lncRNAs with developmental processes and physiological parameters. Expression of specific lncRNAs strongly correlated with defined parameters of cardiac dimensions and function. Using chromatin maps to infer lncRNA function, we identified many with potential roles in cardiogenesis and pathological remodelling. The vast majority were associated with active cardiac-specific enhancers. Importantly, oligonucleotide-mediated knockdown implicated novel lncRNAs in controlling the expression of key regulatory proteins involved in cardiogenesis. Finally, we identified hundreds of human orthologues and demonstrated that particular candidates were differentially modulated in human heart disease. CONCLUSION: These findings reveal hundreds of novel heart-specific lncRNAs with unique regulatory and functional characteristics relevant to maladaptive remodelling, cardiac function and possibly cardiac regeneration. This new class of molecules represents potential therapeutic targets for cardiac disease. Furthermore, their exquisite correlation with cardiac physiology renders them attractive candidate biomarkers for use in the clinic.

Relevance: 20.00%

Abstract:

In this paper we describe a taxonomy of task demands which distinguishes between Task Complexity, Task Condition and Task Difficulty. We then describe three theoretical claims and predictions of the Cognition Hypothesis (Robinson 2001, 2003b, 2005a) concerning the effects of task complexity on: (a) language production; (b) interaction and uptake of information available in the input to tasks; and (c) individual differences-task interactions. Finally, we summarize the findings of the empirical studies in this special issue, all of which address one or more of these predictions, and point to some directions for future research into the effects of task complexity on learning and performance.

Relevance: 20.00%

Abstract:

Protein-coding genes evolve at different rates, and the influence of different parameters, from gene size to expression level, has been extensively studied. While in yeast the expression level of a gene is the major causal factor of its evolutionary rate, the situation is more complex in animals. Here we investigate these relations further, especially taking into account gene expression in different organs as well as indirect correlations between parameters. We used RNA-seq data from two large datasets, covering 22 mouse tissues and 27 human tissues. Over all tissues, evolutionary rate correlates only weakly with levels and breadth of expression. The strongest explanatory factors of purifying selection are GC content, expression in many developmental stages, and expression in brain tissues. While the main component of evolutionary rate is purifying selection, we also find tissue-specific patterns for sites under neutral evolution and for positive selection. We observe fast evolution of genes expressed in testis, but also in other tissues, notably liver; this is explained by weak purifying selection rather than by positive selection.
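One way to separate direct from indirect correlations of the kind discussed above is a partial correlation on ranks. The sketch below uses synthetic per-gene values (not the paper's data; the built-in dependencies are invented) to contrast the raw Spearman correlation between evolutionary rate and expression with a partial correlation controlling for GC content.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic illustration: per-gene dN/dS, mean expression across tissues,
# and GC content, with dependencies built in for demonstration purposes.
n = 2000
gc = rng.normal(0.45, 0.05, n)
expression = rng.lognormal(2 + 5 * (gc - 0.45), 1.0, n)
dnds = np.clip(0.2 - 0.03 * np.log1p(expression) - 0.5 * (gc - 0.45)
               + rng.normal(0, 0.05, n), 0.001, None)

def spearman_partial(x, y, z):
    """Spearman partial correlation of x and y controlling for z
    (rank-transform, regress out z, correlate the residuals)."""
    rx, ry, rz = (stats.rankdata(v) for v in (x, y, z))
    res_x = rx - np.poly1d(np.polyfit(rz, rx, 1))(rz)
    res_y = ry - np.poly1d(np.polyfit(rz, ry, 1))(rz)
    return stats.pearsonr(res_x, res_y)[0]

print("raw rho(dN/dS, expression):",
      round(stats.spearmanr(dnds, np.log1p(expression))[0], 3))
print("partial rho controlling for GC:",
      round(spearman_partial(dnds, np.log1p(expression), gc), 3))
```

Because expression and GC content are themselves correlated in the synthetic data, the partial correlation is noticeably weaker than the raw one, which is the kind of confounding the study controls for.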

Relevance: 20.00%

Abstract:

The increase in publicly available sequencing data has allowed rapid progress in our understanding of genome composition. As new information becomes available we should constantly be updating and reanalyzing existing and newly acquired data. In this report we focus on transposable elements (TEs), which make up a significant portion of nearly all sequenced genomes. Our ability to accurately identify and classify these sequences is critical to understanding their impact on host genomes. At the same time, as we demonstrate in this report, problems with existing classification schemes have led to significant misunderstandings of the evolution of both TE sequences and their host genomes. In a pioneering publication, Finnegan (1989) proposed classifying all TE sequences into two classes based on transposition mechanisms and structural features: the retrotransposons (class I) and the DNA transposons (class II). We have retraced how ideas regarding TE classification and annotation in both the prokaryotic and eukaryotic scientific communities have changed over time. This has led us to observe that: (1) a number of TEs have convergent structural features and/or transposition mechanisms that have led to misleading conclusions regarding their classification, (2) the evolution of TEs resembles that of viruses in having several unrelated origins, and (3) there may be at least 8 classes and 12 orders of TEs, including 10 novel orders. To address these classification issues we propose: (1) the outline of a universal TE classification, (2) a set of methods and classification rules that could be used by all scientific communities involved in the study of TEs, and (3) a 5-year schedule for the establishment of an International Committee for Taxonomy of Transposable Elements (ICTTE).

Relevance: 20.00%

Abstract:

Maximum entropy modeling (Maxent) is a widely used algorithm for predicting species distributions across space and time. Properly assessing the uncertainty in such predictions is non-trivial and requires validation with independent datasets. Notably, model complexity (the number of model parameters) remains a major concern in relation to overfitting and, hence, the transferability of Maxent models. An emerging approach is to validate the cross-temporal transferability of model predictions using paleoecological data. In this study, we assess the effect of model complexity on the performance of Maxent projections across time, using two European plant species (Alnus glutinosa (L.) Gaertn. and Corylus avellana L.) with an extensive late Quaternary fossil record in Spain as a study case. We fit 110 models with different levels of complexity under present-day conditions and tested model performance using AUC (area under the receiver operating characteristic curve) and AICc (corrected Akaike Information Criterion) through the standard procedure of randomly partitioning current occurrence data. We then compared these results to an independent validation by projecting the models to mid-Holocene (6,000 years before present) climatic conditions in Spain to assess their ability to predict fossil pollen presence-absence and abundance. We find that calibrating Maxent models with default settings results in overly complex models. While model performance increased with model complexity when predicting current distributions, it was highest at intermediate complexity when predicting mid-Holocene distributions. Hence, models of intermediate complexity offered the best trade-off for predicting species distributions across time. Reliable temporal model transferability is especially relevant for forecasting species distributions under future climate change. Consequently, species-specific model tuning should be used to find the best modeling settings to control for complexity, notably with paleoecological data to independently validate model projections. For cross-temporal projections of species distributions for which paleoecological data are not available, models of intermediate complexity should be selected.
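The complexity trade-off can be illustrated with the AICc used in the study. The sketch below scores models of increasing complexity by AUC and AICc on synthetic presence/absence data, using a penalized logistic regression with polynomial features as a stand-in for Maxent (the data, the feature expansion as a complexity proxy, and the library calls are assumptions; only the AICc formula is standard).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(2)

# Synthetic presence/absence data along two climate axes (stand-in for the
# bioclimatic predictors; this is a logistic surrogate, not Maxent itself).
n = 600
X = rng.normal(size=(n, 2))
p_true = 1 / (1 + np.exp(-(1.5 * X[:, 0] - X[:, 1] ** 2)))
y = rng.binomial(1, p_true)

def aicc(log_lik, k, n):
    """Corrected Akaike Information Criterion: AIC + 2k(k+1)/(n-k-1)."""
    aic = 2 * k - 2 * log_lik
    return aic + 2 * k * (k + 1) / (n - k - 1)

for degree in (1, 2, 3, 5, 8):                        # increasing model complexity
    feats = PolynomialFeatures(degree, include_bias=False).fit_transform(X)
    model = LogisticRegression(max_iter=5000).fit(feats, y)
    p = np.clip(model.predict_proba(feats)[:, 1], 1e-12, 1 - 1e-12)
    log_lik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    k = feats.shape[1] + 1                            # coefficients + intercept
    print(f"degree={degree}  params={k:3d}  "
          f"AUC={roc_auc_score(y, p):.3f}  AICc={aicc(log_lik, k, n):.1f}")
```

In-sample AUC keeps rising with complexity, whereas AICc penalizes the extra parameters and typically favours an intermediate-complexity model, mirroring the trade-off the study reports for cross-temporal projections.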

Relevance: 20.00%

Abstract:

This paper analyses the effects of manipulating the cognitive complexity of L2 oral tasks on language production. It specifically focuses on self-repairs, which are taken as a measure of accuracy since they denote both attention to form and an attempt at being accurate. By means of a repeated-measures design, 42 lower-intermediate students were asked to perform three different task types (a narrative task, an instruction-giving task, and a decision-making task), for each of which two degrees of cognitive complexity were established. The narrative task was manipulated along +/− Here-and-Now, the instruction-giving task along +/− elements, and the decision-making task along +/− reasoning demands. Repeated-measures ANOVAs are used to calculate differences between degrees of complexity and among task types. One-way ANOVAs are used to detect potential differences between low-proficiency and high-proficiency participants. Results show an overall effect of Task Complexity on self-repair behavior across task types, with different behaviors among the three task types. No differences are found in self-repair behavior between the low- and high-proficiency groups. Results are discussed in the light of theories of cognition and L2 performance (Robinson 2001a, 2001b, 2003, 2005, 2007), L1 and L2 language production models (Levelt 1989, 1993; Kormos 2000, 2006), and attention during L2 performance (Skehan 1998; Robinson 2002).
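A minimal sketch of the repeated-measures analysis described above, using statsmodels' AnovaRM on synthetic self-repair scores. The subject count and factor levels mirror the abstract, but the data, the effect size, and the outcome unit are invented for illustration.

```python
import pandas as pd
import numpy as np
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(3)

# Synthetic within-subject design: 42 learners x 3 task types x 2 complexity
# levels, outcome = self-repairs per 100 words (invented values).
tasks = ["narrative", "instruction", "decision"]
levels = ["simple", "complex"]
rows = []
for subject in range(42):
    base = rng.normal(4.0, 1.0)                       # learner-specific baseline
    for task in tasks:
        for level in levels:
            bump = 0.8 if level == "complex" else 0.0  # built-in complexity effect
            rows.append({"subject": subject, "task": task, "complexity": level,
                         "self_repairs": base + bump + rng.normal(0, 0.7)})
df = pd.DataFrame(rows)

# Two-way repeated-measures ANOVA: Task Type x Task Complexity
result = AnovaRM(df, depvar="self_repairs", subject="subject",
                 within=["task", "complexity"]).fit()
print(result)
```

With one observation per subject and cell, AnovaRM reports F statistics for the task, complexity, and interaction terms, which is the structure of the comparisons reported in the study.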

Relevance: 20.00%

Abstract:

Experimental animal models are essential to obtain basic knowledge of the underlying biological mechanisms in human diseases. Here, we review major contributions to biomedical research and discoveries that were obtained in the mouse model by using forward genetics approaches and that provided key insights into the biology of human diseases and paved the way for the development of novel therapeutic approaches.

Relevance: 20.00%

Abstract:

A method for optimizing the strength of a parametric phase mask for a wavefront coding imaging system is presented. The method is based on an optimization process that minimizes a proposed merit function. The goal is to achieve modulation transfer function invariance while quantitatively maintaining final image fidelity. A parametric filter that copes with the noise present in the captured images is used to obtain the final images, and this filter is optimized. The whole process results in optimum phase mask strength and optimal parameters for the restoration filter. The results for a particular optical system are presented and tested experimentally in the laboratory. The experimental results show good agreement with the simulations, indicating that the procedure is useful.
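A toy version of the optimization described above: a 1-D pupil with a cubic phase mask, a merit function that balances MTF invariance across defocus against overall MTF level, and a brute-force search over mask strength. The cubic mask form, the defocus values, and the weighting are assumptions for the sketch; the paper's actual phase mask, merit function, and restoration filter are not reproduced here.

```python
import numpy as np

def mtf_1d(alpha, defocus_w20, n=256):
    """1-D MTF of a pupil carrying a cubic phase mask alpha*x^3 and a
    defocus term w20*x^2 (in wavelengths), via pupil autocorrelation."""
    x = np.linspace(-1, 1, n)
    pupil = np.exp(1j * 2 * np.pi * (alpha * x**3 + defocus_w20 * x**2))
    otf = np.correlate(pupil, pupil, mode="full")      # autocorrelation -> OTF
    return np.abs(otf) / np.abs(otf).max()

def merit(alpha, defocus_range=(0.0, 1.0, 2.0)):
    """Toy merit function: penalize MTF variation across defocus (invariance)
    and reward MTF level (restorability). Weights are illustrative."""
    mtfs = np.array([mtf_1d(alpha, w) for w in defocus_range])
    invariance_penalty = np.mean(np.std(mtfs, axis=0))
    level_reward = np.mean(mtfs)
    return invariance_penalty - 0.2 * level_reward

alphas = np.linspace(0, 20, 41)
best_alpha = min(alphas, key=merit)
print(f"optimum cubic mask strength (toy model): alpha = {best_alpha:.1f}")
```

Increasing the mask strength flattens the MTF across defocus but lowers its level, so the optimum strength is the point where the two competing terms of the merit function balance, which is the same trade-off the paper's optimization resolves jointly with the restoration filter parameters.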