914 results for coding complexity
Abstract:
A recently proposed mean-field theory of mammalian cortex rhythmogenesis describes the salient features of electrical activity in the cerebral macrocolumn, with the use of inhibitory and excitatory neuronal populations (Liley et al. 2002). This model is capable of producing a range of important human EEG (electroencephalogram) features, such as the alpha rhythm, the 40 Hz activity thought to be associated with conscious awareness (Bojak & Liley 2007), and the changes in EEG spectral power associated with general anesthetic effect (Bojak & Liley 2005). From the point of view of nonlinear dynamics, the model entails a vast parameter space within which multistability, pseudoperiodic regimes, various routes to chaos, fat fractals and rich bifurcation scenarios occur for physiologically relevant parameter values (van Veen & Liley 2006). The origin and character of this complex behaviour, and its relevance for EEG activity, will be illustrated. The existence of short-lived unstable brain states will also be discussed in terms of the available theoretical and experimental results. A perspective on future analysis will conclude the presentation.
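The Liley et al. (2002) model comprises coupled nonlinear differential equations for the mean membrane potentials of interacting excitatory and inhibitory populations, and cannot be reproduced from the abstract alone. As a heavily simplified, hedged illustration of the general ingredients (two coupled populations, sigmoidal firing-rate responses, separate time constants), the Python sketch below integrates a generic Wilson-Cowan-type rate model; all parameter values are placeholders and none come from the Liley model.

```python
# Minimal excitatory-inhibitory (Wilson-Cowan-type) rate model.
# This is NOT the Liley et al. (2002) model; equations, sigmoid parameters and
# coupling strengths are generic placeholders for illustration only.
import numpy as np

def S(x, a, theta):
    """Sigmoidal population response with gain a and threshold theta."""
    return 1.0 / (1.0 + np.exp(-a * (x - theta)))

def simulate(T=200.0, dt=0.01, P=1.25):
    """Forward-Euler integration; P is a constant external drive to the E population."""
    w_ee, w_ei, w_ie, w_ii = 16.0, 12.0, 15.0, 3.0   # coupling strengths
    tau_e, tau_i = 1.0, 2.0                           # population time constants
    n = int(T / dt)
    E = np.zeros(n)
    I = np.zeros(n)
    for t in range(n - 1):
        E[t + 1] = E[t] + dt * (-E[t] + S(w_ee * E[t] - w_ei * I[t] + P, 1.3, 4.0)) / tau_e
        I[t + 1] = I[t] + dt * (-I[t] + S(w_ie * E[t] - w_ii * I[t], 2.0, 3.7)) / tau_i
    return E, I

if __name__ == "__main__":
    E, _ = simulate()
    late = E[len(E) // 2:]
    print(f"late-time E activity: mean {late.mean():.3f}, peak-to-peak {late.max() - late.min():.3f}")
```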
Abstract:
This article reviews the use of complexity theory in planning theory using the theory of metaphors for theory transfer and theory construction. The introduction to the article presents the author's positioning of planning theory. The first section thereafter provides a general background of the trajectory of development of complexity theory and discusses the rationale of using the theory of metaphors for evaluating the use of complexity theory in planning. The second section introduces the workings of metaphors in general and theory-constructing metaphors in particular, drawing out an understanding of how to proceed with an evaluative approach towards an analysis of the use of complexity theory in planning. The third section presents two case studies – reviews of two articles – to illustrate how the framework might be employed. It then discusses the implications of the evaluation for the question ‘can complexity theory contribute to planning?’ The concluding section discusses the employment of the ‘theory of metaphors’ for evaluating theory transfer and draws out normative suggestions for engaging in theory transfer using the metaphorical route.
Abstract:
Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated resulting in too much land carbon loss or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several one thousand year long, idealized, 2 × and 4 × CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities, and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The response of surface air temperature is the linear sum of the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the preindustrial portions of the last millennium simulations are used to assess historical model carbon-climate feedbacks. Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account, when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.
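The abstract does not describe how transient and equilibrium climate sensitivities are diagnosed from the idealized 2× and 4× CO2 experiments. One common approach for equilibrium sensitivity is Gregory-style regression of the top-of-atmosphere radiative imbalance against surface warming; the sketch below illustrates that calculation on synthetic data. The forcing and feedback values are assumptions for illustration, not results from any of the EMICs.

```python
# Illustrative Gregory-style estimate of equilibrium climate sensitivity (ECS):
# regress net top-of-atmosphere imbalance N against warming dT; the warming at
# which N crosses zero approximates the equilibrium response to the forcing.
# Synthetic data only; not output from any model in the study.
import numpy as np

rng = np.random.default_rng(0)

F_2x = 3.7            # W m^-2, canonical 2xCO2 forcing (assumed)
lam = 1.2             # W m^-2 K^-1, assumed net feedback parameter
true_ecs = F_2x / lam

dT = np.linspace(0.2, true_ecs * 0.9, 100)               # simulated warming (K)
N = F_2x - lam * dT + rng.normal(0.0, 0.1, dT.size)      # TOA imbalance (W m^-2)

slope, intercept = np.polyfit(dT, N, 1)    # N ~ intercept + slope * dT
ecs_estimate = -intercept / slope           # dT where N crosses zero
print(f"true ECS = {true_ecs:.2f} K, regression estimate = {ecs_estimate:.2f} K")
```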
Abstract:
The nonlinearity of high-power amplifiers (HPAs) has a crucial effect on the performance of multiple-input-multiple-output (MIMO) systems. In this paper, we investigate the performance of MIMO orthogonal space-time block coding (OSTBC) systems in the presence of nonlinear HPAs. Specifically, we propose a constellation-based compensation method for HPA nonlinearity for the case where the HPA parameters are known at the transmitter and receiver, in which the constellation and decision regions of the distorted transmitted signal are derived in advance. Furthermore, for the scenario where the HPA parameters are unknown, a sequential Monte Carlo (SMC)-based compensation method for the HPA nonlinearity is proposed, which first estimates the channel-gain matrix by means of the SMC method and then uses the SMC-based algorithm to detect the desired signal. The performance of the MIMO-OSTBC system under study is evaluated in terms of average symbol error probability (SEP), total degradation (TD) and system capacity, in uncorrelated Nakagami-m fading channels. Numerical and simulation results show the effects on performance of several system parameters, such as the parameters of the HPA model, the output back-off (OBO) of the nonlinear HPA, the numbers of transmit and receive antennas, the modulation order of quadrature amplitude modulation (QAM), and the number of SMC samples. In particular, it is shown that the constellation-based compensation method can efficiently mitigate the effect of HPA nonlinearity with low complexity and that the SMC-based detection scheme effectively compensates for HPA nonlinearity when the HPA parameters are unknown.
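The abstract does not specify the HPA model used. A standard memoryless choice in the literature is the Saleh model; the sketch below applies its AM/AM and AM/PM characteristics to a 16-QAM constellation, which is the kind of precomputation of distorted constellation points that a constellation-based compensation method relies on. The Saleh parameters and back-off value are illustrative assumptions, not values from the paper.

```python
# Distortion of a 16-QAM constellation by a memoryless Saleh HPA model.
# The Saleh parameters below are commonly quoted TWT-amplifier values; they are
# illustrative and not taken from the paper.
import numpy as np

def saleh_hpa(x, a_a=2.1587, b_a=1.1517, a_p=4.0033, b_p=9.1040):
    """Apply Saleh AM/AM and AM/PM characteristics to complex baseband samples."""
    r = np.abs(x)
    phi = np.angle(x)
    am = a_a * r / (1.0 + b_a * r**2)          # AM/AM conversion
    pm = a_p * r**2 / (1.0 + b_p * r**2)       # AM/PM conversion (radians)
    return am * np.exp(1j * (phi + pm))

def qam16():
    """Unit-average-power 16-QAM constellation."""
    levels = np.array([-3.0, -1.0, 1.0, 3.0])
    pts = np.array([complex(i, q) for i in levels for q in levels])
    return pts / np.sqrt((np.abs(pts)**2).mean())

if __name__ == "__main__":
    ref = qam16()
    backoff = 10 ** (-6.0 / 20.0)              # 6 dB input back-off (illustrative)
    distorted = saleh_hpa(backoff * ref)
    # A receiver-side table of distorted points could serve as decision references.
    for clean, warped in zip(ref[:4], distorted[:4]):
        print(f"({clean.real:+.2f}{clean.imag:+.2f}j) -> ({warped.real:+.2f}{warped.imag:+.2f}j)")
```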
Abstract:
Prism is a modular classification rule generation method based on the ‘separate and conquer’ approach, which is an alternative to the rule induction approach using decision trees, also known as ‘divide and conquer’. Prism often achieves a similar level of classification accuracy compared with decision trees, but tends to produce a more compact, noise-tolerant set of classification rules. As with other classification rule generation methods, a principal problem arising with Prism is that of overfitting due to over-specialised rules. In addition, over-specialised rules increase the associated computational complexity. These problems can be addressed by pruning methods. For the Prism method, two pruning algorithms, J-pruning and Jmax-pruning, have recently been introduced to reduce overfitting of classification rules. Both algorithms are based on the J-measure, an information-theoretic means of quantifying the theoretical information content of a rule. Jmax-pruning attempts to exploit the J-measure to its full potential, because J-pruning does not actually achieve this and may even lead to underfitting. A series of experiments has shown that Jmax-pruning may outperform J-pruning in reducing overfitting. However, Jmax-pruning is computationally relatively expensive and may also lead to underfitting. This paper reviews the Prism method and the two existing pruning algorithms above. It also proposes a novel pruning algorithm called Jmid-pruning. The latter is based on the J-measure and reduces overfitting to a similar level as the other two algorithms, but is better at avoiding underfitting and unnecessary computational effort. The authors conduct an experimental study of the performance of the Jmid-pruning algorithm in terms of classification accuracy and computational efficiency. The algorithm is also evaluated comparatively with the J-pruning and Jmax-pruning algorithms.
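All three pruning algorithms rank rule terms by the J-measure of Smyth and Goodman. As a reminder of what is being computed, the sketch below evaluates the J-measure of a single rule ‘IF x THEN y’ from estimated probabilities; the function name and example numbers are illustrative and not taken from the Prism implementations discussed.

```python
# J-measure of a rule "IF x THEN y" (Smyth & Goodman): the information content
# j(Y; X=x) weighted by the probability that the rule fires, p(x).
# Function name and example probabilities are illustrative.
import math

def j_measure(p_x, p_y, p_y_given_x):
    """Return J(Y; X=x) in bits for a single classification rule."""
    def term(p, q):
        # Contribution p * log2(p / q), with the convention 0 * log(0/q) = 0.
        return 0.0 if p == 0.0 else p * math.log2(p / q)

    j = term(p_y_given_x, p_y) + term(1.0 - p_y_given_x, 1.0 - p_y)
    return p_x * j

if __name__ == "__main__":
    # Rule covers 30% of examples; class prior 0.4; class frequency under the rule 0.9.
    print(f"J = {j_measure(0.30, 0.40, 0.90):.4f} bits")
```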
Abstract:
An extensive off-line evaluation of the Noah/Single Layer Urban Canopy Model (Noah/SLUCM) urban land-surface model is presented using data from 15 sites to assess (1) the ability of the scheme to reproduce the surface energy balance observed in a range of urban environments, including seasonal changes, and (2) the impact of increasing complexity of input parameter information. Model performance is found to be most dependent on representation of vegetated surface area cover; refinement of other parameter values leads to smaller improvements. Model biases in net all-wave radiation and trade-offs between turbulent heat fluxes are highlighted using an optimization algorithm. Here we use the Urban Zones to characterize Energy partitioning (UZE) as the basis to assign default SLUCM parameter values. A methodology (FRAISE) to assign sites (or areas) to one of these categories based on surface characteristics is evaluated. Using three urban sites from the Basel Urban Boundary Layer Experiment (BUBBLE) dataset, an independent evaluation of the model performance with the parameter values representative of each class is performed. The scheme copes well with both seasonal changes in the surface characteristics and intra-urban heterogeneities in energy flux partitioning, with RMSE performance comparable to similar state-of-the-art models for all fluxes, sites and seasons. The potential of the methodology for high-resolution atmospheric modelling application using the Weather Research and Forecasting (WRF) model is highlighted. This analysis supports the recommendations that (1) three classes are appropriate to characterize the urban environment, and (2) that the parameter values identified should be adopted as default values in WRF.
Abstract:
Huntingtin (Htt) protein interacts with many transcriptional regulators, with widespread disruption to the transcriptome in Huntington's disease (HD) brought about by altered interactions with the mutant Htt (muHtt) protein. Repressor Element-1 Silencing Transcription Factor (REST) is a repressor whose association with Htt in the cytoplasm is disrupted in HD, leading to increased nuclear REST and concomitant repression of several neuronal-specific genes, including brain-derived neurotrophic factor (Bdnf). Here, we explored a wide set of HD dysregulated genes to identify direct REST targets whose expression is altered in a cellular model of HD but that can be rescued by knock-down of REST activity. We found many direct REST target genes encoding proteins important for nervous system development, including a cohort involved in synaptic transmission, at least two of which can be rescued at the protein level by REST knock-down. We also identified several microRNAs (miRNAs) whose aberrant repression is directly mediated by REST, including miR-137, which has not previously been shown to be a direct REST target in mouse. These data provide evidence of the contribution of inappropriate REST-mediated transcriptional repression to the widespread changes in coding and non-coding gene expression in a cellular model of HD that may affect normal neuronal function and survival.
Abstract:
HD (Huntington's disease) is a late onset heritable neurodegenerative disorder that is characterized by neuronal dysfunction and death, particularly in the cerebral cortex and medium spiny neurons of the striatum. This is followed by progressive chorea, dementia and emotional dysfunction, eventually resulting in death. HD is caused by an expanded CAG repeat in the first exon of the HD gene that results in an abnormally elongated polyQ (polyglutamine) tract in its protein product, Htt (Huntingtin). Wild-type Htt is largely cytoplasmic; however, in HD, proteolytic N-terminal fragments of Htt form insoluble deposits in both the cytoplasm and nucleus, provoking the idea that mutHtt (mutant Htt) causes transcriptional dysfunction. While a number of specific transcription factors and co-factors have been proposed as mediators of mutHtt toxicity, the causal relationship between these Htt/transcription factor interactions and HD pathology remains unknown. Previous work has highlighted REST [RE1 (repressor element 1)-silencing transcription factor] as one such transcription factor. REST is a master regulator of neuronal genes, repressing their expression. Many of its direct target genes are known or suspected to have a role in HD pathogenesis, including BDNF (brain-derived neurotrophic factor). Recent evidence has also shown that REST regulates transcription of regulatory miRNAs (microRNAs), many of which are known to regulate neuronal gene expression and are dysregulated in HD. Thus repression of miRNAs constitutes a second, indirect mechanism by which REST can alter the neuronal transcriptome in HD. We will describe the evidence that disruption to the REST regulon brought about by a loss of interaction between REST and mutHtt may be a key contributory factor in the widespread dysregulation of gene expression in HD.
Abstract:
Enterprise Architecture (EA) has been recognised as an important tool in modern business management for closing the gap between strategy and its execution. The current literature implies that for EA to be successful, it should have clearly defined goals. However, the goals of different stakeholders are found to be different, even contradictory. In our explorative research, we seek answers to the following questions: What kind of goals are set for the EA implementation? How do the goals evolve over time? Are the goals different among stakeholders? How do they affect the success of EA? We analysed an EA pilot conducted among eleven Finnish Higher Education Institutions (HEIs) in 2011. The goals of the pilot were gathered at three stages: before the pilot (from the project plan), during the pilot (from interviews) and after the pilot (from a questionnaire). The data were analysed using qualitative and quantitative methods. Eight distinct goals were identified by the coding: Adopt EA Method, Build Information Systems, Business Development, Improve Reporting, Process Improvement, Quality Assurance, Reduce Complexity, and Understand the Big Picture. The success of the pilot was analysed statistically on a scale of 1-5. Results revealed that goals set before the pilot were very different from those mentioned during or after the pilot. Goals before the pilot were mostly related to expected benefits from the pilot, whereas the most important result was to adopt the EA method. These results can be explained by the possibly different roles of respondents, which in turn were most likely caused by poor communication. Interestingly, goals mentioned by different stakeholders were not limited to their traditional areas of responsibility. For example, in some cases Chief Information Officers’ goals were Quality Assurance and Process Improvement, whereas managers’ goals were Build Information Systems and Adopt EA Method. This could be the result of a good understanding of the meaning of EA, or of stakeholders not regarding EA as their concern at all. It is also interesting to note that, regardless of the different perceptions of goals among stakeholders, all HEIs considered the pilot successful. Thus the research does not provide support for a link between clear goals and success.
Abstract:
The incidence and severity of light leaf spot epidemics caused by the ascomycete fungus Pyrenopeziza brassicae on UK oilseed rape crops is increasing. The disease is currently controlled by a combination of host resistance, cultural practices and fungicide applications. We report decreases in sensitivities of modern UK P. brassicae isolates to the azole (imidazole and triazole) class of fungicides. By cloning and sequencing the P. brassicae CYP51 (PbCYP51) gene, encoding the azole target sterol 14α-demethylase, we identified two non-synonymous mutations encoding substitutions G460S and S508T associated with reduced azole sensitivity. We confirmed the impact of the encoded PbCYP51 changes on azole sensitivity and protein activity by heterologous expression in a Saccharomyces cerevisiae mutant YUG37::erg11 carrying a controllable promoter of native CYP51 expression. In addition, we identified insertions in the predicted regulatory regions of PbCYP51 in isolates with reduced azole sensitivity. The presence of these insertions was associated with enhanced transcription of PbCYP51 in response to sub-inhibitory concentrations of the azole fungicide tebuconazole. Genetic analysis of in vitro crosses of sensitive and resistant isolates confirmed the impact of PbCYP51 alterations in coding and regulatory sequences on a reduced sensitivity phenotype, as well as identifying a second major gene at another locus contributing to resistance in some isolates. The least sensitive field isolates carry combinations of upstream insertions and non-synonymous mutations, suggesting PbCYP51 evolution is on-going and the progressive decline in azole sensitivity of UK P. brassicae populations will continue. The implications for the future control of light leaf spot are discussed.
Abstract:
Low-power medium access control (MAC) protocols used for communication between energy-constrained wireless embedded devices do not cope well with situations where transmission channels are highly erroneous. Existing MAC protocols discard corrupted messages, which leads to costly retransmissions. To improve transmission performance, an error correction scheme and transmit/receive diversity can be employed: redundant information can be added to transmitted packets so that data can be recovered from corrupted packets, and transmit/receive diversity via multiple antennas can improve the error resiliency of transmissions. Both schemes may be used in conjunction to further improve performance. In this study, the authors show how an error correction scheme and transmit/receive diversity can be integrated into low-power MAC protocols. Furthermore, the authors investigate the achievable performance gains of both methods. This is important as both methods have associated costs (processing requirements; additional antennas and power) and, for a given communication situation, it must be decided which methods should be employed. The authors’ results show that, in many practical situations, error control coding outperforms transmission diversity; however, if very high reliability is required, it is useful to employ both schemes together.
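The abstract does not name a particular error correction scheme. As a minimal illustration of how redundancy lets a receiver recover data from a corrupted packet without retransmission, the sketch below encodes 4-bit payload nibbles with a Hamming(7,4) code, which corrects any single bit error per codeword; the framing and helper names are illustrative and not tied to any specific MAC protocol.

```python
# Hamming(7,4) encode/decode: each 4-bit payload nibble becomes a 7-bit codeword
# that can correct one flipped bit. Minimal illustration of packet-level FEC.
import numpy as np

# Systematic generator and parity-check matrices over GF(2).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=int)
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]], dtype=int)

def encode(nibble):
    """Encode 4 data bits into a 7-bit codeword."""
    return (np.array(nibble) @ G) % 2

def decode(word):
    """Correct up to one bit error, then return the 4 data bits."""
    word = np.array(word)
    syndrome = (H @ word) % 2
    if syndrome.any():
        # The error position is the column of H that matches the syndrome.
        pos = next(j for j in range(7) if np.array_equal(H[:, j], syndrome))
        word = word.copy()
        word[pos] ^= 1
    return word[:4]

if __name__ == "__main__":
    data = [1, 0, 1, 1]
    tx = encode(data)
    rx = tx.copy()
    rx[2] ^= 1                                   # channel flips one bit
    print("recovered:", decode(rx).tolist())     # -> [1, 0, 1, 1]
```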
Abstract:
Greater self-complexity has been suggested as a protective factor for people under stress (Linville, 1985). Two different measures have been proposed to assess individual self-complexity: Attneave’s H statistic (1959) and a composite index of two components of self-complexity (SC; Rafaeli-Mor et al., 1999). Using mood-incongruent recall, i.e., recalling positive events while in a negative mood, the present study compared the validity of the two measures through a reanalysis of Sakaki’s (2004) data. Results indicated that the H statistic did not predict performance on mood-incongruent recall. In contrast, greater SC was associated with better mood-incongruent recall even when the effect of the H statistic was controlled for.
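For readers unfamiliar with the first measure: the H statistic, as commonly applied to self-complexity trait sorts, is H = log2(n) − (Σ nᵢ log2 nᵢ)/n, where n is the total number of traits sorted and nᵢ is the number of traits sharing the i-th distinct pattern of membership across self-aspect groups. The sketch below computes it for an invented trait sort; the example data and function name are illustrative only.

```python
# Attneave/Scott H statistic as used in the self-complexity literature:
# H = log2(n) - (sum_i n_i * log2(n_i)) / n, where n is the number of traits
# sorted and n_i counts traits sharing the i-th distinct pattern of group
# membership across self-aspects. Example sort is invented for illustration.
import math
from collections import Counter

def h_statistic(trait_to_groups):
    """trait_to_groups maps each trait to the set of self-aspect groups containing it."""
    n = len(trait_to_groups)
    patterns = Counter(frozenset(groups) for groups in trait_to_groups.values())
    return math.log2(n) - sum(c * math.log2(c) for c in patterns.values()) / n

if __name__ == "__main__":
    # Hypothetical sort: 6 traits across self-aspects "work", "friend", "partner".
    sort = {
        "organised": {"work"},
        "ambitious": {"work"},
        "warm": {"friend", "partner"},
        "patient": {"friend", "partner"},
        "playful": {"friend"},
        "anxious": {"partner"},
    }
    print(f"H = {h_statistic(sort):.3f} bits")
```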
Abstract:
The synthesis and characterization of five new indium selenides, [C9H17N2]3[In5Se8+x(Se2)1−x] (1–2), [C6H12N2]4[C6H14N2]3[In10Se15(Se2)3] (3), [C6H14N2][(C6H12N2)2NaIn5Se9] (4) and [enH2][NH4][In7Se12] (5), are described. These materials were prepared under solvothermal conditions, using 1,8-diazabicyclo[5.4.0]undec-7-ene (DBU) and 1,4-diazabicyclo[2.2.2]octane (DABCO) as structure-directing agents. Compounds 1–4 represent the first examples of ribbons in indium selenides, and 4 is the first example of incorporation of an alkali metal complex. Compounds 1, 2 and 4 contain closely related [In5Se8+x(Se2)1−x]3− ribbons which differ only in their content of (Se2)2− anions. These ribbons are interspaced by organic countercations in 1 and 2, while in 4 they are linked by highly unusual [Na(DABCO)2]+ units into a three-dimensional framework. Compound 3 contains complex ribbons, with a long repeating sequence of ca. 36 Å, and 5 is a non-centrosymmetric three-dimensional framework, formed as a consequence of the decomposition of DABCO into ethylenediamine (en) and ammonia.