966 results for Time complexity
Abstract:
The deteriorating relationship between the United Kingdom (UK) and the rest of the EU, including the prospect of a referendum on EU membership, would have dominated the Union’s agenda had it not been for the economic/financial crisis, followed by the external crisis which we are now facing in the East. Precisely because of these crises, it is now time for the incoming European Commission to take the bull by the horns and ensure that the EU can move on from a potential referendum and its possible outcomes. The June European Council noted that “the UK raised some concerns related to the future development of the EU. These concerns will need to be addressed. In this context, the European Council noted that the concept of ever closer union allows for different paths of integration for different countries, allowing those that want to deepen integration to move ahead, while respecting the wish of those who do not want to deepen any further.” While the EU has, arguably, successfully developed at different speeds for decades, to address the UK’s (fundamental) concerns, it is now time to work out whether and how the UK can be accommodated, and what this would mean in practice. UK membership is desirable but not at any price, so the aim should be to keep the UK in, while also ensuring that the principles on which the EU is built are protected. There will need to be a modus operandi which enables the EU and in particular, the Eurozone, to continue to make progress in addressing the shortcomings of European integration and European Monetary Union (EMU) in particular, while at the same time offering a reform package that can satisfy the UK. This does not necessarily mean that all EMU reforms have to be within the EU framework: additional intergovernmental arrangements could also be a possibility. However, this could add to the complexity and inefficiency of the system, as well as sidelining the supranational element of EU governance which will be needed to make EMU function.
Abstract:
The gap in labour market participation between natives and people with an immigrant background is significant in Belgium, one of the largest in the OECD. In this Policy Brief, we present research that investigated one of the possible causes of this poor performance, and we propose three main policy recommendations. The research project studied whether Belgium's complex federal state structure, and the subsequent division of responsibilities and lack of intergovernmental cooperation, help to explain this poor performance. The study concluded that governance complexity does not appear to be a main cause of Belgium's poor results. However, more policy coordination would improve policy efficiency.
Abstract:
The tremendous diversity of leaf shapes has caught the attention of naturalists for centuries. In addition to interspecific and intraspecific differences, leaf morphologies may differ in single plants according to age, a phenomenon known as heteroblasty. In Arabidopsis thaliana, the progression from the juvenile to the adult phase is characterized by increased leaf serration. A similar trend is seen in species with more complex leaves, such as the A. thaliana relative Cardamine hirsuta, in which the number of leaflets per leaf increases with age. Although the genetic changes that led to the overall simpler leaf architecture in A. thaliana are increasingly well understood, less is known about the events underlying age-dependent changes within single plants, in either A. thaliana or C. hirsuta. Here, we describe a conserved miRNA transcription factor regulon responsible for an age-dependent increase in leaf complexity. In early leaves, miR319-targeted TCP transcription factors interfere with the function of miR164-dependent and miR164-independent CUC proteins, preventing the formation of serrations in A. thaliana and of leaflets in C. hirsuta. As plants age, accumulation of miR156-regulated SPLs acts as a timing cue that destabilizes TCP-CUC interactions. The destabilization licenses activation of CUC protein complexes and thereby the gradual increase of leaf complexity in the newly formed organs. These findings point to posttranslational interaction between unrelated miRNA-targeted transcription factors as a core feature of these regulatory circuits.
Abstract:
Abrupt climate changes from 18 to 15 thousand years before present (kyr BP) associated with Heinrich Event 1 (HE1) had a strong impact on vegetation patterns not only at high latitudes of the Northern Hemisphere, but also in the tropical regions around the Atlantic Ocean. To gain a better understanding of the linkage between high and low latitudes, we used the University of Victoria (UVic) Earth System-Climate Model (ESCM) with dynamical vegetation and land surface components to simulate four scenarios of climate-vegetation interaction: the pre-industrial era, the Last Glacial Maximum (LGM), and a Heinrich-like event with two different climate backgrounds (interglacial and glacial). We calculated mega-biomes from the plant-functional types (PFTs) generated by the model to allow for a direct comparison between model results and palynological vegetation reconstructions. Our calculated mega-biomes for the pre-industrial period and the LGM corresponded well with biome reconstructions of the modern and LGM time slices, respectively, except that our pre-industrial simulation predicted the dominance of grassland in southern Europe and our LGM simulation resulted in more forest cover in tropical and sub-tropical South America. The HE1-like simulation with a glacial climate background produced sea-surface temperature patterns and enhanced inter-hemispheric thermal gradients in accordance with the "bipolar seesaw" hypothesis. We found that the cooling of the Northern Hemisphere caused a southward shift of those PFTs that are indicative of an increased desertification and a retreat of broadleaf forests in West Africa and northern South America. The mega-biomes from our HE1 simulation agreed well with paleovegetation data from tropical Africa and northern South America. 
Thus, according to our model-data comparison, the reconstructed vegetation changes for the tropical regions around the Atlantic Ocean were physically consistent with the remote effects of a Heinrich event under a glacial climate background.
Abstract:
"UILU-ENG 79 1716."
Abstract:
We give a simple proof of a formula for the minimal time required to simulate a two-qubit unitary operation using a fixed two-qubit Hamiltonian together with fast local unitaries. We also note that a related lower bound holds for arbitrary n-qubit gates.
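For context on the setting the abstract assumes (a sketch of a standard fact from this literature, not the paper's own formula): with fast local unitaries available for free, a two-qubit interaction Hamiltonian can be brought to a canonical diagonal form, and the minimal simulation time depends only on the canonical parameters of the Hamiltonian and of the target gate.

```latex
% Canonical form of a two-qubit Hamiltonian under fast local unitaries:
% local terms can be removed and the interaction diagonalized, so
H \;\sim\; \sum_{k \in \{x,y,z\}} c_k \,\sigma_k \otimes \sigma_k,
\qquad c_x \ge c_y \ge |c_z|,
% and the minimal time to simulate a given two-qubit gate is a function
% of these coefficients and the gate's own canonical parameters.
```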
The effects of task complexity and practice on dual-task interference in visuospatial working memory
Abstract:
Although the n-back task has been widely applied to neuroimaging investigations of working memory (WM), the role of practice effects on behavioural performance of this task has not yet been investigated. The current study aimed to investigate the effects of task complexity and familiarity on the n-back task. Seventy-seven participants (39 male, 38 female) completed a visuospatial n-back task four times, twice in each of two testing sessions separated by a week. Participants were required to remember either the first, second or third (n-back) most recent letter positions in a continuous sequence and to indicate whether the current item matched or did not match the remembered position. A control task, with no working memory requirements, required participants to match to a predetermined stimulus position. In both testing sessions, reaction time (RT) and error rate increased with increasing WM load. An exponential slope for RTs in the first session indicated dual-task interference at the 3-back level. However, a linear slope in the second session indicated a reduction of dual-task interference. Attenuation of interference in the second session suggested a reduction in executive demands of the task with practice. This suggests that practice effects occur within the n-back task and need to be controlled for in future neuroimaging research using the task.
Abstract:
The study reported in this article is part of a large-scale study investigating syntactic complexity in second language (L2) oral data in commonly taught foreign languages (English, German, Japanese, and Spanish; Ortega, Iwashita, Rabie, & Norris, in preparation). In this article, preliminary findings of the analysis of the Japanese data are reported. Syntactic complexity, which is referred to as syntactic maturity or the use of a range of forms with degrees of sophistication (Ortega, 2003), has long been of interest to researchers in L2 writing. In L2 speaking, researchers have examined syntactic complexity in learner speech in the context of pedagogic intervention (e.g., task type, planning time) and the validation of rating scales. In these studies complexity is examined using measures commonly employed in L2 writing studies. It is assumed that these measures are valid and reliable, but few studies explain what syntactic complexity measures actually examine. The language studied is predominantly English, and little is known about whether the findings of such studies can be applied to languages that are typologically different from English. This study examines how syntactic complexity measures relate to oral proficiency in Japanese as a foreign language. An in-depth analysis of speech samples from 33 learners of Japanese is presented. The results of the analysis are compared across proficiency levels and cross-referenced with 3 other proficiency measures used in the study. As in past studies, the length of T-units and the number of clauses per T-unit are found to be the best predictors of learner proficiency; these measures also had a significant linear relation with independent oral proficiency measures. These results are discussed in light of the notion of syntactic complexity and the interfaces between second language acquisition and language testing.
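The two measures the abstract singles out (mean length of T-unit and clauses per T-unit) are simple ratios. A minimal sketch, assuming a toy data layout (T-units as lists of clauses, clauses as lists of words) that is purely illustrative and not the study's coding scheme:

```python
# Hedged sketch of two common syntactic-complexity measures.
# Data format is an assumption for illustration only.

def mean_length_of_t_unit(t_units):
    """Mean number of words per T-unit."""
    word_counts = [sum(len(clause) for clause in t_unit) for t_unit in t_units]
    return sum(word_counts) / len(t_units)

def clauses_per_t_unit(t_units):
    """Mean number of clauses per T-unit."""
    return sum(len(t_unit) for t_unit in t_units) / len(t_units)

# Toy sample: two T-units; the first contains two clauses.
sample = [
    [["the", "student", "spoke"], ["because", "she", "was", "asked"]],
    [["he", "answered", "briefly"]],
]

print(mean_length_of_t_unit(sample))  # (7 + 3) / 2 = 5.0
print(clauses_per_t_unit(sample))     # (2 + 1) / 2 = 1.5
```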
Abstract:
Theoretical analyses of air traffic complexity were carried out using the Method for the Analysis of Relational Complexity. Twenty-two air traffic controllers examined static air traffic displays and were required to detect and resolve conflicts. Objective measures of performance included conflict detection time and accuracy. Subjective perceptions of mental workload were assessed by a complexity-sorting task and subjective ratings of the difficulty of different aspects of the task. A metric quantifying the complexity of pair-wise relations among aircraft was able to account for a substantial portion of the variance in the perceived complexity and difficulty of conflict detection problems, as well as reaction time. Other variables that influenced performance included the mean minimum separation between aircraft pairs and the amount of time that aircraft spent in conflict.
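One ingredient the abstract mentions, the minimum separation between an aircraft pair, is easy to make concrete. A hedged sketch (constant-velocity straight-line tracks, 2-D positions; this is a generic closest-approach computation, not the study's metric):

```python
import math

def min_separation(p1, v1, p2, v2):
    """Minimum future distance between two aircraft on straight-line,
    constant-velocity tracks (positions p + v*t for t >= 0).
    The relative distance |dp + dv*t| is minimized at
    t* = -(dp . dv) / |dv|^2, clamped to t* >= 0."""
    dpx, dpy = p1[0] - p2[0], p1[1] - p2[1]
    dvx, dvy = v1[0] - v2[0], v1[1] - v2[1]
    dv2 = dvx * dvx + dvy * dvy
    t_star = 0.0 if dv2 == 0 else max(0.0, -(dpx * dvx + dpy * dvy) / dv2)
    return math.hypot(dpx + dvx * t_star, dpy + dvy * t_star)

# Head-on pair on the same track line: separation shrinks to zero.
print(min_separation((0, 0), (1, 0), (10, 0), (-1, 0)))  # 0.0
# Same geometry offset by one unit laterally: closest approach is 1.0.
print(min_separation((0, 0), (1, 0), (10, 1), (-1, 0)))  # 1.0
```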
Abstract:
Collaborative filtering is regarded as one of the most promising recommendation algorithms. The item-based approaches for collaborative filtering identify the similarity between two items by comparing users' ratings on them. In these approaches, ratings produced at different times are weighted equally. That is to say, changes in user purchase interest are not taken into consideration. For example, an item that was rated recently by a user should have a bigger impact on the prediction of future user behaviour than an item that was rated a long time ago. In this paper, we present a novel algorithm to compute the time weights for different items in a manner that assigns a decreasing weight to old data. More specifically, users' purchase habits vary, and even the same user has quite different attitudes towards different items. Our proposed algorithm uses clustering to discriminate between different kinds of items. For each item cluster, we trace each user's change in purchase interest and introduce a personalized decay factor according to the user's own purchase behaviour. Empirical studies have shown that our new algorithm substantially improves the precision of item-based collaborative filtering without introducing higher-order computational complexity.
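A minimal sketch of the idea described above, assuming an exponential decay weight and a single scalar decay factor (the paper personalizes the decay per user and item cluster; the data layout here is purely illustrative):

```python
import math

def time_weight(age_days, decay):
    """Exponential decay: recent ratings count more, old ones less."""
    return math.exp(-decay * age_days)

def predict(target_sims, user_ratings, decay):
    """Time-weighted item-based prediction for one user and one target item.

    target_sims:  {item: similarity to the target item}
    user_ratings: {item: (rating, age_days)}, age_days = age of the rating
    decay:        decay factor (a single scalar here for illustration;
                  personalized per cluster in the paper's algorithm)
    """
    num = den = 0.0
    for item, sim in target_sims.items():
        if item in user_ratings:
            rating, age = user_ratings[item]
            w = sim * time_weight(age, decay)
            num += w * rating
            den += w
    return num / den if den else 0.0

sims = {"a": 1.0, "b": 1.0}
ratings = {"a": (5, 0), "b": (1, 100)}   # "a" rated today, "b" 100 days ago
print(predict(sims, ratings, 0.0))   # no decay: plain average, 3.0
print(predict(sims, ratings, 0.05))  # with decay: pulled toward the recent 5
```

With decay enabled, the 100-day-old rating of 1 is almost ignored, so the prediction moves close to the recent rating of 5.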
Abstract:
MICE (meetings, incentives, conventions, and exhibitions) has generated high foreign exchange revenue for economies worldwide. In Thailand, MICE tourists are recognized as 'quality' visitors, mainly because of their high spending potential. That said, Thailand's MICE sector has been affected by a number of crises since September 11, 2001. Consequently, professionals in the MICE sector must be prepared to deal with complex crisis phenomena that might occur in the future. While a number of studies have examined the complexity of crises in the tourism context, there has been little focus on such issues in the MICE sector. As chaos theory provides a particularly good model for crisis situations, the aim of this paper is to propose a chaos theory-based approach to understanding the complex and chaotic system of the MICE sector in times of crisis.
Abstract:
We propose a method for the timing analysis of concurrent real-time programs with hard deadlines. We divide the analysis into a machine-independent and a machine-dependent task. The latter takes into account the execution times of the program on a particular machine. Therefore, our goal is to make the machine-dependent phase of the analysis as simple as possible. We succeed in the sense that the machine-dependent phase remains the same as in the analysis of sequential programs. We shift the complexity introduced by concurrency completely to the machine-independent phase.
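The two-phase split described above can be sketched as follows. The machine-independent phase bounds how often each program part can execute; the machine-dependent phase supplies per-part execution times for one target machine, and combining them gives a timing bound. All names and numbers here are illustrative assumptions, not the paper's analysis:

```python
# Machine-independent result (assumed): worst-case execution counts
# per basic block, derived from program structure alone.
worst_case_counts = {"init": 1, "loop_body": 100, "cleanup": 1}

# Machine-dependent table (assumed): cycle cost of each block
# on one particular target machine.
cycles_machine_a = {"init": 40, "loop_body": 12, "cleanup": 25}

def wcet_bound(counts, cycle_table):
    """Combine both phases into a worst-case execution time bound (cycles):
    sum over blocks of (worst-case count) * (per-block cost)."""
    return sum(counts[b] * cycle_table[b] for b in counts)

print(wcet_bound(worst_case_counts, cycles_machine_a))  # 40 + 1200 + 25 = 1265
```

Retargeting to a different machine only swaps the cycle table; the machine-independent counts, where the concurrency-induced complexity lives, are reused unchanged.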
Abstract:
This paper derives the performance union bound of space-time trellis codes in an orthogonal frequency division multiplexing system (STTC-OFDM) over quasi-static frequency-selective fading channels based on the distance spectrum technique. The distance spectrum is the enumeration of the codeword difference measures and their multiplicities, obtained by exhaustively searching through all possible error event paths. The exhaustive search approach can be used for low memory order STTCs with small frame sizes. However, with moderate memory order STTCs and moderate frame sizes the computational cost of exhaustive search increases exponentially, and it may become impractical for high memory order STTCs. This calls for advanced computational techniques such as genetic algorithms (GAs). In this paper, a GA with a sharing function method is used to locate the multiple solutions of the distance spectrum for high memory order STTCs. Simulations evaluate the performance union bound and compare the complexity of the non-GA-aided and GA-aided distance spectrum techniques. They show that the union bound gives a close performance measure at high signal-to-noise ratio (SNR). They also show that the GA sharing function method based distance spectrum technique requires much less computational time than the exhaustive search approach, with satisfactory accuracy.
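Fitness sharing, the niching technique the abstract relies on, derates the fitness of crowded individuals so the GA can hold several distinct solutions (here, multiple distance-spectrum terms) in one population. A generic sketch with a 1-D genotype distance, not the paper's implementation:

```python
def sharing(d, sigma):
    """Triangular sharing kernel: individuals within distance sigma
    share fitness; beyond sigma they do not interact."""
    return 1.0 - d / sigma if d < sigma else 0.0

def shared_fitness(raw_fitness, positions, sigma):
    """Goldberg-style fitness sharing: divide each individual's raw
    fitness by its niche count (sum of sharing values to everyone)."""
    out = []
    for i, fi in enumerate(raw_fitness):
        niche = sum(sharing(abs(positions[i] - positions[j]), sigma)
                    for j in range(len(raw_fitness)))
        out.append(fi / niche)
    return out

# Two individuals sit on the same peak (position 0), one on another (10).
# The crowded pair's fitness is halved; the lone individual keeps its own.
print(shared_fitness([4.0, 4.0, 4.0], [0.0, 0.0, 10.0], 1.0))
# [2.0, 2.0, 4.0]
```

Selection on the shared values then favours under-populated niches, which is what lets the GA return several distance-spectrum solutions at once rather than converging to a single one.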
Abstract:
A 21-residue peptide in explicit water has been simulated using classical molecular dynamics. The system's trajectory has been analysed with a novel approach that quantifies how the trajectories of an atom's environment are explored. The approach is based on the measure of Statistical Complexity, which extracts complete dynamical information from the signal. The introduced characteristic quantifies the system's dynamics on the nanosecond time scale. It has been found that the peptide exhibits nanosecond-long periods that differ significantly in the rates at which the dynamically allowed configurations of the environment are explored. Within each such period the rate remains constant, but it differs from the rates of other periods and from the rate for water. Periods of dynamical frustration are detected, in which only limited routes through the space of possible trajectories of the surrounding atoms are realised.