56 results for accommodate
Abstract:
Particles of most virus species accurately package a single genome, but there are indications that the pleomorphic particles of parainfluenza viruses incorporate multiple genomes. We characterized a stable measles virus mutant that efficiently packages at least two genomes. The first genome is recombinant and codes for a defective attachment protein with an appended domain interfering with fusion-support function. The second has one adenosine insertion in a purine run that interrupts translation of the appended domain and restores function. In that genome, a one-base deletion in a different purine run abolishes polymerase synthesis but restores hexameric genome length, thus ensuring accurate RNA encapsidation, which is necessary for efficient replication. Thus, the two genomes are complementary. The infection kinetics of this mutant indicate that packaging of multiple genomes does not negatively affect growth. We also show that polyploid particles are produced in standard infections at no expense to infectivity. Our results illustrate how the particles of parainfluenza viruses efficiently accommodate cargoes of different volume, and suggest a mechanism by which segmented genomes may have evolved.
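The length bookkeeping behind "restores hexameric genome length" is the paramyxovirus "rule of six": genomes replicate efficiently only when their nucleotide count is a multiple of six. A minimal sketch of that arithmetic (the wild-type length is a known fact; the mutation counts are illustrative):

```python
# The "rule of six": paramyxovirus nucleocapsid proteins each cover six
# nucleotides, so replication is efficient only for lengths divisible by 6.

MEASLES_GENOME_LENGTH = 15894  # wild-type measles virus genome (nt), 15894 = 6 * 2649

def obeys_rule_of_six(length: int) -> bool:
    """True if the genome length is compatible with hexameric encapsidation."""
    return length % 6 == 0

# A single adenosine insertion alone breaks hexamer phasing...
assert not obeys_rule_of_six(MEASLES_GENOME_LENGTH + 1)
# ...but a compensating one-base deletion elsewhere restores it,
# which is why the second genome carries both changes.
assert obeys_rule_of_six(MEASLES_GENOME_LENGTH + 1 - 1)
```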
Abstract:
An electron beam ion trap (EBIT) has been designed and is currently under construction for use in atomic physics experiments at Queen's University Belfast. In contrast to traditional EBITs, where pairs of superconducting magnets are used, a pair of permanent magnets will be used to compress the electron beam. The permanent magnets have been designed in conjunction with bespoke vacuum ports to give unprecedented access for photon detection. Furthermore, the bespoke vacuum ports facilitate a versatile, reconfigurable trap structure able to accommodate various in-situ detectors and in-line charged particle analysers. Although the machine will have somewhat lower specifications than many existing EBITs in terms of beam current density, it is hoped that its unique features will facilitate a number of hitherto impossible studies involving interactions between electrons and highly charged ions. In this article the new machine's design is outlined, along with some suggestions of the types of processes to be studied once construction is completed.
Abstract:
Among the wives of eighteenth-century composers, perhaps no one is more favourably and affectionately described than Bach's second wife, Anna Magdalena (1701-1760). She has been commonly pictured as her husband's trusted assistant, copying his works in a hand that closely resembled her husband's beautiful calligraphy. No one appreciates her contributions more than today's musicologists, for her copies are usually 'neat and accurate', and are often among the most important primary sources when Bach's autographs do not survive. Occasionally, however, it is difficult to accommodate this patronising view of her role and its significance. It is well known, for instance, that her copy of Bach's Cello Suites (BWV 1007-1012) contains an unusually large number of inaccuracies and copying errors. One must ask how many of these blunders should be ascribed to her: how would a 'neat and accurate' copyist produce such an error-ridden manuscript if she had made it from a fair copy? In this paper, I shall first discuss Anna's copies of Bach's works, and see whether any particular patterns or tendencies in her copying activities emerge when these are placed in a broader chronological context. In an attempt to evaluate her performance as a copyist, I shall look at the typical situations in which she worked, while at the same time seeking to discover what additional value her copies may bring to our studies of Bach's life and works.
Abstract:
The purpose of this paper is to expose the concept of collaborative planning to the reality of planning, thereby assessing its efficacy for informing and explaining what planners 'really' do and can do. In this systematic appraisal, collaborative planning is disaggregated into four elements that can enlighten such conceptual frameworks: ontology, epistemology, ideology and methodology. These four lenses help delimit and clarify collaborative planning's strengths and weaknesses. The conceptual debate is related to an empirical investigation of planning processes, ranging from region-wide to local and from statutory to visionary in an arena where special care has been invested in participatory deliberation processes. The final analysis provides a systematic gauge of collaborative planning in light of the extensive empirical evidence, deploying the four conceptual dimensions introduced in part one. This exposes a range of problems not only with the concept itself but also regarding its affinity with the uncollaborative world within which it has to operate. The former shed light on those aspects where collaborative planning as a conceptual tool for practitioners needs to be renovated, while the latter highlight inconsistencies in a political framework that struggles to accommodate both global competitiveness and local democratic collaboration.
Abstract:
There is a perception that teaching space in universities is a rather scarce resource. However, some studies have revealed that in many institutions it is actually chronically under-used. Often, rooms are occupied only half the time, and even when in use they are often only half full. This is usually measured by the ‘utilization’, which is defined as the percentage of available ‘seat-hours’ that are employed. Within real institutions, studies have shown that this utilization can often take values as low as 20–40%. One consequence of such a low level of utilization is that space managers are under pressure to make more efficient use of the available teaching space. However, better management is hampered because there does not appear to be a good understanding within space management (near-term planning) of why this happens. This is accompanied, within space planning (long-term planning), by a lack of expertise on how best to accommodate the expected low utilizations. This motivates our two main goals: (i) to understand the factors that drive down utilizations, and (ii) to set up methods to provide better space planning. Here, we provide quantitative evidence that constraints arising from timetabling and location requirements easily have the potential to explain the low utilizations seen in reality. Furthermore, on considering the decision question ‘Can this given set of courses all be allocated in the available teaching space?’ we find that the answer depends on the associated utilization in a way that exhibits threshold behaviour: there is a sharp division between regions in which the answer is ‘almost always yes’ and those of ‘almost always no’.
Through analysis and understanding of the space of potential solutions, our work suggests that better use of space within universities will come about through an understanding of the effects of timetabling constraints and when it is statistically likely that it will be possible for a set of courses to be allocated to a particular space. The results presented here provide a firm foundation for university managers to take decisions on how space should be managed and planned for more effectively. Our multi-criteria approach and new methodology together provide new insight into the interaction between the course timetabling problem and the crucial issue of space planning.
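The threshold behaviour described above can be illustrated with a toy Monte Carlo model. This is a sketch under assumed room and course-size distributions, with one timeslot and no timetabling constraints, so it only hints at the effect the paper quantifies:

```python
import random

def can_allocate(course_sizes, room_sizes):
    """One-timeslot toy model: each course needs its own room with at
    least as many seats. Matching sorted courses to sorted rooms is an
    optimal strategy for this relaxed problem."""
    rooms = sorted(room_sizes, reverse=True)
    courses = sorted(course_sizes, reverse=True)
    return len(courses) <= len(rooms) and all(
        c <= r for c, r in zip(courses, rooms))

def success_rate(n_courses, trials=500, seed=1):
    """Estimate P('all courses can be allocated') at a given course load."""
    rng = random.Random(seed)
    rooms = list(range(22, 62, 2))  # 20 rooms of 22-60 seats (assumed)
    hits = sum(
        can_allocate([rng.randint(10, 60) for _ in range(n_courses)], rooms)
        for _ in range(trials))
    return hits / trials

# Success drops sharply as the course load approaches the room stock,
# long before every seat-hour is used.
for n in (5, 10, 15, 20):
    print(n, success_rate(n))
```

Even this stripped-down model shows a sharp transition from "almost always yes" to "almost always no" as load grows, at utilizations well below 100%.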
Abstract:
A conventional thin film capacitor heterostructure, consisting of sol-gel deposited lead zirconate titanate (PZT) layers with sputtered platinum top and bottom electrodes, was subjected to fatiguing pulses at a variety of frequencies. The fatigue characteristics were compared to those of a similarly processed capacitor in which a ~20 nm tungsten trioxide layer had been deposited, using pulsed laser deposition, between the ferroelectric and the upper electrode. The expectation was that, because of its ability to accommodate considerable oxygen non-stoichiometry, tungsten trioxide (WO3) might act as an efficient sink for any oxygen vacancies flushed to the electrode-ferroelectric boundary layer during repetitive switching, and hence would improve the fatigue characteristics of the thin film capacitor. However, it was found that, in general, the addition of tungsten trioxide actually increases the rate of fatigue. It appears that any potential benefit from the WO3, in terms of absorbing oxygen vacancies, is far outweighed by the dramatically increased charge injection it causes in the system.
Abstract:
This paper analyses multivariate statistical techniques for identifying and isolating abnormal process behaviour. These techniques include contribution charts and variable reconstructions that relate to the application of principal component analysis (PCA). The analysis reveals, firstly, that contribution charts produce variable contributions which are linearly dependent and may lead to an incorrect diagnosis if the number of principal components retained is close to the number of recorded process variables. The analysis secondly yields that variable reconstruction affects the geometry of the PCA decomposition. The paper further introduces an improved variable reconstruction method for identifying multiple sensor and process faults and for isolating their influence upon the recorded process variables. It is shown that this method accommodates the effect of reconstruction, i.e. changes in the covariance matrix of the sensor readings, and correctly re-defines the PCA-based monitoring statistics and their confidence limits.
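A minimal numpy sketch of the kind of PCA contribution chart discussed above, for the squared prediction error (SPE, or Q) statistic. The data, dimensions, retained components and the faulty sensor are all illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated normal operation: 5 process variables driven by 2 latent factors.
W = np.array([[1.0, 1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0, 1.0]])
latent = rng.normal(size=(500, 2))
X = latent @ W + 0.1 * rng.normal(size=(500, 5))

# PCA model: retain k = 2 principal components of the centred data.
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
P = Vt[:2].T                                  # loadings, shape (5, 2)

def spe_contributions(x):
    """Per-variable contributions to the Q (SPE) statistic: squared
    residuals of one sample after projection onto the PCA subspace."""
    r = (x - mu) - P @ (P.T @ (x - mu))
    return r ** 2

# A bias fault on sensor 3 shows up as the dominant bar in the chart.
x_fault = X[0] + 5.0 * np.eye(5)[3]
contrib = spe_contributions(x_fault)
print(int(contrib.argmax()))  # -> 3
```

Note that, as the paper warns, such charts become unreliable when the retained components approach the number of variables; here k = 2 of 5, well away from that regime.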
Abstract:
In this letter, we demonstrate for the first time that gate misalignment is not a critical limiting factor for low voltage operation in gate-underlap double gate (DG) devices. Our results show that the underlap architecture significantly extends the tolerable limit of gate misalignment in 25 nm devices. DG MOSFETs with a high degree of gate misalignment and an optimal gate-underlap design can perform comparably to, or even better than, self-aligned non-underlap devices. Results show that the spacer-to-straggle (s/σ) ratio, a key design parameter for underlap devices, should be within the range of 2.3-3.0 to accommodate back gate misalignment. These results are very significant, as the stringent process control requirements for achieving self-alignment in nanoscale planar DG MOSFETs are considerably relaxed.
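The quoted design window can be captured in a trivial check. The function name and nanometre units are ours; only the 2.3-3.0 s/σ range comes from the letter:

```python
def tolerates_back_gate_misalignment(spacer_nm: float, straggle_nm: float) -> bool:
    """Gate-underlap design rule: the spacer-to-straggle ratio s/sigma
    should lie in roughly 2.3-3.0 for the device to accommodate
    back-gate misalignment (range taken from the letter)."""
    ratio = spacer_nm / straggle_nm
    return 2.3 <= ratio <= 3.0

print(tolerates_back_gate_misalignment(13.0, 5.0))  # 13/5 = 2.6 -> True
print(tolerates_back_gate_misalignment(10.0, 5.0))  # 10/5 = 2.0 -> False
```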
Abstract:
On the basis of comparative morphology and phylogenetic analyses of rbcL and LSU rDNA sequence data, a new genus, Gayliella gen. nov., is proposed to accommodate the Ceramium flaccidum complex (C. flaccidum, C. byssoideum, C. gracillimum var. byssoideum, and C. taylorii), C. fimbriatum, and a previously undescribed species from Australia. C. transversale is reinstated and recognized as a distinct species. Through this study, G. flaccida (Kützing) comb. nov., G. transversalis (Collins et Hervey) comb. nov., G. fimbriata (Setchell et N. L. Gardner) comb. nov., G. taylorii comb. nov., G. mazoyerae sp. nov., and G. womersleyi sp. nov. are proposed on the basis of detailed comparative morphology. The species referred to as C. flaccidum and C. dawsonii from Brazil also belong to the new genus. Comparison of Gayliella with Ceramium shows that it differs from the latter by having an alternate branching pattern; three cortical initials per periaxial cell, of which the third is directed basipetally and divides horizontally; and unicellular rhizoids produced from periaxial cells. Our phylogenetic analyses of rbcL and LSU rDNA gene sequence data confirm that Gayliella gen. nov. represents a monophyletic clade distinct from most Ceramium species, including the type species, C. virgatum. We also transfer C. recticorticum to the new genus Gayliella.
Abstract:
There is a perception that teaching space in universities is a rather scarce resource. However, some studies have revealed that in many institutions it is actually chronically under-used. Often, rooms are occupied only half the time, and even when in use they are often only half full. This is usually measured by the “utilisation”, which is basically the percentage of available ‘seat-hours’ that are employed. In real institutions, this utilisation can often take values as low as 20-40%. One consequence of such low utilisation is that space managers are under pressure to make more efficient use of the available teaching space. However, better management is hampered because there does not appear to be a good understanding within space management (near-term planning) of why this happens, nor a good basis within space planning (long-term planning) for how best to accommodate the expected low utilisations. This motivates our two main goals: (i) To understand the factors that drive down utilisations, (ii) To set up methods to provide better space planning. Here, we provide quantitative evidence that constraints arising from timetabling and location requirements easily have the potential to explain the low utilisations seen in reality. Furthermore, on considering the decision question “Can this given set of courses all be allocated in the available teaching space?” we find that the answer depends on the associated utilisation in a way that exhibits threshold behaviour: There is a sharp division between regions in which the answer is “almost always yes” and those of “almost always no”. Our work suggests that progress in space management and planning will arise from an integrated approach: combining purely space issues with restrictions representing an aggregated or abstracted version of key constraints such as timetabling or location, and
Abstract:
Almost free-standing single crystal mesoscale and nanoscale dots of ferroelectric BaTiO3 have been made by direct focused ion beam patterning of bulk single crystal material. The domain structures which appear in these single crystal dots, after cooling through the Curie temperature, were observed to form into quadrants, with each quadrant consisting of fine 90° stripe domains. The reason that these rather complex domain configurations form is uncertain, but we consider and discuss three possibilities for their genesis: first, that the quadrant features initially form to facilitate field-closure, but then develop 90° shape compensating stripe domains in order to accommodate disclination stresses; second, that they are the result of the impingement of domain packets which nucleate at the sidewalls of the dots forming “Forsbergh” patterns (essentially the result of phase transition kinetics); and third, that 90° domains form to conserve the shape of the nanodot as it is cooled through the Curie temperature but arrange into quadrant packets in order to minimize the energy associated with uncompensated surface charges (thus representing an equilibrium state). While the third model is the preferred one, we note that the second and third models are not mutually exclusive.
Abstract:
A graphical method is presented for determining the capability of individual system nodes to accommodate wind power generation. The method is based upon constructing a capability chart for each node at which a wind farm is to be connected. The capability chart defines the domain of allowable power injections at the candidate node, subject to constraints imposed by voltage limits, voltage stability and equipment capability limits being satisfied. The chart is first derived for a two-bus model, before being extended to a multi-node power system. The graphical method is employed to derive the chart for a two-node system and is then applied to a multi-node power system, with the IEEE 30-bus test system as a case study. Although the proposed method is derived with the intention of determining the wind farm capacity to be connected at a specific node, it can be used for the analysis of PQ-bus loading as well as generation.
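The two-bus construction can be sketched numerically: scan candidate injections at the connection node, solve the two-bus power flow, and keep the points that satisfy the voltage constraint. The line impedance, ±5% voltage limits and grid resolution are assumptions; the paper's chart additionally includes stability and equipment limits:

```python
def bus2_voltage(P, Q, V1=1.0, Z=0.02 + 0.08j, tol=1e-10):
    """Fixed-point solve of the two-bus power flow V2 = V1 - Z*conj(S/V2),
    with S = P + jQ the complex power drawn at bus 2 (negative P models
    injection, e.g. a wind farm). All quantities are in per unit."""
    S = complex(P, Q)
    V2 = complex(V1)
    for _ in range(200):
        V2_next = V1 - Z * (S / V2).conjugate()
        if abs(V2_next - V2) < tol:
            return V2_next
        V2 = V2_next
    return None  # no convergence: outside the voltage-stability region

# Toy capability region: injections keeping |V2| within +/-5% of nominal.
grid = [i / 10 for i in range(-20, 21)]
feasible = [(P, Q) for P in grid for Q in grid
            if (V2 := bus2_voltage(P, Q)) is not None
            and 0.95 <= abs(V2) <= 1.05]
```

Plotting `feasible` in the P-Q plane gives the domain of allowable injections for this node; the chart's boundary shifts with the assumed impedance and limits.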
Abstract:
This paper proposes a new hierarchical learning structure, namely holistic triple learning (HTL), for extending the binary support vector machine (SVM) to multi-classification problems. For an N-class problem, HTL constructs a decision tree up to a given depth. A leaf node of the decision tree is allowed to be placed with a holistic triple learning unit whose generalisation abilities are assessed and approved, while each of the remaining nodes in the decision tree accommodates a standard binary SVM classifier. The holistic triple classifier is a regression model trained on three classes, whose training algorithm originates from a recently proposed implementation technique, namely the least-squares support vector machine (LS-SVM). A major novelty of the holistic triple classifier is the reduced number of support vectors in the solution. For the resultant HTL-SVM, an upper bound on the generalisation error can be obtained. The time complexity of training the HTL-SVM is analysed and shown to be comparable to that of training the one-versus-one (1-vs-1) SVM, particularly on small-scale datasets. Empirical studies show that the proposed HTL-SVM achieves competitive classification accuracy with a reduced number of support vectors compared to the popular 1-vs-1 alternative.
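The LS-SVM building block named above reduces training to a single linear solve rather than a quadratic program. A minimal binary sketch in numpy (function-estimation form with an RBF kernel; the data, kernel width and regularisation C are illustrative assumptions, and this is the binary unit only, not the full HTL tree):

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_fit(X, y, C=10.0):
    """LS-SVM training: one (n+1)x(n+1) linear system.
    Labels y are in {-1, +1}."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X) + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, dual coefficients alpha

def lssvm_predict(X_train, b, alpha, X_new):
    return np.sign(rbf(X_new, X_train) @ alpha + b)

# Two well-separated blobs as a toy binary problem.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [5., 5.], [5., 6.], [6., 5.]])
y = np.array([-1., -1., -1., 1., 1., 1.])
b, alpha = lssvm_fit(X, y)
print(lssvm_predict(X, b, alpha, np.array([[0.5, 0.5], [5.5, 5.5]])))
# -> [-1.  1.]
```

One consequence visible even here: every training point receives a nonzero alpha, which is the sparseness problem the holistic triple classifier is designed to reduce.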
Abstract:
Relevance theory (Sperber & Wilson, 1995) suggests that people expend cognitive effort when processing information in proportion to the cognitive effects to be gained from doing so. This theory has been used to explain how people apply their knowledge appropriately when evaluating category-based inductive arguments (Medin, Coley, Storms, & Hayes, 2003). In such arguments, people are told that a property is true of premise categories and are asked to evaluate the likelihood that it is also true of conclusion categories. According to the relevance framework, reasoners generate hypotheses about the relevant relation between the categories in the argument. We reasoned that premises inconsistent with early hypotheses about the relevant relation would have greater effects than consistent premises. We designed three-premise garden-path arguments where the same third premise was either consistent or inconsistent with likely hypotheses about the relevant relation. In Experiments 1 and 2, we showed that the effort expended processing consistent premises (measured via reading times) was significantly less than the effort expended on inconsistent premises. In Experiments 2 and 3, we demonstrated a direct relation between cognitive effect and cognitive effort. For garden-path arguments, belief change given inconsistent third premises was significantly correlated with Premise 3 (Experiment 3) and conclusion (Experiments 2 and 3) reading times. For consistent arguments, the correlation between belief change and reading times did not approach significance. These results support the relevance framework for induction but are difficult to accommodate under other approaches.