26 results for Parallel design multicenter
Abstract:
Statistical approaches have been applied to examine amino acid pairing preferences within parallel beta-sheets. The main chain hydrogen bonding pattern in parallel beta-sheets means that, for each residue pair, only one of the residues is involved in main chain hydrogen bonding with the strand containing the partner residue. We call this the hydrogen bonded (HB) residue and the partner residue the non-hydrogen bonded (nHB) residue, and differentiate between the favourability of a pair and that of its reverse pair, e.g. Asn(HB)-Thr(nHB) versus Thr(HB)-Asn(nHB). Significantly (p <= 0.000001) favoured pairings were rationalised using stereochemical arguments. For instance, Asn(HB)-Thr(nHB) and Arg(HB)-Thr(nHB) were favoured pairs, where the residues adopted favoured chi(1) rotamer positions that allowed side-chain interactions to occur. In contrast, Thr(HB)-Asn(nHB) and Thr(HB)-Arg(nHB) were not significantly favoured, and could only form side-chain interactions if the residues involved adopted less favourable chi(1) conformations. The favourability of hydrophobic pairs, e.g. Ile(HB)-Ile(nHB), Val(HB)-Val(nHB) and Leu(HB)-Ile(nHB), was explained by the residues adopting their most preferred chi(1) and chi(2) conformations, which enabled them to form nested arrangements. Cysteine-cysteine pairs were significantly favoured, although these did not form intrasheet disulphide bridges. Interactions between positively and negatively charged residues were asymmetrically preferred: those with the negatively charged residue at the HB position were more favoured. This trend was accounted for by the presence of general electrostatic interactions, which, based on analysis of distances between charged atoms, were likely to be stronger when the negatively charged residue was the HB partner. The Arg(HB)-Asp(nHB) interaction was an exception to this trend and its favourability was rationalised by the formation of specific side-chain interactions. This research provides rules that could be applied to protein structure prediction, comparative modelling and protein engineering and design. The methods used to analyse the pairing preferences are automated and detailed results are available (http://www.rubic.rdg.ac.uk/betapairprefsparallel/). (c) 2005 Elsevier Ltd. All rights reserved.
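The abstract does not spell out the statistic behind these pairing preferences; purely as an illustration, the sketch below computes a simple log-odds pairing score, comparing the observed count of an ordered (HB, nHB) residue pair with the count expected if the HB and nHB positions paired independently. The function name and all counts are hypothetical, not taken from the paper.

```python
# An illustrative calculation only (the abstract does not state the exact statistic):
# a log-odds pairing score comparing the observed count of an ordered (HB, nHB)
# residue pair with the count expected under independence of the two positions.
# All residue counts below are hypothetical.
import math
from collections import Counter

def pairing_score(pair_counts: Counter, hb: str, nhb: str) -> float:
    """log2(observed / expected) for the ordered pair (HB residue, nHB residue)."""
    total = sum(pair_counts.values())
    observed = pair_counts[(hb, nhb)]
    hb_total = sum(c for (h, _), c in pair_counts.items() if h == hb)
    nhb_total = sum(c for (_, n), c in pair_counts.items() if n == nhb)
    expected = hb_total * nhb_total / total
    return math.log2(observed / expected)

if __name__ == "__main__":
    counts = Counter({("ASN", "THR"): 40, ("THR", "ASN"): 18,
                      ("ILE", "ILE"): 55, ("VAL", "VAL"): 47,
                      ("ASN", "ASN"): 12, ("THR", "THR"): 20})
    # The two orderings are scored separately, reflecting the HB / nHB distinction.
    print(round(pairing_score(counts, "ASN", "THR"), 2))
    print(round(pairing_score(counts, "THR", "ASN"), 2))
```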
Abstract:
The expression of proteins using recombinant baculoviruses is a mature and widely used technology. However, some aspects of the technology continue to detract from high throughput use and the basis of the final observed expression level is poorly understood. Here, we describe the design and use of a set of vectors developed around a unified cloning strategy that allow parallel expression of target proteins in the baculovirus system as N-terminal or C-terminal fusions. Using several protein kinases as test cases, we found that amino-terminal fusion to maltose binding protein rescued expression of the poorly expressed human kinase Cot but had only a marginal effect on expression of the well-expressed kinase IKK-2. In addition, MBP fusion proteins were found to be secreted from the expressing cell. Use of a carboxyl-terminal GFP tagging vector showed that fluorescence measurement paralleled expression level and was a convenient readout in the context of insect cell expression, an observation that was further supported with additional non-kinase targets. Expression of the target proteins from the same vectors in vitro showed that differences in expression level were wholly dependent on the environment of the expressing cell, and an investigation of the time course of expression showed that it could substantially affect the observed expression level for poorly expressed, but not well-expressed, proteins. Our vector suite approach shows that a rapid expression survey can be achieved within the baculovirus system and, in addition, goes some way towards identifying the underlying basis of the expression level obtained. (c) 2006 Elsevier Inc. All rights reserved.
Abstract:
The design space of emerging heterogeneous multi-core architectures with reconfigurable elements makes it feasible to design mixed fine-grained and coarse-grained parallel architectures. This paper presents a hierarchical composite array design that extends the current design space of regular array design by combining a sequence of transformations. This technique is applied to derive a new design of a pipelined parallel regular array with different dataflow between phases of computation.
Abstract:
As consumers demand more functionality from their electronic devices and manufacturers meet that demand, electrical power and clock requirements tend to increase; however, reassessing the system architecture can lead to suitable reductions. To maintain low clock rates and therefore reduce electrical power, this paper presents a parallel convolutional coder for the transmit side of many wireless consumer devices. The coder accepts a parallel data input and directly computes punctured convolutional codes without the need for a separate puncturing operation, while the coded bits are available at the output of the coder in a parallel fashion. Because the computation is performed in parallel, the coder can be clocked seven times slower than a conventional shift-register based convolutional coder (using the DVB rate-7/8 code). The presented coder is directly relevant to the design of modern low-power consumer devices.
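For context, the sketch below shows the conventional serial approach the coder above improves on: a shift-register encoder for the DVB rate-1/2 mother code followed by a separate rate-7/8 puncturing step. The generator polynomials (171 and 133 octal, constraint length 7) and the puncture pattern are assumed from the DVB-S specification; the paper's parallel coder, by contrast, computes the punctured bits directly without this separate puncturing operation.

```python
# A reference sketch of the conventional serial approach (not the parallel coder
# described above): shift-register encoding of the DVB rate-1/2 mother code,
# followed by a separate rate-7/8 puncturing step. Generator polynomials and the
# puncture pattern are assumed from the DVB-S specification.

K = 7                             # constraint length
G1, G2 = 0o171, 0o133             # generator polynomials (octal)
PUNCT_X = [1, 0, 0, 0, 1, 0, 1]   # rate-7/8 puncturing pattern, first output
PUNCT_Y = [1, 1, 1, 1, 0, 1, 0]   # rate-7/8 puncturing pattern, second output

def parity(x: int) -> int:
    """XOR of all bits of x."""
    return bin(x).count("1") & 1

def encode_rate_7_8(bits):
    """Serially encode `bits` at rate 1/2, then puncture to rate 7/8."""
    state, out = 0, []
    for i, b in enumerate(bits):
        state = (state >> 1) | (b << (K - 1))   # newest bit enters at the MSB
        x = parity(state & G1)                   # first coded bit
        y = parity(state & G2)                   # second coded bit
        if PUNCT_X[i % 7]:
            out.append(x)
        if PUNCT_Y[i % 7]:
            out.append(y)
    return out

if __name__ == "__main__":
    data = [1, 0, 1, 1, 0, 0, 1]                        # 7 input bits ...
    print(len(encode_rate_7_8(data)), "coded bits")     # ... give 8 coded bits (rate 7/8)
```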
Abstract:
Purpose – The purpose of this paper is to consider Turing's two tests for machine intelligence: the parallel-paired, three-participant game presented in his 1950 paper, and the "jury-service" one-to-one measure described two years later in a radio broadcast. Both versions were instantiated in practical Turing tests during the 18th Loebner Prize for artificial intelligence hosted at the University of Reading, UK, in October 2008. This involved jury-service tests in the preliminary phase and parallel-paired tests in the final phase. Design/methodology/approach – Almost 100 test results from the final have been evaluated and this paper reports some intriguing nuances which arose as a result of the unique contest. Findings – In the 2008 competition, Turing's 30 per cent pass rate was not achieved by any machine in the parallel-paired tests, but Turing's modified prediction of "at least in a hundred years time" is remembered. Originality/value – The paper presents actual responses from "modern Elizas" to human interrogators during contest dialogues that show considerable improvement in artificial conversational entities (ACE). Unlike their ancestor – Weizenbaum's natural language understanding system – ACE are now able to recall, share information and disclose personal interests.
Abstract:
Both the (5,3) counter and (2,2,3) counter multiplication techniques are investigated for the efficiency of their operation speed and the viability of the architectures when implemented in a fast bipolar ECL technology. The implementation of the counters in series-gated ECL and in threshold logic is contrasted for speed, noise immunity and complexity, and is critically compared with the fastest practical design of a full adder. A novel circuit technique is presented to overcome the need for high fan-in input weights in threshold circuits through the use of negatively weighted inputs. The authors conclude that a (2,2,3) counter based array multiplier implemented in series-gated ECL should enable a significant increase in speed over conventional full-adder based array multipliers.
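As a behavioural reference (not the series-gated ECL or threshold-logic circuits discussed above), the sketch below assumes the usual (p,q) counter definition: p equally weighted input bits compressed into a q-bit count, so that a full adder is a (3,2) counter and a (5,3) counter compresses five column bits at once. The (2,2,3) counter, a generalised counter with mixed input weights, is not modelled here.

```python
# Behavioural models only, assuming the usual (p, q) counter definition:
# p equally weighted input bits compressed into a q-bit binary count.

def full_adder(a: int, b: int, cin: int) -> tuple[int, int]:
    """A (3,2) counter: the conventional cell in carry-save array multipliers."""
    total = a + b + cin                       # 0 .. 3
    return (total >> 1) & 1, total & 1        # (carry, sum)

def counter_5_3(a: int, b: int, c: int, d: int, e: int) -> tuple[int, int, int]:
    """A (5,3) counter: five weight-1 bits in, their 3-bit count out."""
    total = a + b + c + d + e                 # 0 .. 5
    return (total >> 2) & 1, (total >> 1) & 1, total & 1

if __name__ == "__main__":
    print(full_adder(1, 1, 1))                # -> (1, 1): count of ones is 3
    print(counter_5_3(1, 1, 0, 1, 1))         # -> (1, 0, 0): count of ones is 4
```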
Abstract:
The authors compare various array multiplier architectures based on (p,q) counter circuits. The tradeoff in multiplier design is always between adding complexity and increasing speed. It is shown that by using a (2,2,3) counter cell it is possible to gain a significant increase in speed over a conventional full-adder, carry-save array based approach. The increase in complexity should be easily accommodated using modern emitter-coupled-logic processes.
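To make the conventional baseline concrete, here is a minimal behavioural sketch of a full-adder based carry-save column reduction for an unsigned array multiplier; the speed gain from substituting (2,2,3) counter cells is a circuit-level effect that this functional model does not capture. Function names and the 4-bit width are illustrative only.

```python
# A behavioural model of the conventional baseline: an unsigned array multiplier
# whose partial-product columns are reduced with full adders ((3,2) counters).

def full_adder(a: int, b: int, cin: int) -> tuple[int, int]:
    """The (3,2) counter cell: three weight-1 bits in, (carry, sum) out."""
    s = a ^ b ^ cin
    cout = (a & b) | (a & cin) | (b & cin)
    return cout, s

def array_multiply(x: int, y: int, n: int = 4) -> int:
    """Multiply two n-bit unsigned integers by column-wise carry-save reduction."""
    cols = [[] for _ in range(2 * n)]
    for i in range(n):                       # generate the partial-product bits
        for j in range(n):
            cols[i + j].append(((x >> j) & 1) & ((y >> i) & 1))
    result = 0
    for w in range(2 * n):                   # reduce each column to a single bit
        col = cols[w]
        while len(col) >= 3:                 # one full adder per pass
            carry, s = full_adder(col.pop(), col.pop(), col.pop())
            col.append(s)
            if w + 1 < 2 * n:
                cols[w + 1].append(carry)
        if len(col) == 2:                    # final half-adder step
            a, b = col.pop(), col.pop()
            col.append(a ^ b)
            if w + 1 < 2 * n:
                cols[w + 1].append(a & b)
        result |= (col[0] if col else 0) << w
    return result

if __name__ == "__main__":
    assert array_multiply(13, 11) == 143     # 4-bit example: 1101 x 1011
    print("carry-save array multiply OK")
```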
Abstract:
A great number of studies on wind conditions in passages between slab-type buildings have been conducted in the past. However, wind conditions under different building structures and configurations remain unclear, and existing studies still cannot provide guidance on urban planning and design, owing to the complexity of buildings and aerodynamics. The aim of this paper is to provide more insight into the mechanism of wind conditions in passages. In this paper, a simplified passage model with non-parallel buildings is developed on the basis of the wind tunnel experiments conducted by Blocken et al. (2008). Numerical simulation based on CFD is employed for a detailed investigation of the wind environment in passages between two long, narrow buildings with different orientations, and model validation is performed by comparing the numerical results with the corresponding wind tunnel measurements.
Abstract:
A recent study conducted by Blocken et al. (Numerical study on the existence of the Venturi effect in passages between perpendicular buildings. Journal of Engineering Mechanics, 2008, 134: 1021-1028) challenged the popular view of the existence of the 'Venturi effect' in building passages, since the wind there is exposed to an open boundary. The present research extends the work of Blocken et al. (2008a) to a more general setup, with the building orientation varying from 0° to 180°, using CFD simulations. Our results reveal that the passage flow is mainly determined by the combination of corner streams. It is also shown that converging passages have a stronger wind-blocking effect than diverging passages, explained by a lower wind speed and a higher drag coefficient. Fluxes on the top plane of the passage volume reverse from outflow to inflow in the cases of α=135°, 150° and 165°. A simple mathematical expression relating the flux ratio to the geometric parameters has been developed to aid wind design in an urban neighborhood. In addition, a converging passage with α=15° is recommended for urban wind design in cold and temperate climates, since the passage flow changes smoothly and a relatively lower wind speed is expected compared with the situation without buildings. For high-density urban areas in (sub)tropical climates such as Hong Kong, where more wind is desired, a diverging passage with α=150° is a better choice to promote ventilation at the pedestrian level.
Abstract:
The use of economic incentives for biodiversity (mostly Compensation and Reward for Environmental Services, including Payment for ES) has been widely supported in the past decades and has become one of the main innovative policy tools for biodiversity conservation worldwide. These policy tools are often based on the insight that rational actors perfectly weigh the costs and benefits of adopting certain behaviors, and that well-crafted economic incentives and disincentives will lead to socially desirable development scenarios. This rationalist mode of thought has provided interesting insights and results, but it also misjudges the context in which 'real individuals' come to decisions, and the multitude of factors influencing development sequences. In this study, our goal is to examine how these policies can take advantage of some unintended behavioral reactions that might in turn impact, either positively or negatively, overall policy performance. We test the effect of the income's origin ('Low effort' based money vs. 'High effort' based money) on spending decisions (Necessity vs. Superior goods) and subsequent pro-social preferences (Future pro-environmental behavior) in rural areas of Madagascar, using a natural field experiment. Our results show that money obtained under low effort leads to different consumption patterns than money obtained under high effort: superior goods are more salient in the case of low-effort money. In parallel, money obtained under low effort leads to subsequently higher pro-social behavior. Compensation and reward policies for ecosystem services may mobilize knowledge on behavioral biases to improve their design and foster positive spillovers on their development goals.
Abstract:
This is a study of graphic information designed for Future Books/Future magazine (UK) and Fortune magazine (USA) in the years immediately after the Second World War. It highlights work made by the Isotype Institute for Future, which is then situated against contributions by Abram Games and F. H. K. Henrion. Similar work in Fortune under the art editorship of Will Burtin is discussed in a parallel account, drawing on examples by him and by others, including György Kepes, Matthew Liebowitz, Alex Steinweiss and Ladislav Sutnar. Attention is drawn to links and relationships between the two periodicals and the graphic information published in both. Further comparisons are made between the underlying editorial and design strategies pursued by Otto Neurath (Isotype Institute) and Will Burtin. An argument is made for recognising the little-known innovations of Future alongside the long-acknowledged innovations of Fortune.