872 results for High Performance Computing
Abstract:
This paper describes the design and manufacture of a set of precision cooled (210 K) narrow-bandpass filters for the infrared imager and sounder on the Indian Space Research Organisation (ISRO) INSAT-3D meteorological satellite. We discuss the basis for the choice of multilayer coating designs and materials for 21 different filter channels, together with their temperature dependence, thin-film deposition technologies, substrate metrology, and environmental durability performance. (C) 2008 Optical Society of America.
Abstract:
The High Resolution Dynamics Limb Sounder is described, with particular reference to the atmospheric measurements to be made and the rationale behind the measurement strategy. The demands this strategy places on the filters to be used in the instrument, and the designs to which it leads, are described. A second set of filters at an intermediate image plane, introduced to reduce "ghost imaging", is discussed together with their required spectral properties. A method is described in which the spectral characteristics of the primary and secondary filters in each channel are combined with the spectral response of the detectors and other optical elements to obtain the system spectral response, weighted appropriately for the Planck function and atmospheric limb absorption. This method is used to verify whether the out-of-band spectral blocking requirement for a channel is being met, and an example calculation shows how the blocking is built up for a representative channel. Finally, the techniques used to produce filters of the necessary sub-millimetre sizes are discussed, together with the testing methods and procedures used to assess environmental durability and establish space-flight quality.
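The combination step described above can be sketched in a few lines. The following is a minimal illustration only, with Gaussian stand-ins for the filter and detector curves and an arbitrary passband and scene temperature; all names and numbers are assumptions, not values from the instrument, and a real calculation would also fold in the atmospheric limb absorption along the line of sight:

```python
import numpy as np

def integrate(y, x):
    """Trapezoidal integration over an irregular grid."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def planck(wl_um, temp_k):
    """Planck spectral radiance B(lambda, T); wavelength given in micrometres."""
    wl = wl_um * 1e-6
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    return (2.0 * h * c**2 / wl**5) / np.expm1(h * c / (wl * k * temp_k))

# Hypothetical component responses on a common wavelength grid (micrometres).
wl = np.linspace(5.0, 20.0, 4000)
t_primary = np.exp(-0.5 * ((wl - 12.0) / 0.15) ** 2)            # primary bandpass filter
t_secondary = np.exp(-0.5 * ((wl - 12.0) / 0.60) ** 2)          # ghost-suppression filter
d_detector = np.clip(1.2 - 0.04 * np.abs(wl - 12.0), 0.0, 1.0)  # detector response

# System response: the product of all elements, weighted by the Planck
# function at a representative scene temperature.
weighted = t_primary * t_secondary * d_detector * planck(wl, 250.0)

# Out-of-band blocking check: fraction of the weighted signal outside the passband.
in_band = (wl > 11.5) & (wl < 12.5)
oob = integrate(np.where(in_band, 0.0, weighted), wl) / integrate(weighted, wl)
print(f"out-of-band fraction of system signal: {oob:.2e}")
```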
Abstract:
Infrared optical multilayer filters and materials were exposed to the space environment of low Earth orbit on the Long Duration Exposure Facility (LDEF). This paper summarizes the effects of that environment on the physical and optical properties of the filters and materials flown.
Abstract:
Infrared multilayer interference filters have been used extensively in satellite radiometers for about 15 years. Filters manufactured by the University of Reading have been used in Nimbus 5, 6, and 7, TIROS-N, and the Pioneer Venus orbiter. The ability of the filters to withstand the space environment in these applications is critical; if degradation takes place, the effects can range from degraded signal-to-noise performance to complete system failure. An experiment on the LDEF will enable the filters, for the first time, to be subjected to authoritative spectral measurements following space exposure, to ascertain their suitability for spacecraft use and to permit an understanding of degradation mechanisms.
Abstract:
This study was undertaken to explore gel permeation chromatography (GPC) for estimating the molecular weights of proanthocyanidin fractions isolated from sainfoin (Onobrychis viciifolia). The results were compared with data obtained by thiolytic degradation of the same fractions. Polystyrene, polyethylene glycol and polymethyl methacrylate standards were not suitable for estimating the molecular weights of underivatized proanthocyanidins. Therefore, a novel HPLC-GPC method was developed based on two serially connected PolarGel-L columns using DMF that contained 5% water, 1% acetic acid and 0.15 M LiBr at 0.7 ml/min and 50 °C. This yielded a single calibration curve for galloyl glucoses (trigalloyl glucose, pentagalloyl glucose), ellagitannins (pedunculagin, vescalagin, punicalagin, oenothein B, gemin A), proanthocyanidins (procyanidin B2, cinnamtannin B1), and several other polyphenols (catechin, epicatechin gallate, epigallocatechin gallate, amentoflavone). These GPC-predicted molecular weights represent a considerable advance over previously reported HPLC-GPC methods for underivatized proanthocyanidins. (C) 2011 Elsevier B.V. All rights reserved.
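As a rough illustration of how such a calibration curve is used, the sketch below fits the classic GPC relation (log molecular weight approximately linear in retention time) and inverts it for an unknown; the standards and retention times are invented for the example, not taken from the paper:

```python
import numpy as np

# Hypothetical calibration points: retention time (min) against known
# molecular weight (g/mol) for standards run on the same column set.
retention = np.array([14.2, 13.1, 12.4, 11.0, 10.3])
mol_weight = np.array([290.0, 442.0, 636.0, 940.0, 1576.0])

# Classic GPC calibration: log10(MW) is approximately linear in elution
# time, with earlier elution corresponding to larger molecules.
slope, intercept = np.polyfit(retention, np.log10(mol_weight), 1)

def predict_mw(rt_min):
    """Estimate the molecular weight of an unknown from its retention time."""
    return 10.0 ** (slope * rt_min + intercept)

print(f"estimated MW at 11.6 min: {predict_mw(11.6):.0f} g/mol")
```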
Abstract:
Polymer-stabilised liquid crystals are systems in which a small amount of monomer is dissolved within a liquid crystalline host, and then polymerised in situ to produce a network. The progress of the polymerisation, performed within electro-optic cells, was studied by establishing an analytical method novel to these systems. Samples were prepared by photopolymerisation of the monomer under well-defined reaction conditions; subsequent immersion in acetone caused the host and any unreacted monomer to dissolve. High performance liquid chromatography was used to separate and detect the various solutes in the resulting solutions, enabling the amount of unreacted monomer for a given set of conditions to be quantified. Longer irradiations cause a decrease in the proportion of unreacted monomer since more network is formed, while a more uniform LC director alignment (achieved by decreasing the sample thickness) or a higher level of order (achieved by decreasing the polymerisation temperature) promotes faster reactions.
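The quantification step lends itself to a simple back-calculation: the residual-monomer peak area is converted to a concentration via an external calibration, and conversion is the fraction of the initial loading no longer present. The sketch below uses invented numbers purely for illustration:

```python
# All values below are assumed for illustration, not taken from the study.
AREA_TO_MG_PER_ML = 0.012    # external calibration factor (assumed)
INITIAL_MONOMER = 0.50       # mg/mL monomer dissolved in the LC host (assumed)

def conversion(peak_area):
    """Fraction of monomer incorporated into the network, from the residual peak."""
    unreacted = AREA_TO_MG_PER_ML * peak_area
    return 1.0 - unreacted / INITIAL_MONOMER

# Longer irradiation -> smaller residual-monomer peak -> higher conversion.
for area in (30.0, 18.5, 6.2):
    print(f"peak area {area:5.1f} -> conversion {conversion(area):.1%}")
```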
Abstract:
Active robot force control requires some form of dynamic inner loop control for stability. The author considers the implementation of position-based inner loop control on an industrial robot fitted with encoders only. It is shown that high gain velocity feedback for such a robot, which is effectively stationary when in contact with a stiff environment, involves problems beyond the usual caveats on the effects of unknown environment stiffness. It is shown that it is possible for the controlled joint to become chaotic at very low velocities if encoder edge timing data are used for velocity measurement. The results obtained indicate that there is a lower limit on controlled velocity when encoders are the only means of joint measurement. This lower limit to speed is determined by the desired amount of loop gain, which is itself determined by the severity of the nonlinearities present in the drive system.
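The low-speed difficulty can be seen from the measurement geometry alone: with edge timing, a new velocity estimate arrives only when the next encoder edge does, so the age of the measurement grows inversely with speed. A minimal sketch, with an assumed encoder resolution rather than that of the robot studied in the paper:

```python
import math

# With edge timing, velocity is (angle per edge) / (time between edges), so at
# low speed the estimate is refreshed only after a long, speed-dependent delay.
# Feeding such a stale measurement into a high-gain velocity loop is what
# invites instability. The encoder resolution below is assumed.
EDGE_ANGLE = 2.0 * math.pi / 4096    # rad per edge (4096 edges/rev, assumed)

def measurement_delay(velocity):
    """Time until the next edge, i.e. the worst-case age of the velocity estimate."""
    return EDGE_ANGLE / velocity

for v in (1.0, 0.1, 0.01, 0.001):    # joint speed, rad/s
    print(f"v = {v:6.3f} rad/s -> velocity estimate up to {measurement_delay(v)*1e3:8.2f} ms old")
```

At 1 mrad/s the estimate in this example can be over a second old, which makes high loop gain untenable and suggests why a lower limit on controlled velocity appears.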
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change appear mainly as changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenge of developing the capability to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities, each with computer capability of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a scientific workforce sufficient to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to determine what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level; current limits on computing power have severely constrained such an investigation, which is now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather-climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it, and will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.
Abstract:
Denaturing high-performance liquid chromatography (DHPLC) was evaluated as a rapid screening and identification method for detecting DNA sequence variation in the quinolone resistance-determining region of gyrA from Salmonella serovars. A total of 203 Salmonella isolates were screened using this method. DHPLC analysis of 14 isolates representing each type of novel or multiple mutation, together with the wild type, was compared with LightCycler-based PCR-gyrA hybridization mutation assay (GAMA) and single-strand conformational polymorphism (SSCP) analyses. The 14 isolates gave seven different SSCP patterns, and LightCycler detected four different mutations. DHPLC detected 11 DNA sequence variants at eight different codons, including those detected by LightCycler or SSCP. One of these mutations was silent. Five isolates contained multiple mutations, and four of these could be distinguished from the composite sequence variants by their DHPLC profile. Seven novel mutations were identified at five different loci not previously described in quinolone-resistant Salmonella. DHPLC analysis proved advantageous for the detection of novel and multiple mutations, and provides a rapid, high-throughput alternative to LightCycler and SSCP for screening frequently occurring mutations.
Abstract:
Aims: Quinolone antibiotics are the agents of choice for treating systemic Salmonella infections. Resistance to quinolones is usually mediated by mutations in the DNA gyrase gene gyrA. Here we report the evaluation of standard HPLC equipment for the detection of mutations (single nucleotide polymorphisms; SNPs) in gyrA, gyrB, parC and parE by denaturing high performance liquid chromatography (DHPLC).
Methods: A panel of Salmonella strains was assembled comprising strains with known different mutations in gyrA (n = 8) and fluoroquinolone-susceptible and -resistant strains (n = 50) that had not been tested for mutations in gyrA. Additionally, antibiotic-susceptible strains of serotypes other than Salmonella enterica serovar Typhimurium were examined for serotype-specific mutations in gyrB (n = 4), parC (n = 6) and parE (n = 1). Wild-type (WT) control DNA was prepared from Salmonella Typhimurium NCTC 74. The DNA of the respective strains was amplified by PCR using Optimase® proofreading DNA polymerase. Duplex DNA samples were analysed using an Agilent A1100 HPLC system with a Varian Helix™ DNA column. Sequencing was used to validate mutations detected by DHPLC in the strains with unknown mutations.
Results: Using this HPLC system, mutations in gyrA, gyrB, parC and parE were readily detected by comparison with control chromatograms. Sequencing confirmed the gyrA mutations predicted by DHPLC in the unknown strains and also confirmed serotype-associated sequence changes in non-Typhimurium serotypes.
Conclusions: The results demonstrate that a non-specialist standard HPLC machine fitted with a generally available column can be used to detect SNPs in the gyrA, gyrB, parC and parE genes by DHPLC. Wider applications should be possible.
Abstract:
The use of virtualization in high-performance computing (HPC) has been suggested as a means to provide the tailored services and added functionality that many users expect from full-featured Linux cluster environments. Virtual machines can offer several benefits in HPC, but maintaining performance is crucial, and in some instances performance criteria are placed above isolation properties. This selective relaxation of isolation for performance is an important characteristic when considering resilience for HPC environments that employ virtualization. In this paper we consider some of the factors associated with balancing performance and isolation in configurations that employ virtual machines. In this context, we propose a classification of errors based on the concept of "error zones", as well as a detailed analysis of the trade-offs between resilience and performance based on the level of isolation provided by virtualization solutions. Finally, a set of experiments performed with different virtualization solutions illustrates these trade-offs.
Abstract:
Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model that is illustrative of the computation (and communication) involved in climate models. These models are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience motivates the investigation of a data-driven application-benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow water model as an example.
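In the spirit of the traditional technique described, a computation-only time estimate can be as simple as total work divided by an achieved aggregate flop rate. The sketch below is a toy model with every count and rate assumed; the paper's models are built from the actual source code and measured machine characteristics:

```python
# Toy computation-time model for a stencil-style shallow water code.
# All counts and rates below are assumptions for illustration.
FLOPS_PER_POINT = 65          # useful flops per grid point per timestep (assumed)
GRID_POINTS = 1024 * 1024     # horizontal grid size (assumed)
TIMESTEPS = 1000
CORES = 64
PEAK_PER_CORE = 9.2e9         # peak flop rate per core (assumed)
EFFICIENCY = 0.08             # fraction of peak a memory-bound stencil achieves (assumed)

def predicted_seconds():
    """Computation time = total work / achieved aggregate flop rate."""
    total_flops = FLOPS_PER_POINT * GRID_POINTS * TIMESTEPS
    achieved_rate = CORES * PEAK_PER_CORE * EFFICIENCY
    return total_flops / achieved_rate

print(f"predicted compute time: {predicted_seconds():.1f} s")
```

A usable model would add communication terms (halo exchanges, collectives) and, as the paper finds for Interlagos, may still need calibration against benchmark runs rather than source-code analysis alone.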