759 results for Spectroscopy computing
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change appear mainly as changes in the statistics of regional weather variations, the scientific and computational requirements for reliable prediction are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenge of developing the capability to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities, each with a computing capability of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a scientific workforce sufficient to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to determine what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level; limited computing power has so far severely constrained this badly needed investigation. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it, and it will ultimately enable the climate community to provide society with climate predictions based on our best scientific knowledge and the most advanced technology.
Abstract:
Pocket Data Mining (PDM) is our new term for the collaborative mining of streaming data in mobile and distributed computing environments. With vast numbers of data streams now available for subscription on smart mobile phones, using these data for decision making with data stream mining techniques has become feasible owing to the increasing power of these handheld devices. Wireless communication among the devices using Bluetooth and WiFi technologies has opened the door to collaborative mining among mobile devices within the same range that run data mining techniques targeting the same application. This paper proposes a new architecture, which we have prototyped, for realizing applications in this area. We propose using mobile software agents for several reasons; most importantly, the autonomous, intelligent behaviour of agent technology has been the driving force for adopting it here. Further efficiency reasons are discussed in detail in the paper. Experimental results showing the feasibility of the proposed architecture are presented and discussed.
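To make the collaborative-mining idea concrete, here is a minimal sketch, not the authors' prototype: agent objects visit nearby device objects, run each device's local classifier over its own stream, and combine the answers by majority vote. All names (Device, MiningAgent, the toy anomaly classifiers) are hypothetical.

```python
# Minimal sketch of collaborative stream mining with mobile agents.
# All names are hypothetical; the real PDM prototype uses mobile agent middleware.
from collections import Counter

class Device:
    def __init__(self, name, stream, classify):
        self.name = name          # a peer reachable over Bluetooth/WiFi in PDM
        self.stream = stream      # local data stream (here: a list of readings)
        self.classify = classify  # locally owned classifier

class MiningAgent:
    """Visits devices in range and collects one vote per device."""
    def visit(self, device, query):
        return device.classify(query, device.stream)

def collaborative_prediction(devices, query):
    agent = MiningAgent()
    votes = [agent.visit(d, query) for d in devices]
    return Counter(votes).most_common(1)[0][0]   # majority vote

# Toy usage: three phones vote on whether a new reading is anomalous.
devs = [
    Device("phone-A", [1, 2, 3], lambda q, s: q > max(s)),
    Device("phone-B", [2, 4, 8], lambda q, s: q > max(s)),
    Device("phone-C", [1, 1, 2], lambda q, s: q > max(s)),
]
print(collaborative_prediction(devs, 5))  # anomalous on 2 of 3 devices -> True
```

The design point this caricatures is that only the agent and its votes cross the wireless link; the streams themselves never leave the handsets.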
Abstract:
The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data and a data warehouse. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular, we look at two aspects: first, how grid data management technologies can be used to access the distributed data warehouses; and second, how the grid can be used to transfer analysis programs to the primary repositories. The latter is an important and challenging aspect of P-found because the data volumes involved are too large to be centralised. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling new scientific discoveries.
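The "ship the analysis to the data" pattern is easy to caricature. The sketch below is our illustration only, with a placeholder Repository class standing in for real grid middleware: the analysis function executes at each primary repository, and only small per-site summaries travel back.

```python
# Sketch of moving analysis to the data rather than centralising the data.
# Repository and mean_unfolding_time are placeholders, not P-found's API.
import statistics

class Repository:
    """Stands in for one primary P-found installation."""
    def __init__(self, site, trajectories):
        self.site = site
        self.trajectories = trajectories  # large local simulation data

    def run(self, analysis):
        # The analysis executes where the data live; only the
        # (small) result travels back over the network.
        return analysis(self.trajectories)

def mean_unfolding_time(trajectories):
    return statistics.mean(t["unfold_ns"] for t in trajectories)

sites = [
    Repository("site-1", [{"unfold_ns": 12.0}, {"unfold_ns": 15.5}]),
    Repository("site-2", [{"unfold_ns": 9.8}]),
]
summaries = {r.site: r.run(mean_unfolding_time) for r in sites}
print(summaries)  # per-site summaries, combined centrally
```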
Abstract:
A novel combination of site-specific isotope labelling, polarised infrared spectroscopy and molecular combing reveals local orientational ordering in the fibril-forming peptide YTIAALLSPYSGGRADS. Use of 13C-18O labelled alanine residues demonstrates that the N-terminal end of the peptide is incorporated into the cross-beta structure, while the C-terminal end shows orientational disorder.
Abstract:
Purpose: This paper aims to design an evaluation method that enables an organization to assess its current IT landscape and provide a readiness assessment prior to Software as a Service (SaaS) adoption. Design/methodology/approach: The research employs a mix of quantitative and qualitative approaches for conducting an IT application assessment. Quantitative data, such as end users' feedback on the IT applications, contribute to the technical impact on efficiency and productivity. Qualitative data, such as business domain, business services and IT application cost drivers, are used to determine the business value of the IT applications in an organization. Findings: The assessment of IT applications leads to decisions on the suitability of each IT application for migration to a cloud environment. Research limitations/implications: The evaluation of how a particular IT application impacts a business service is based on logical interpretation. A data mining method is suggested to derive patterns of IT application capabilities. Practical implications: This method has been applied in a local council in the UK, helping the council decide the future status of its IT applications for cost-saving purposes.
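In caricature, an assessment of this kind scores each application on technical impact (from end-user feedback) and business value (from domain, services and cost drivers) and thresholds the combined score. The sketch below is our illustration; the weights, fields and threshold are invented, not the paper's instrument.

```python
# Illustrative SaaS-readiness scoring; weights and thresholds are invented.
def readiness(app):
    technical = 0.5 * app["efficiency"] + 0.5 * app["productivity"]  # user feedback, 0-10
    business = 0.6 * app["service_criticality"] + 0.4 * app["cost_pressure"]
    return 0.5 * technical + 0.5 * business

apps = [
    {"name": "email", "efficiency": 8, "productivity": 7,
     "service_criticality": 6, "cost_pressure": 9},
    {"name": "gis", "efficiency": 4, "productivity": 5,
     "service_criticality": 9, "cost_pressure": 2},
]
for app in apps:
    verdict = "migrate" if readiness(app) >= 6.0 else "keep on-premise"
    print(app["name"], round(readiness(app), 1), verdict)
```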
Abstract:
The chemical specificity of terahertz spectroscopy, when combined with techniques for sub-wavelength sensing, is giving new understanding of processes occurring at the nanometre scale in biological systems, and it offers the potential for single-molecule detection of chemical and biological agents and explosives. In addition, terahertz techniques are enabling the exploration of the fundamental behaviour of light when it interacts with nanoscale optical structures, and are being used to measure ultrafast carrier dynamics, transport and localisation in nanostructures. This chapter explains how terahertz scale modelling can be used to explore the fundamental physics of nano-optics; it discusses the terahertz spectroscopy of nanomaterials, terahertz near-field microscopy and other sub-wavelength techniques, and summarises recent developments in the terahertz spectroscopy and imaging of biological systems at the nanoscale. The potential of these techniques for security applications is also considered.
Abstract:
Traditionally, functional magnetic resonance imaging (fMRI) has been used to map activity in the human brain by measuring increases in the Blood Oxygenation Level Dependent (BOLD) signal. Positive BOLD fMRI signal changes are often accompanied by sustained negative signal changes. Previous studies investigating the neurovascular coupling mechanisms of the negative BOLD phenomenon used concurrent 2D-optical imaging spectroscopy (2D-OIS) and electrophysiology (Boorman et al., 2010). These experiments suggested that the negative BOLD signal in response to whisker stimulation resulted from an increase in deoxy-haemoglobin and reduced multi-unit activity in the deep cortical layers. However, Boorman et al. (2010) did not measure the BOLD and haemodynamic responses concurrently and so could not quantitatively compare either the spatial maps or the 2D-OIS and fMRI time series directly. Furthermore, their study utilised a homogeneous tissue model, which is predominantly sensitive to haemodynamic changes in the more superficial layers. Here we test whether the 2D-OIS technique is appropriate for studies of negative BOLD. We used concurrent fMRI and 2D-OIS at 7 Tesla to investigate the haemodynamics underlying the negative BOLD response. We investigated whether optical methods could accurately map and measure the negative BOLD phenomenon by using 2D-OIS haemodynamic data to derive predictions from a biophysical model of BOLD signal changes. We showed that, despite the deep cortical origin of the negative BOLD response, 2D-OIS can be used to investigate the negative BOLD phenomenon provided an appropriate heterogeneous tissue model is used in the spectroscopic analysis.
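For orientation, one widely used biophysical forward model of this kind is the Buxton–Obata form (our choice of illustration; the paper does not necessarily use exactly this parametrisation). It maps the haemodynamic quantities measurable by 2D-OIS, normalised deoxy-haemoglobin content q and normalised blood volume v, to a predicted BOLD change:

```latex
\frac{\Delta S}{S} \;=\; V_0\!\left[\,k_1\,(1-q) \;+\; k_2\!\left(1-\frac{q}{v}\right) \;+\; k_3\,(1-v)\right]
```

Here V0 is the resting venous blood volume fraction and k1, k2, k3 are field-strength-dependent constants; a negative BOLD response follows when q rises and/or v falls, which is why haemodynamic measurements can be used to predict the fMRI signal.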
Abstract:
In this paper we describe a novel combination of Raman spectroscopy, isotope editing and X-ray scattering as a powerful approach for obtaining detailed structural information on aromatic side chains in peptide fibrils. The orientation of the tyrosine residues with respect to the fibril axis in fibrils of the peptide YTIAALLSPYS has been determined from a combination of polarised Raman spectroscopy and X-ray diffraction measurements. The Raman intensity of selected tyrosine bands collected at different polarisation geometries is related to the values and orientation of the Raman tensor for those specific vibrations. Using published Raman tensor values, we solved the relevant expressions for both tyrosine residues present in this peptide. Ring deuteration of one of the two tyrosine side chains allowed the calculation to be performed for each residue individually, by virtue of the isotopic shift that eliminates band overlap. Sample disorder was taken into account by obtaining the distribution of sample orientations from X-ray diffraction experiments. The results provide previously unavailable details of the molecular conformation of this peptide and demonstrate the value of this approach for the study of amyloid fibrils.
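The underlying relation is the standard one for polarised Raman scattering (our notation, not the paper's exact expressions): the intensity measured for incident and scattered polarisation unit vectors depends on the Raman tensor of the mode, averaged over the orientation distribution obtained from the X-ray data,

```latex
I(\hat{e}_i,\hat{e}_s) \;\propto\; \big\langle\, \lvert \hat{e}_s \cdot \boldsymbol{\alpha} \cdot \hat{e}_i \rvert^{2} \,\big\rangle_{f(\theta)}
```

where α is the Raman tensor of the selected tyrosine band and f(θ) the distribution of fibril orientations; measuring I in several polarisation geometries yields equations that can be solved for the side-chain orientation.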
Abstract:
Using high-time-resolution (72 ms) spectroscopy of AE Aqr obtained with LRIS on Keck II, we have determined the spectrum and spectral evolution of a small flare. Continuum and integrated line fluxes in the flare spectrum are measured, and the evolution of the flare is parametrized for future comparison with detailed flare models. We find that the velocities of the flaring components are consistent with those previously reported for AE Aqr by Welsh, Horne & Gomer and by Horne. The characteristics of the 33-s oscillations are investigated: we derive the oscillation amplitude spectrum and, from that, determine the spectrum of the heated regions on the rotating white dwarf. Blackbody fits to the major and minor pulse spectra and an analysis of the emission-line oscillation properties highlight the shortfalls of the simple hotspot model for the oscillations.
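Deriving an oscillation amplitude spectrum from such data amounts to fitting a sinusoid at the known spin period in every wavelength channel. A minimal sketch of that step (synthetic data and array shapes are our assumptions, not the paper's pipeline):

```python
# Least-squares sinusoid fit at a known period, one amplitude per channel.
import numpy as np

def amplitude_spectrum(t, flux, period):
    """flux has shape (n_times, n_wavelengths); returns amplitude per channel."""
    w = 2 * np.pi / period
    # Design matrix: flux(t) ~ A cos(wt) + B sin(wt) + C
    X = np.column_stack([np.cos(w * t), np.sin(w * t), np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(X, flux, rcond=None)  # solves all channels at once
    return np.hypot(coef[0], coef[1])                # amplitude = sqrt(A^2 + B^2)

# Synthetic demo: 72 ms sampling, a 33.08 s oscillation in 4 wavelength channels.
t = np.arange(0, 300, 0.072)
true_amp = np.array([0.5, 1.0, 2.0, 0.1])
flux = true_amp * np.sin(2 * np.pi * t / 33.08)[:, None] + 10.0
print(amplitude_spectrum(t, flux, 33.08))  # recovers ~ [0.5, 1.0, 2.0, 0.1]
```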
Abstract:
We present an analysis of rapid Keck spectroscopy of the cataclysmic variables AM Her (a polar) and SS Cyg (a dwarf nova). We decompose the spectra into constant and variable components and identify different types of variability in AM Her with different characteristic timescales. The variable flickering component of the accretion disc flux and the observational characteristics of a small flare in SS Cyg are isolated.
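The constant/variable decomposition itself is simple: per wavelength channel, the time-averaged spectrum is the constant component and the rms of the residuals is the variability spectrum. A minimal sketch under that reading (our interpretation of the standard technique):

```python
import numpy as np

def decompose(flux):
    """flux: array of shape (n_times, n_wavelengths).
    Returns (constant, variable) spectra."""
    constant = flux.mean(axis=0)         # time-averaged spectrum
    variable = flux.std(axis=0, ddof=1)  # rms variability spectrum
    return constant, variable
```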
Abstract:
We present fast (72 ms) spectroscopy of AM Her obtained at an intermediate brightness state just before a rise to the high state. Interesting features in the line behaviour of AM Her are noted, and the variability spectrum is presented and compared to that of SS Cyg.
Abstract:
We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. In place of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data (SIMD) operations. The modified algorithm runs more than 50 times faster on the Cell's Synergistic Processing Elements than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the graphics processor tested, using OpenCL, we find a speed-up of more than 2.5 times compared with the original code on the host CPU. Because the radiation code accounts for more than 60% of the total CPU time, FAMOUS as a whole executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
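The scheduling idea translates directly into a short sketch: a pool of workers pulls packs of four air columns from a work list, and each pack is computed with one vectorised operation across the four columns. This is our illustration only (the project's code is C/OpenCL; NumPy vectorisation stands in for SIMD, and the radiation kernel is a toy):

```python
# Task-queue/thread-pool scheduling of column packs, with 4 columns
# processed together (NumPy vectorisation standing in for SIMD).
import numpy as np
from concurrent.futures import ThreadPoolExecutor

N_COLUMNS, N_LEVELS, PACK = 1024, 60, 4
columns = np.random.rand(N_COLUMNS, N_LEVELS)

def radiation_for_pack(pack):
    """Toy stand-in for the per-column radiation computation,
    applied to all 4 columns of the pack simultaneously."""
    return np.cumsum(pack, axis=1) * 0.5  # same operation across the pack

packs = [columns[i:i + PACK] for i in range(0, N_COLUMNS, PACK)]
with ThreadPoolExecutor(max_workers=8) as pool:          # the thread pool
    results = list(pool.map(radiation_for_pack, packs))  # the task queue
heating = np.vstack(results)
assert heating.shape == columns.shape
```

The design point is load balancing: independent column packs let idle workers grab the next task, where a fixed domain decomposition can leave processors waiting.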
The impact of office productivity cloud computing on energy consumption and greenhouse gas emissions
Abstract:
Cloud computing is usually regarded as energy efficient and thus as emitting less greenhouse gas (GHG) than traditional forms of computing. When the energy consumption of Microsoft's cloud-based Office 365 (O365) and traditional Office 2010 (O2010) software suites was tested and modeled, some cloud services were found to consume more energy than the traditional form. The model developed in this research took into consideration the energy consumed at the three main stages of data transmission: the data center, the network, and the end-user device. Comparable products from each suite were selected, and activities were defined for each product to represent a different computing type. Microsoft provided highly confidential data for the data center stage, while the network and user device stages were measured directly. A new measurement and software apportionment approach was defined and utilized, allowing the power consumption of cloud services to be measured directly at the user device stage. Results indicated that cloud computing is more energy efficient for Excel and Outlook, which consumed less energy and emitted less GHG than their standalone counterparts: the power consumption of cloud-based Outlook and Excel was 8% and 17% lower, respectively, than that of the traditional versions. However, the power consumption of the cloud version of Word was 17% higher than its traditional equivalent, and a third, mixed access method measured for Word emitted 5% more GHG than the traditional version. It is evident that cloud computing may not provide a unified way forward to reduce energy consumption and GHG emissions. Decisions on converting a standalone package to cloud provision can now take energy and GHG emissions into account at the software development and cloud service design stages, using the methods described in this research.
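In outline, the model sums per-activity energy over the three stages (our notation, not the paper's symbols):

```latex
E_{\text{total}} \;=\; E_{\text{data centre}} \;+\; E_{\text{network}} \;+\; E_{\text{end-user device}}
```

A cloud service therefore comes out ahead only when its added data-centre and network energy is smaller than the saving it achieves on the end-user device, which is why the per-product results can go either way.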
Abstract:
Bank of England notes of £20 denomination have been studied using infrared spectroscopy in order to develop a method for identifying forged notes. An aim of this work was to develop a non-destructive method so that a small, compact Fourier transform infrared (FT-IR) spectrometer could be used by bank workers, police departments or others, such as shop assistants, to identify forged notes in a non-lab setting. The ease of use of the instrument is key to this method, as is its relatively low cost. The presence of a peak at 1400 cm−1 arising from νasym () in the blank paper section of a forged note proved to be a reliable indicator that a note was counterfeit, for the notes that we studied. Moreover, differences between the spectra of forged and genuine £20 notes were observed in the ν(OH) (ca. 3500 cm−1), ν(CH) (ca. 2900 cm−1) and ν(CO) (ca. 1750 cm−1) regions of the IR spectrum recorded for the polymer film covering the holographic strip. In cases where these simple tests fail, we have shown how an infrared microscope can be used to differentiate genuine and forged banknotes further, by producing infrared maps of selected areas of the note that contrast inks with the background paper.
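The first screening step is essentially a presence/absence check for a band near 1400 cm−1 in the blank-paper spectrum. A toy version of that check (window, baseline estimate and threshold are invented for illustration, not the paper's calibration):

```python
# Toy forgery screen: flag a note if its blank-paper IR spectrum shows a
# peak near 1400 cm^-1. Window and threshold are invented for illustration.
import numpy as np

def suspicious(wavenumbers, absorbance,
               centre=1400.0, half_width=15.0, threshold=0.05):
    window = np.abs(wavenumbers - centre) <= half_width
    baseline = np.median(absorbance[~window])      # crude local baseline
    return absorbance[window].max() - baseline > threshold
```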