916 results for High spectral resolution detectors


Relevance:

100.00%

Publisher:

Abstract:

The general circulation models used to simulate global climate typically feature resolution too coarse to reproduce many smaller-scale processes, which are crucial to determining the regional responses to climate change. A novel approach to downscaling climate change scenarios is presented that includes the interactions between the North Atlantic Ocean and the European shelves, as well as their impact on the North Atlantic and European climate. The goal of this paper is to introduce the global-ocean/regional-atmosphere coupling concept and to show the potential benefits of this model system for simulating present-day climate. A global ocean-sea ice-marine biogeochemistry model (MPIOM/HAMOCC) with regionally high horizontal resolution is coupled to a regional atmospheric model (REMO) and a global terrestrial hydrology model (HD) via the OASIS coupler. Results obtained with ROM using NCEP/NCAR reanalysis and ECHAM5/MPIOM CMIP3 historical simulations as boundary conditions are presented and discussed for the North Atlantic and Northern European region. The validation of all model components, i.e., ocean, atmosphere, terrestrial hydrology, and ocean biogeochemistry, is performed and discussed. The careful and detailed validation of ROM provides evidence that the proposed model system improves the simulation of many aspects of the regional climate, most notably the ocean, even though some biases persist in other model components, leaving potential for future improvement. We conclude that ROM is a powerful tool for estimating possible impacts of climate change on the regional scale.


Current commercially available Doppler lidars provide an economical and robust solution for measuring vertical and horizontal wind velocities, together with the ability to provide co- and cross-polarised backscatter profiles. The high temporal resolution of these instruments allows turbulent properties to be obtained from the variation in radial velocities. However, the instrument specifications mean that certain characteristics, especially the background noise behaviour, become a limiting factor for instrument sensitivity in regions where the aerosol load is low. Turbulence calculations require an accurate estimate of the contribution from velocity uncertainty, which is directly related to the signal-to-noise ratio; any bias in the signal-to-noise ratio will propagate through as a bias in turbulent properties. In this paper we present a method to correct for artefacts in the background noise behaviour of commercially available Doppler lidars and to reduce the signal-to-noise ratio threshold used to discriminate between noise and cloud or aerosol signals. We show that, for Doppler lidars operating continuously at a number of locations in Finland, data availability can be increased by as much as 50% after performing this background correction and the subsequent reduction in the threshold. The reduction in bias also greatly improves subsequent calculations of turbulent properties in weak-signal regimes.
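Such a background correction reduces to estimating the residual noise bias from signal-free range gates and subtracting it before thresholding. A minimal sketch follows; the function names, the far-gate noise assumption, and the threshold value are illustrative, not the instrument's actual processing chain:

```python
import numpy as np

def correct_background(snr, noise_gates=slice(-100, None)):
    """Subtract a residual per-profile background bias from an SNR profile.

    Assumes the farthest range gates (noise_gates) contain no aerosol or
    cloud signal, so their median estimates the background bias.
    """
    bias = np.median(snr[noise_gates])
    return snr - bias

def signal_mask(snr, threshold=0.008):
    """Flag gates whose corrected SNR exceeds a (here arbitrary) threshold."""
    return snr > threshold
```

With the bias removed, the threshold can be lowered without admitting noise, which is the mechanism behind the increased data availability in weak-signal regimes.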


Astronomy has evolved almost exclusively through the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies it is possible to obtain data cubes that combine both techniques simultaneously, producing images with spectral resolution. Extracting information from them can be quite complex, so the development of new methods of data analysis is desirable. We present a method for the analysis of data cubes (data from single-field observations, containing two spatial dimensions and one spectral dimension) that uses Principal Component Analysis (PCA) to express the data in a form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms a system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data on to these coordinates produce images we call tomograms. The association of the tomograms (images) with the eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this property is fundamental for their handling and interpretation. When the data cube shows objects that present uncorrelated physical phenomena, the eigenvectors' orthogonality may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, suppress noise, compress data, extract spectra, etc. We applied the method, for illustration purposes only, to the central region of the low-ionization nuclear emission region (LINER) galaxy NGC 4736, and demonstrate that it has a type 1 active nucleus, not known before. Furthermore, we show that it is displaced from the centre of its stellar bulge.
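The decomposition described above can be sketched with a generic PCA on a data cube; this is an illustrative implementation on synthetic data, not the authors' code, and the array shapes and names are assumptions:

```python
import numpy as np

def pca_tomography(cube):
    """Decompose a (ny, nx, nl) data cube into eigenvectors and tomograms.

    Each spatial pixel holds a spectrum of nl channels.  Columns of the
    returned eigvecs matrix are the eigen-spectra (mutually orthogonal),
    and each tomogram is the projection of every pixel onto one of them.
    """
    ny, nx, nl = cube.shape
    X = cube.reshape(ny * nx, nl).astype(float)
    X -= X.mean(axis=0)                       # subtract the mean spectrum
    cov = X.T @ X / (X.shape[0] - 1)          # channel-channel covariance
    var, eigvecs = np.linalg.eigh(cov)        # eigh returns ascending order
    order = np.argsort(var)[::-1]             # reorder by decreasing variance
    var, eigvecs = var[order], eigvecs[:, order]
    tomograms = (X @ eigvecs).T.reshape(nl, ny, nx)  # one image per eigenvector
    return tomograms, eigvecs, var
```

Inspecting the first few tomogram/eigenvector pairs then reveals the dominant correlated structures, while the trailing components mostly carry noise and can be discarded for compression or filtering.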


In order to validate the Geant4 toolkit for dosimetry applications, simulations were performed to calculate conversion coefficients h(10, α) from air kerma free-in-air to personal dose equivalent Hp(10, α). The simulations consisted of two parts: the production of X-rays with radiation qualities of the narrow and wide spectrum series, and the interaction of the radiation with ICRU tissue-equivalent and ISO water slab phantoms. The half-value layers of the X-ray spectra obtained by simulation were compared with experimental results, and the mean energies, spectral resolutions, half-value layers and conversion coefficients were compared with ISO reference values. The good agreement between the simulation results and the reference data shows that Geant4 is suitable for dosimetry applications involving photons with energies in the range of tens to a few hundred keV. © 2008 Elsevier Ltd. All rights reserved.
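For reference, a half-value layer can be read off a simulated transmission curve by interpolation. The sketch below is standalone Python, not Geant4 code, and the attenuation coefficient is an arbitrary illustrative value:

```python
import math

def half_value_layer(thicknesses, transmissions):
    """Thickness at which transmission first drops to 0.5 (linear interpolation).

    thicknesses must be increasing and transmissions must start at 1.0
    and decrease, as for an attenuation curve.
    """
    pairs = list(zip(thicknesses, transmissions))
    for (t0, f0), (t1, f1) in zip(pairs, pairs[1:]):
        if f0 >= 0.5 >= f1:
            return t0 + (f0 - 0.5) * (t1 - t0) / (f0 - f1)
    raise ValueError("transmission never falls to 0.5")

# For a mono-energetic beam with attenuation coefficient mu,
# the exact result is HVL = ln(2) / mu, a useful sanity check.
mu = 0.231                                  # mm^-1, illustrative value
t = [0.5 * i for i in range(20)]            # absorber thicknesses, mm
f = [math.exp(-mu * x) for x in t]          # simulated transmission
```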


We present a site-resolved study of slow (ms to s) motions in a protein in the solid (microcrystalline) state performed with a modified version of the centerband-only detection of exchange (CODEX) NMR experiment. CODEX was originally based on measuring changes in molecular orientation by means of the chemical shift anisotropy (CSA) tensor; in our modification, angular reorientations of internuclear vectors are observed. The experiment was applied to the study of slow 15N-1H motions of the SH3 domain of chicken α-spectrin. The protein was perdeuterated with partial back-exchange of protons at labile sites. This allowed indirect (proton) detection of 15N nuclei and thus a significant enhancement of sensitivity. The diluted proton system also made proton-driven spin diffusion between 15N nuclei negligible; this process interferes with the molecular exchange (motion) and hampers the acquisition of dynamic parameters. The experiment has shown that approximately half of the peaks in the 2D 15N-1H correlation spectrum exhibit exchange to varying extents. The correlation time of the slow motion for most peaks is 1 to 3 s. This is the first NMR study of the internal dynamics of proteins in the solid state on the millisecond-to-second time scale with site-specific spectral resolution that provides both time-scale and geometry information about molecular motions.


This thesis describes the research undertaken for a degree of Master of Science in a retrospective study of airborne remotely sensed data registered in 1990 and 1993, together with field-captured data of aquatic humus concentrations for ~45 lakes in Tasmania. The aim was to investigate and describe the relationship between the remotely sensed data and the field data, and to test the hypothesis that the remotely sensed data would establish further evidence of a limnological corridor of change running north-west to south-east. The airborne remotely sensed data consisted of data captured by the CSIRO Ocean Colour Scanner (OCS) and a newly developed Canadian scanner, the compact airborne spectrographic imager (CASI); the thesis investigates the relationship between these two data sources. The remotely sensed data were collected with the OCS in 1990 (during one day) and with both the OCS and the CASI in 1993 (during three days). The OCS registers data in 9 wavelength bands between 380 nm and 960 nm with a 10-20 nm bandwidth, and the CASI in 288 wavelength bands between 379.57 nm and 893.5 nm (i.e. spectral mode) with a spectral resolution of 2.5 nm. The remotely sensed data were extracted from the original tapes with the help of the CSIRO and supplied software, and digital sample areas (band value means) for each lake were subsequently extracted for data manipulation and statistical analysis. Field data were captured concurrently with the remotely sensed data in 1993 by lake hopping using a light aircraft with floats. The field data used for analysis with the remotely sensed data were the laboratory-determined g440 values from the 1993 water samples, collated with g440 values determined from earlier years. No spectro-radiometric data of the lakes, data on incoming irradiance or ancillary climatic data were captured during the remote sensing missions.
The background chapter provides context for the research, both with regard to remote sensing of water quality and the relationship between remotely sensed spectral data and water quality parameters, and through a description of the Tasmanian lakes flown. The lakes were divided into four groups based on results from previous studies and optical parameters, especially aquatic humus concentrations as measured from field-captured data: the 'green' clear-water lakes mostly situated on the Central Plateau; the 'brown' highly dystrophic lakes in western Tasmania; the 'corridor' lakes situated along a corridor of change lying approximately between the two lines denoting the Jurassic edge and the 1200 mm isohyet; and the 'eastern, turbid' lakes, which make up the fourth group. The analytical part of the research was mostly concerned with manipulating and analysing the CASI data because of its higher spectral resolution, exploring methods of correcting these data to reduce the disturbing effects of varying illumination and atmospheric conditions. Three different methods were attempted. In the first method, two different standardisation formulas were applied to the data, as well as 'day correction' factors calculated from data for one of the lakes, Lake Rolleston, which had data captured on all three days of the remote sensing operations; the standardisation formulas were also applied to the OCS data. In the second method, an attempt to reduce the effects of the atmosphere was made using spectro-radiometric data captured in 1988 for one of the lakes flown, Great Lake. All the lake sample data were time-normalised using general irradiance data obtained from the University of Tasmania, and the sky portion, as calculated from Great Lake upwelling irradiance data, was then subtracted. The last method involved using two different band ratios to eliminate atmospheric effects.
Statistical analysis was applied to the data resulting from the three methods to describe the relationship between the remotely sensed data and the field-captured data. Discriminant analysis, cluster analysis and factor analysis using principal component analysis (PCA) were applied to the remotely sensed data and the field data. The factor scores resulting from the PCA were regressed against the field-collated g440 data, as were the values resulting from the last method. The results from the statistical analysis of the data from the first method show that the lakes group well (100%) against the predetermined groups when discriminant analysis is applied to the remotely sensed CASI data. Most of the variance in the data is contained in the first factor resulting from PCA, regardless of the data manipulation method. Regression of the factor scores against the g440 field data shows a strongly non-linear relationship, and a one-sided linear regression test is therefore considered an inappropriate analysis method for describing the dataset relationships. The research has shown that, with the available data, correction and analysis methods, and within the scope of the Masters study, it was not possible to establish the relationships between the remotely sensed data and the field-measured parameters as hoped. The main reason for this was the failure to retrieve remotely sensed lake signatures adequately corrected for atmospheric noise for comparison with the field data. This in turn results from the lack of the detailed ancillary information needed to apply available established methods for noise reduction: these methods require field spectro-radiometric measurements and environmental information on the varying conditions, both within the study area and within the time frame of capture of the remotely sensed data.
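The band-ratio idea behind the last method can be illustrated generically: a ratio of two simultaneously captured bands cancels any multiplicative illumination or atmospheric factor common to both bands. The band names and values below are invented for illustration, not the thesis's actual band choices:

```python
def band_ratio(sample, num_band, den_band):
    """Ratio of two band means for one lake sample (dict: band -> mean value).

    If each band value is (true signal x common illumination factor),
    the common factor cancels in the ratio.
    """
    return sample[num_band] / sample[den_band]

# Two 'days' differing only by an illumination factor give the same ratio.
day1 = {"b550": 12.0, "b670": 4.0}
day2 = {band: 1.8 * value for band, value in day1.items()}
```

This only removes factors that are truly common to both bands; wavelength-dependent atmospheric effects, which the thesis identifies as the limiting problem, are not cancelled.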


We have introduced an in-situ Raman monitoring technique to investigate the crystallization process inside protein drops. In addition to a conventional vapour-diffusion process, a novel procedure that actively stimulates evaporation from a protein drop during crystallization was also evaluated, with lysozyme as a model protein. In contrast to the conventional vapour-diffusion condition, the evaporation-stimulated growth of crystals was initiated in a simple dehydration scheme and completed within a significantly shorter time. To understand the crystallization behaviours under the conditions with and without such evaporation stimulation, confocal Raman spectroscopy combined with linear regression analysis was used to monitor both lysozyme and HEPES buffer concentrations in real time. The confocal measurements, having high spatial resolution and a good linear response, revealed areas of local inhomogeneity in protein concentration when crystallization started. The acquired concentration profiles indicated that (1) the evaporation-stimulated crystallization proceeded at protein concentrations lower than those under conventional vapour diffusion, and (2) crystals under the evaporation-stimulated condition were noticeable within an early stage of crystallization, before the protein concentration approached its maximum value. The HEPES concentration profiles, on the other hand, increased steadily towards the end of the process regardless of the conditions used for crystallization. In particular, the observed local inhomogeneities specific to the protein distribution suggested an accumulation mechanism of protein molecules that initiates the nucleation of crystals.
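The linear-regression step amounts to ordinary least squares of each measured spectrum onto reference spectra of the pure components. A sketch with synthetic Gaussian bands follows; the real analysis used measured lysozyme and HEPES reference spectra, which are not reproduced here:

```python
import numpy as np

def unmix(measured, references):
    """Estimate component concentrations from one measured spectrum.

    references: (n_components, n_channels) pure-component spectra at unit
    concentration; solves measured ~= concentrations @ references by
    linear least squares.
    """
    coeffs, *_ = np.linalg.lstsq(references.T, measured, rcond=None)
    return coeffs

# Synthetic demonstration: two overlapping Gaussian 'bands'.
x = np.linspace(0.0, 1.0, 200)
ref = np.vstack([np.exp(-((x - 0.4) / 0.1) ** 2),
                 np.exp(-((x - 0.6) / 0.1) ** 2)])
true_conc = np.array([2.0, 0.5])
spectrum = true_conc @ ref
```

Applied frame by frame, this yields the real-time concentration profiles described above.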


A 0.79C-1.5Si-1.98Mn-0.98Cr-0.24Mo-1.06Al-1.58Co (wt%) steel was isothermally heat treated at a bainitic transformation temperature of 350°C for 1 day to form a fully bainitic structure with nano-layers of bainitic ferrite and retained austenite, while a 0.26C-1.96Si-2Mn-0.31Mo (wt%) steel was subjected to successive isothermal heat treatments at 700°C for 300 min followed by 350°C for 120 min to form a hybrid microstructure consisting of ductile ferrite and fine-scale bainite. The dislocation density and morphology of the bainitic ferrite, and retained austenite characteristics such as size and volume fraction, were studied using transmission electron microscopy. Bainitic ferrite was found to have a high dislocation density in both steels, while the retained austenite characteristics and bainite morphology were affected by the composition of the steels. Atom probe tomography (APT) has the high spatial resolution required for accurate determination of the carbon content of the bainitic ferrite and retained austenite, the solute distribution between these phases, and the local composition of fine clusters and particles, providing detailed insight into the bainite transformation of the steels. The carbon content of the bainitic ferrite in both steels was found to be higher than the para-equilibrium level of carbon in ferrite. APT also revealed the presence of fine C-rich clusters and Fe-C carbides in the bainitic ferrite of both steels.


Scanning white-beam X-ray microdiffraction has been used to study heterogeneous grain deformation in a polycrystalline Mg alloy (MgAZ31). The high spatial resolution achieved on beamline 7.3.3 at the Advanced Light Source provides a unique method for measuring the elastic strain and orientation of single grains as a function of applied load. To carry out in-situ measurements, a lightweight (~0.5 kg) tensile stage, capable of providing uniaxial loads of up to 600 kg, was designed to collect diffraction data over the loading and unloading cycle. In-situ observation of the deformation process provides insight into the crystallographic deformation modes of twinning and dislocation slip.


Recently, effective connectivity studies have gained significant attention in the neuroscience community, as Electroencephalography (EEG) data with high time resolution can give us a wider understanding of the information flow within the brain. Among the tools used in effective connectivity analysis, Granger Causality (GC) has found a prominent place. GC analysis based on strictly causal multivariate autoregressive (MVAR) models does not account for instantaneous interactions among the sources; if such interactions are present, GC based on a strictly causal MVAR model will lead to erroneous conclusions about the underlying information flow. The work presented in this paper therefore applies an extended MVAR (eMVAR) model that accounts for zero-lag interactions. We propose a constrained adaptive Kalman filter (CAKF) approach for eMVAR model identification and demonstrate that it performs better than short-time-windowing-based adaptive estimation when applied to information flow analysis.
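Pairwise GC itself reduces to comparing the prediction errors of restricted and full autoregressive models. A minimal lag-1 OLS sketch follows; this is not the constrained Kalman filter proposed in the paper, and it deliberately ignores the instantaneous interactions the paper addresses:

```python
import numpy as np

def granger_causality(x, y, lag=1):
    """Log-ratio of residual variances: does x help predict y?

    Restricted model: y[t] ~ y[t-lag]; full model: y[t] ~ y[t-lag], x[t-lag].
    Values well above zero indicate that x Granger-causes y.
    """
    yt, yl, xl = y[lag:], y[:-lag], x[:-lag]
    A_r = np.column_stack([np.ones_like(yl), yl])        # restricted design
    A_f = np.column_stack([np.ones_like(yl), yl, xl])    # full design
    res_r = yt - A_r @ np.linalg.lstsq(A_r, yt, rcond=None)[0]
    res_f = yt - A_f @ np.linalg.lstsq(A_f, yt, rcond=None)[0]
    return np.log(res_r.var() / res_f.var())
```

On simulated data where x drives y with one sample of delay, the x-to-y score is large while the y-to-x score stays near zero, which is the asymmetry GC exploits.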


Noninvasive brain imaging modalities provide an extraordinary means of monitoring the working brain. Among these modalities, Electroencephalography (EEG) is the most widely used technique for measuring brain signals under different tasks, owing to its mobility, low cost, and high temporal resolution. In this paper we investigate the use of EEG signals in brain-computer interface (BCI) systems.


In 1948 the geographer C. W. Thornthwaite proposed a moisture index, the Thornthwaite Moisture Index (TMI), as part of a water balance model for a new climate classification system. The importance of TMI climatic classification has been recognised in many areas of knowledge and practice worldwide over the last 60 years. However, although past climate research focused on developing adequate methods for climate classification, current research is more concerned with understanding the patterns of climate change, and the use of TMI as an indicator of climate change is still an incipient area of research. The contributions of this paper are twofold. The first is to fully document a geostatistical methodology for producing a time series of TMI maps that are accurate and have high spatial resolution; the state of Victoria, Australia, over the last century is used as the case study. The second, based on an analysis of these maps, is a general evaluation of the spatial patterns of moisture variability found in Victoria across space and over time. Some potential implications of the verified moisture changes are discussed, and a number of ideas for further development are suggested. © 2014 Institute of Australian Geographers.
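For reference, the commonly cited forms of the index are sketched below from their standard definitions (the abstract does not state which variant the paper adopts): s is the annual water surplus, d the annual deficit, and PE the potential evapotranspiration, all in the same units (e.g. mm per year).

```python
def tmi_1948(surplus, deficit, pe):
    """Original Thornthwaite (1948) moisture index.

    Humidity index minus 0.6 x aridity index: (100*s - 60*d) / PE.
    Positive in humid climates, negative in arid ones.
    """
    return (100.0 * surplus - 60.0 * deficit) / pe

def tmi_1955(surplus, deficit, pe):
    """Thornthwaite and Mather (1955) revision: deficit weighted equally."""
    return 100.0 * (surplus - deficit) / pe
```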


We present an experimental comparison of several through-space Hetero-nuclear Multiple-Quantum Correlation experiments, which allow the indirect observation of homo-nuclear single-quantum (SQ) or double-quantum (DQ) 14N coherences via spy 1H nuclei. These 1H-{14N} D-HMQC sequences differ not only in the order of the 14N coherences evolving during the indirect evolution, t1, but also in the radio-frequency (rf) scheme used to excite and reconvert these coherences under Magic-Angle Spinning (MAS). Here, the SQ coherences are created by the application of center-band frequency-selective pulses, i.e. long, low-power rectangular pulses at the 14N Larmor frequency, ν0(14N), whereas the DQ coherences are excited and reconverted using rf irradiation either at ν0(14N) or at the 14N overtone frequency, 2ν0(14N). The overtone excitation is achieved either by constant-frequency rectangular pulses or by frequency-swept pulses, specifically Wide-band, Uniform-Rate, Smooth-Truncation (WURST) pulse shapes. The present article compares the performance of four different 1H-{14N} D-HMQC sequences: those with 14N rectangular pulses at ν0(14N) for the indirect detection of homo-nuclear (i) 14N SQ or (ii) DQ coherences, as well as their overtone variants using (iii) rectangular or (iv) WURST pulses. The compared properties include (i) the sensitivity, (ii) the spectral resolution in the 14N dimension, (iii) the rf requirements (power and pulse length), and the robustness to (iv) rf offset and (v) MAS frequency instabilities. These experimental comparisons are carried out for γ-glycine and L-histidine·HCl monohydrate, which contain 14N sites subject to moderate quadrupole interactions. We demonstrate that the optimum choice of 1H-{14N} D-HMQC method depends on the experimental goal. When sensitivity and/or robustness to offset are the major concerns, the D-HMQC sequence allowing the indirect detection of 14N SQ coherences should be employed. Conversely, when the highest resolution and/or an adjusted indirect spectral width are needed, the overtone experiments are the method of choice. The overtone scheme using WURST pulses gives broader excitation bandwidths than that using rectangular pulses, at the expense of reduced sensitivity. Numerically exact simulations also show that the sensitivity of the overtone 1H-{14N} D-HMQC experiment increases for larger quadrupole interactions.


Two-dimensional (2D) materials usually have a layer-dependent work function, which requires fast and accurate detection for the evaluation of their device performance. A detection technique with both high throughput and high spatial resolution has not yet been explored. Using a scanning electron microscope, we have developed and implemented a quantitative analytical technique that allows effective extraction of the work function of graphene. This technique uses the secondary electron contrast and has nanometre-resolved layer information. Measurements of few-layer graphene flakes resolve the variation of the work function between graphene layers with a precision of better than 10 meV. We expect this technique to prove extremely useful for researchers in a broad range of fields due to its revolutionary throughput and accuracy.


Jakarta is vulnerable to flooding caused mainly by prolonged and heavy rainfall, so robust hydrological modelling is called for, and good-quality spatial precipitation data are needed to achieve a good hydrological model. Two types of rainfall source are available: satellite and gauge station observations. At-site (gauge) rainfall is considered a reliable and accurate source, but the limited number of stations makes spatial interpolation unappealing. Gridded satellite rainfall, on the other hand, now has high spatial resolution and improved accuracy, but remains relatively less accurate than gauge observations. To achieve a better precipitation data set, this study proposes the cokriging method, a blending algorithm, to yield a blended satellite-gauge gridded rainfall product at approximately 10 km resolution. The Global Satellite Mapping of Precipitation product (GSMaP, 0.1°×0.1°) and daily rainfall observations from gauge stations are used. The blended product is compared with the satellite data by cross-validation and is then used to re-calibrate the hydrological model. Several scenarios are simulated by hydrological models calibrated with gauge observations alone and with the blended product, and the performance of the two calibrated models is assessed and compared on the basis of simulated and observed runoff.
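The cross-validation step (scoring a gridded product against held-out gauge observations) can be sketched independently of the cokriging itself; the nearest-cell matching and the RMSE metric below are illustrative choices, not necessarily those of the study:

```python
import math

def rmse_vs_gauges(grid, gauges, resolution=0.1):
    """Root-mean-square error of a gridded rainfall product at gauge locations.

    grid:   dict mapping (lat_index, lon_index) -> rainfall (mm/day)
    gauges: list of (lat, lon, observed) tuples; each gauge is matched to
            the grid cell containing it (0.1-degree cells, as for GSMaP).
    """
    errors = []
    for lat, lon, obs in gauges:
        cell = (int(lat // resolution), int(lon // resolution))
        errors.append(grid[cell] - obs)
    return math.sqrt(sum(e * e for e in errors) / len(errors))
```

In a leave-one-out setting, the blended field would be re-estimated without each gauge before scoring it, so that the held-out observation is truly independent.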