855 results for synchrotron-based techniques


Relevance:

30.00%

Publisher:

Abstract:

A chiral bisurea-based superhydrogelator that is capable of forming supramolecular hydrogels at concentrations as low as 0.2 mM is reported. This soft material has been characterized by thermal studies, rheology, X-ray diffraction analysis, transmission electron microscopy (TEM), and various spectroscopic techniques (electronic and vibrational circular dichroism, FTIR, and Raman spectroscopy). The expression of chirality on the molecular and supramolecular levels has been studied, and a clear amplification of its chirality into the achiral analogue has been observed. Furthermore, thermal analysis showed that the hydrogelation of compound 1 has a high response to temperature, which corresponds to an enthalpy-driven self-assembly process. These particular thermal characteristics make these materials easy to handle for soft-application technologies.


The UK has a target for an 80% reduction in CO2 emissions by 2050 from a 1990 base. Domestic energy use accounts for around 30% of total emissions. This paper presents a comprehensive review of existing models and modelling techniques and indicates how they might be improved by considering individual buying behaviour. Macro (top-down) and micro (bottom-up) models have been reviewed and analysed. It is found that bottom-up models can project technology diffusion due to their higher resolution. The weakness of existing bottom-up models at capturing individual green technology buying behaviour has been identified. Consequently, Markov chains, neural networks and agent-based modelling are proposed as possible methods to incorporate buying behaviour within a domestic energy forecast model. Among the three methods, agent-based models are found to be the most promising, although a successful agent approach requires large amounts of input data. A prototype agent-based model has been developed and tested, which demonstrates the feasibility of an agent approach. This model shows that an agent-based approach is promising as a means to predict the effectiveness of various policy measures.
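The agent-based approach the abstract proposes can be sketched minimally as follows. This is an illustrative toy, not the paper's prototype model: the threshold rule, the `influence` parameter, and all numbers are invented here to show the mechanism of agents adopting a green technology under social influence.

```python
import random

def simulate_adoption(n_agents=100, steps=20, influence=0.1, seed=42):
    """Toy agent-based sketch of green-technology adoption.

    Each agent holds a random adoption threshold; an agent adopts when
    the current adoption fraction, plus a small random 'nudge' scaled by
    `influence`, reaches its threshold. Adoption is irreversible.
    """
    rng = random.Random(seed)
    thresholds = [rng.random() for _ in range(n_agents)]
    adopted = [False] * n_agents
    history = []
    for _ in range(steps):
        frac = sum(adopted) / n_agents   # social signal seen by all agents
        for i, th in enumerate(thresholds):
            if not adopted[i] and frac + influence * rng.random() >= th:
                adopted[i] = True
        history.append(sum(adopted) / n_agents)
    return history

# Adoption fraction per time step, non-decreasing by construction.
curve = simulate_adoption()
```

Varying `influence` stands in for the policy measures the abstract mentions; richer models would give agents heterogeneous attributes and require the large input datasets the authors note.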


The All-Weather Volcano Topography Imaging Sensor remote sensing instrument is a custom-built millimeter-wave (MMW) sensor that has been developed as a practical field tool for remote sensing of volcanic terrain at active lava domes. The portable instrument combines active and passive MMW measurements to record topographic and thermal data in almost all weather conditions from ground-based survey points. We describe how the instrument is deployed in the field, the quality of the primary ranging and radiometric measurements, and the postprocessing techniques used to derive the geophysical products of the target terrain, surface temperature, and reflectivity. By comparison of changing topography, we estimate the volume change and the lava extrusion rate. Validation of the MMW radiometry is also presented by quantitative comparison with coincident infrared thermal imagery.
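The volume-change estimate from repeated topographic surveys can be illustrated with generic DEM differencing; this is a sketch of the idea only, not the instrument's actual postprocessing chain, and the grids and cell area are invented.

```python
def volume_change(dem_before, dem_after, cell_area):
    """Volume change between two gridded topography surveys (DEMs):
    the sum of per-cell elevation differences times the cell area.
    Dividing the result by the time between surveys would give an
    average extrusion rate."""
    total = 0.0
    for row_before, row_after in zip(dem_before, dem_after):
        for z_before, z_after in zip(row_before, row_after):
            total += (z_after - z_before) * cell_area
    return total
```

A uniform 1 m rise over a 2x2 grid of 2 m² cells, for example, yields 8 m³ of new material.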


Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers, author-selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method’s authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym-based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the Synonym approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters’ News Corpus. The findings show that authors of Reuters’ news articles provide good keyphrases, but that, more often than not, they do not provide any keyphrases at all.
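The two best-performing methods, Term Frequency and Inverse Document Frequency, are often combined into a single TF-IDF score; the paper evaluates them separately, so the combined sketch below is illustrative only, with a naive whitespace tokeniser and single-word "phrases".

```python
import math
from collections import Counter

def tfidf_keyphrases(docs, doc_index, top_n=3):
    """Rank candidate keyphrases of one document by TF-IDF.

    TF is the term's count in the target document; IDF is
    log(N / document frequency), so terms appearing in every
    document score zero.
    """
    tokenised = [d.lower().split() for d in docs]
    df = Counter()
    for tokens in tokenised:
        df.update(set(tokens))        # document frequency per term
    n_docs = len(docs)
    tf = Counter(tokenised[doc_index])
    scores = {t: tf[t] * math.log(n_docs / df[t]) for t in tf}
    return [t for t, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]]
```

On a three-document toy corpus, a term unique to one document outranks terms shared across all of them, which is the behaviour that makes TF-IDF a useful keyphrase baseline.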


Ensemble-based data assimilation is rapidly proving itself as a computationally efficient and skilful assimilation method for numerical weather prediction, which can provide a viable alternative to more established variational assimilation techniques. However, a fundamental shortcoming of ensemble techniques is that the resulting analysis increments can only span a limited subspace of the state space, whose dimension is less than the ensemble size. This limits the amount of observational information that can effectively constrain the analysis. In this paper, a data selection strategy that aims to assimilate only the observational components that matter most and that can be used with both stochastic and deterministic ensemble filters is presented. This avoids unnecessary computations, reduces round-off errors and minimizes the risk of importing observation bias in the analysis. When an ensemble-based assimilation technique is used to assimilate high-density observations, the data-selection procedure allows the use of larger localization domains that may lead to a more balanced analysis. Results from the use of this data selection technique with a two-dimensional linear and a nonlinear advection model using both in situ and remote sounding observations are discussed.
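The data-selection idea can be sketched with a serial stochastic ensemble filter on a scalar state. The selection criterion used here (skip an observation when its expected relative variance reduction is below a threshold) is a hypothetical stand-in for the paper's subspace-based strategy, and all numbers are invented.

```python
import random

def serial_enkf(ensemble, obs, obs_var, impact_min=0.05, seed=0):
    """Serial stochastic ensemble Kalman update with data selection.

    Each scalar observation y with error variance r is assimilated
    only if the gain pb/(pb+r) (the expected relative reduction of the
    background variance pb) exceeds `impact_min`; otherwise it is
    skipped, saving computation as in the paper's selection idea.
    """
    rng = random.Random(seed)
    members = list(ensemble)
    n = len(members)
    for y, r in zip(obs, obs_var):
        mean = sum(members) / n
        pb = sum((m - mean) ** 2 for m in members) / (n - 1)
        gain = pb / (pb + r)
        if gain < impact_min:
            continue                      # observation barely constrains the analysis
        sd = r ** 0.5
        # Stochastic filter: each member sees a perturbed observation.
        members = [m + gain * (y + rng.gauss(0.0, sd) - m) for m in members]
    return members
```

An accurate observation pulls the analysis mean towards it, while an observation with huge error variance is skipped entirely and leaves the ensemble untouched.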


In order to make best use of the opportunities provided by space missions such as the Radiation Belt Storm Probes, we determine the response of complementary subionospheric radiowave propagation measurements (VLF), riometer absorption measurements (CNA), and GPS-produced total electron content (vTEC) to different types of energetic electron precipitation (EEP). We model the relative sensitivity and responses of these instruments to idealised monoenergetic beams of precipitating electrons, and to more realistic EEP spectra chosen to represent radiation belt and substorm precipitation. In the monoenergetic beam case, we find riometers are more sensitive to the same EEP event occurring during the day than during the night, while subionospheric VLF shows the opposite relationship, and the change in vTEC is independent of the time of day. In general, the subionospheric VLF measurements are much more sensitive than the other two techniques for EEP over 200 keV, responding to flux magnitudes two to three orders of magnitude smaller than those detectable by a riometer. Detectable TEC changes only occur for extreme monoenergetic fluxes. For the radiation belt EEP case, clearly detectable subionospheric VLF responses are produced by daytime fluxes that are ~10 times lower than required for riometers, while nighttime fluxes can be 10,000 times lower. Riometers are likely to respond only to radiation belt fluxes during the largest EEP events, and vTEC is unlikely to be significantly disturbed by radiation belt EEP. For the substorm EEP case both the riometer absorption and the subionospheric VLF technique respond significantly, as does the change in vTEC, which is likely to be detectable at ~3-4 TECu.


An ongoing controversy in Amazonian palaeoecology is the manner in which Amazonian rainforest communities have responded to environmental change over the last glacial–interglacial cycle. Much of this controversy results from an inability to identify the floristic heterogeneity exhibited by rainforest communities within fossil pollen records. We apply multivariate (Principal Components Analysis) and classification (Unweighted Pair Group with Arithmetic Mean Agglomerative Classification) techniques to floral-biometric, modern pollen trap and lake sediment pollen data situated within different rainforest communities in the tropical lowlands of Amazonian Bolivia. Modern pollen rain analyses from artificial pollen traps show that evergreen terra firme (well-drained), evergreen terra firme liana, evergreen seasonally inundated, and evergreen riparian rainforests may be readily differentiated, floristically and palynologically. Analogue matching techniques, based on Euclidean distance measures, are employed to compare these pollen signatures with surface sediment pollen assemblages from five lakes: Laguna Bella Vista, Laguna Chaplin, and Laguna Huachi situated within the Madeira-Tapajós moist forest ecoregion, and Laguna Isirere and Laguna Loma Suarez, which are situated within forest patches in the Beni savanna ecoregion. The same numerical techniques are used to compare rainforest pollen trap signatures with the fossil pollen record of Laguna Chaplin.
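The Euclidean-distance analogue matching the abstract describes reduces to finding the modern pollen-trap signature closest to a given assemblage. The sketch below shows only that core step; the taxa, percentages, and signature names are invented, not taken from the Bolivian dataset.

```python
def nearest_analogue(sample, trap_signatures):
    """Match a pollen assemblage (a list of taxon percentages) to the
    closest modern pollen-trap signature by Euclidean distance.

    `trap_signatures` maps a rainforest-community name to its mean
    pollen-trap assemblage, given in the same taxon order as `sample`.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(trap_signatures, key=lambda name: dist(sample, trap_signatures[name]))
```

In practice a dissimilarity threshold would also be applied, so that fossil samples without any acceptable modern analogue are flagged rather than force-matched.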


Sea surface temperature (SST) can be estimated from day and night observations of the Spinning Enhanced Visible and Infra-Red Imager (SEVIRI) by optimal estimation (OE). We show that exploiting the 8.7 μm channel, in addition to the “traditional” wavelengths of 10.8 and 12.0 μm, improves OE SST retrieval statistics in validation. However, the main benefit is an improvement in the sensitivity of the SST estimate to variability in true SST. In a fair, single-pixel comparison, the 3-channel OE gives better results than the SST estimation technique presently operational within the Ocean and Sea Ice Satellite Application Facility. The operational technique applies SST retrieval coefficients followed by a bias-correction step informed by radiative transfer simulation. However, the operational technique has an additional “atmospheric correction smoothing”, which improves its noise performance and hitherto had no analogue within the OE framework. Here, we propose an analogue to atmospheric correction smoothing, based on the expectation that atmospheric total column water vapour has a longer spatial correlation length scale than SST features. The approach extends the observations input to the OE to include the averaged brightness temperatures (BTs) of nearby clear-sky pixels, in addition to the BTs of the pixel for which SST is being retrieved. The retrieved quantities are then the single-pixel SST and the clear-sky total column water vapour averaged over the vicinity of the pixel. This reduces the noise in the retrieved SST significantly. The robust standard deviation of the new OE SST compared to matched drifting buoys becomes 0.39 K for all data. The smoothed OE gives SST sensitivity of 98% on average. This means that diurnal temperature variability and ocean frontal gradients are more faithfully estimated, and that the influence of the prior SST used is minimal (2%). This benefit is not available using traditional atmospheric correction smoothing.
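The OE retrieval and its sensitivity can be illustrated in the simplest single-channel, linearised case. This follows the standard optimal-estimation form (gain and averaging kernel); the channel, Jacobian, and variance values below are invented for illustration and are not SEVIRI numbers.

```python
def oe_scalar_retrieval(y, x_a, s_a, k, s_e, f_xa):
    """Single-channel linearised optimal estimation.

    y    : observed brightness temperature
    x_a  : prior SST, with prior variance s_a
    k    : Jacobian dBT/dSST, with observation-error variance s_e
    f_xa : brightness temperature simulated for the prior state

    Gain G = s_a*k / (k^2*s_a + s_e); the averaging kernel G*k is the
    'sensitivity' the abstract quotes (fraction of true SST variability
    recovered; 1 means the prior has no influence).
    """
    gain = s_a * k / (k * k * s_a + s_e)
    x_hat = x_a + gain * (y - f_xa)
    sensitivity = gain * k
    return x_hat, sensitivity
```

With a small observation-error variance the sensitivity approaches 1, mirroring the abstract's point that the smoothed OE's 98% sensitivity leaves the prior only ~2% influence.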


Problem-Based Learning, despite recent controversies about its effectiveness, is used extensively as a teaching method throughout higher education. In meteorology, there has been little attempt to incorporate Problem-Based Learning techniques into the curriculum. Motivated by a desire to enhance the reflective engagement of students within a current field course module, this project describes the implementation of two trial Problem-Based Learning activities, and their testing and improvement using several different and complementary means of evaluation. By the end of a 2-year program of design, implementation, testing, reflection, and re-evaluation, two robust, engaging activities have been developed that provide an enhanced and diverse learning environment in the field course. The results suggest that Problem-Based Learning techniques would be a useful addition to the meteorology curriculum, and suggestions for courses and activities that may benefit from this approach are included in the conclusions.


Cities, which are now inhabited by a majority of the world's population, are not only an important source of global environmental and resource depletion problems, but can also act as important centres of technological innovation and social learning in the continuing quest for a low carbon future. Planning and managing large-scale transitions in cities to deal with these pressures require an understanding of urban retrofitting at city scale. In this context performative techniques (such as backcasting and roadmapping) can provide valuable tools for helping cities develop a strategic view of the future. However, it is also important to identify ‘disruptive’ and ‘sustaining’ technologies which may contribute to city-based sustainability transitions. This paper presents research findings from the EPSRC Retrofit 2050 project, and explores the relationship between technology roadmaps and transition theory literature, highlighting the research gaps at urban/city level. The paper develops a research methodology to describe the development of three guiding visions for city-regional retrofit futures, and identifies key sustaining and disruptive technologies at city scale within these visions using foresight (horizon scanning) techniques. The implications of the research for city-based transition studies and related methodologies are discussed.


We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data operations. The modified algorithm runs more than 50 times faster on the CELL’s Synergistic Processing Elements than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times as compared to the original code on the main CPU. Because the radiation code takes more than 60% of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
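The task-queue/thread-pool scheduling described above can be sketched in Python. The real radiation code is Fortran translated to C with SIMD packing of four columns; here the per-column computation is only a placeholder, so this shows the scheduling pattern, not the physics or the SIMD step.

```python
import queue
import threading

def process_columns(columns, n_threads=4):
    """Schedule independent air-column computations on a thread pool.

    All columns are placed on a shared task queue; worker threads pull
    individual columns until the queue is empty, mirroring the paper's
    replacement of a fixed domain decomposition with dynamic scheduling.
    """
    tasks = queue.Queue()
    results = {}
    lock = threading.Lock()
    for idx, col in enumerate(columns):
        tasks.put((idx, col))

    def worker():
        while True:
            try:
                idx, col = tasks.get_nowait()
            except queue.Empty:
                return                      # no work left for this thread
            value = sum(col)                # placeholder for the column computation
            with lock:
                results[idx] = value

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return [results[i] for i in range(len(columns))]
```

Because columns are independent, results are bit-wise identical regardless of which thread computes which column, which is the property the authors rely on to validate their optimised code.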


Techniques are proposed for evaluating forecast probabilities of events. The tools are especially useful when, as in the case of the Survey of Professional Forecasters (SPF) expected probability distributions of inflation, recourse cannot be made to the method of construction in the evaluation of the forecasts. The tests of efficiency and conditional efficiency are applied to the forecast probabilities of events of interest derived from the SPF distributions, and supplement a whole-density evaluation of the SPF distributions based on the probability integral transform approach.
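The probability integral transform (PIT) underlying the whole-density evaluation can be computed as follows. This sketch assumes Gaussian forecast densities for simplicity; the SPF's histogram-style distributions would need their own cumulative distribution function in place of the normal CDF.

```python
import math

def pit_values(outcomes, means, sds):
    """Probability integral transform of realised outcomes under
    Gaussian forecast densities: each PIT is the forecast CDF
    evaluated at the outcome. Well-calibrated forecasts yield
    approximately uniform PITs on [0, 1]."""
    def norm_cdf(x, mu, sd):
        return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))
    return [norm_cdf(y, m, s) for y, m, s in zip(outcomes, means, sds)]
```

Departures from uniformity in the PIT histogram (e.g. clustering near 0 and 1) then diagnose over-confident or biased forecast densities.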


A new synthetic tripeptide-based hydrogel that forms at physiological pH and temperature has been discovered. This hydrogel has been thoroughly characterized using different techniques, including field emission scanning electron microscopic (FESEM) and high-resolution transmission electron microscopic (HR-TEM) imaging, small- and wide-angle X-ray diffraction analyses, FT-IR, circular dichroism, and rheometric analyses. Moreover, this gel exhibits thixotropy and injectability. This hydrogel has been used for entrapment and sustained release of the antibiotic vancomycin and vitamin B12 at physiological pH and temperature for about 2 days. Interestingly, an MTT assay shows almost 100% cell viability for this peptide gelator, indicating its noncytotoxicity.


We present five new cloud detection algorithms over land based on dynamic threshold or Bayesian techniques, applicable to the Advanced Along Track Scanning Radiometer (AATSR) instrument, and compare these with the standard threshold-based SADIST cloud detection scheme. We use a manually classified dataset as a reference to assess algorithm performance and quantify the impact of each cloud detection scheme on land surface temperature (LST) retrieval. The use of probabilistic Bayesian cloud detection methods improves algorithm true skill scores by 8-9 % over SADIST (maximum score of 77.93 % compared to 69.27 %). We present an assessment of the impact of imperfect cloud masking, in relation to the reference cloud mask, on the retrieved AATSR LST, imposing a 2 K tolerance over a 3x3 pixel domain. We find an increase of 5-7 % in the observations falling within this tolerance when using Bayesian methods (maximum of 92.02 % compared to 85.69 %). We also demonstrate that the use of dynamic thresholds in the tests employed by SADIST can significantly improve performance, applicable to cloud-test data provided by the Sea and Land Surface Temperature Radiometer (SLSTR) due to be launched on the Sentinel-3 mission (estimated 2014).
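The true skill score used to compare the cloud masks can be computed from a 2x2 contingency table. The sketch below uses the common definition (hit rate minus false-alarm rate, also called the true skill statistic); the masks shown are invented examples, not AATSR data.

```python
def true_skill_score(predicted, reference):
    """True skill statistic for a binary cloud mask against a
    reference mask: hit rate minus false-alarm rate. 1 is a perfect
    mask; 0 is no skill relative to chance."""
    hits = sum(1 for p, r in zip(predicted, reference) if p and r)
    misses = sum(1 for p, r in zip(predicted, reference) if not p and r)
    false_alarms = sum(1 for p, r in zip(predicted, reference) if p and not r)
    correct_negatives = sum(1 for p, r in zip(predicted, reference) if not p and not r)
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate - false_alarm_rate
```

Unlike plain accuracy, this score is not inflated when one class (clear or cloudy) dominates the scene, which is why it suits cloud-mask comparison.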


Automatic generation of classification rules has been an increasingly popular technique in commercial applications such as Big Data analytics, rule-based expert systems and decision making systems. However, a principal problem that arises with most methods for generation of classification rules is the overfitting of training data. When Big Data is dealt with, this may result in the generation of a large number of complex rules. This may not only increase computational cost but also lower the accuracy in predicting further unseen instances. This has led to the necessity of developing pruning methods for the simplification of rules. In addition, classification rules are used further to make predictions after the completion of their generation. Where efficiency is concerned, it is desirable to find the first rule that fires as quickly as possible when searching through a rule set. Thus a suitable structure is required to represent the rule set effectively. In this chapter, the authors introduce a unified framework for construction of rule-based classification systems consisting of three operations on Big Data: rule generation, rule simplification and rule representation. The authors also review some existing methods and techniques used for each of the three operations and highlight their limitations. They introduce some novel methods and techniques that they have developed recently. These methods and techniques are also discussed in comparison to existing ones with respect to efficient processing of Big Data.
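The prediction step described above (finding the first rule that fires) can be sketched with a simple linear scan. The conditions-as-dict representation and the example rules are simplified stand-ins, not the authors' rule structure, which is designed precisely to make this lookup more efficient than a linear scan.

```python
def first_firing_rule(rules, instance):
    """Return the label of the first rule that fires on an instance.

    Each rule is a (conditions, label) pair, where `conditions` maps an
    attribute name to its required value; a rule fires when every
    condition matches. An empty condition set acts as a default rule.
    """
    for conditions, label in rules:
        if all(instance.get(attr) == value for attr, value in conditions.items()):
            return label
    return None
```

Since the scan stops at the first match, ordering rules by specificity or frequency of firing (or indexing them by attribute, as a more suitable representation would) directly reduces prediction cost on large rule sets.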