116 results for Chromatographic techniques


Relevance:

20.00%

Publisher:

Abstract:

Grassland restoration is the dominant activity funded by agri-environment schemes (AES). However, the reinstatement of biodiversity and ecosystem services is limited by a number of severe abiotic and biotic constraints resulting from previous agricultural management. These appear to be less severe on ex-arable sites compared with permanent grassland. We report findings of a large research programme into practical solutions to these constraints. The key abiotic constraint was high residual soil fertility, particularly phosphorus. This can most easily be addressed by targeting sites of low nutrient status. The chief biotic constraints were a lack of propagules of desirable species and of suitable sites for their establishment. Addition of seed mixtures or green hay to gaps created by either mechanical disturbance or herbicide was the most effective means of overcoming these factors. Finally, manipulation of biotic interactions, including hemiparasitic plants to reduce competition from grasses and control of mollusc herbivory of sown species, significantly improved the effectiveness of these techniques.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents the findings of our study of peer-reviewed papers published at the International Conference on Persuasive Technology from 2006 to 2010. The study indicated that of the 44 systems reviewed, 23 were reported to be successful, 2 to be unsuccessful, and 19 did not specify whether or not they were successful. 56 different techniques were mentioned, and it was observed that most designers use ad hoc definitions for the techniques or methods used in design. Hence we propose the need for research to establish unambiguous definitions of techniques and methods in the field.

Relevance:

20.00%

Publisher:

Abstract:

Advances in hardware and software technology enable us to collect, store and distribute large quantities of data on a very large scale. Automatically discovering and extracting hidden knowledge in the form of patterns from these large data volumes is known as data mining. Data mining technology is not only a part of business intelligence, but is also used in many other application areas such as research, marketing and financial analytics. For example, medical scientists can use patterns extracted from historic patient data to determine whether a new patient is likely to respond positively to a particular treatment; marketing analysts can use patterns extracted from customer data for future advertisement campaigns; finance experts have an interest in patterns that forecast the development of certain stock market shares for investment recommendations. However, extracting knowledge in the form of patterns from massive data volumes imposes a number of computational challenges in terms of processing time, memory, bandwidth and power consumption. These challenges have led to the development of parallel and distributed data analysis approaches and the utilisation of Grid and Cloud computing. This chapter gives an overview of parallel and distributed computing approaches and how they can be used to scale up data mining to large datasets.
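The partition-and-merge pattern the chapter describes can be sketched as follows. This is a minimal illustration, not the chapter's actual implementation: data is split into chunks, each chunk is mined independently (here, simple item counting), and the partial results are merged. A thread pool stands in for the distributed workers.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def mine_partition(transactions):
    """Local step: count item occurrences within one data partition."""
    counts = Counter()
    for t in transactions:
        counts.update(set(t))
    return counts

def parallel_frequent_items(transactions, min_support, workers=4):
    """Partition the data, mine each chunk in parallel, then merge the counts."""
    n = max(1, len(transactions) // workers)
    chunks = [transactions[i:i + n] for i in range(0, len(transactions), n)]
    total = Counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for partial in pool.map(mine_partition, chunks):
            total.update(partial)
    return {item for item, c in total.items() if c >= min_support}
```

The merge step works because item counts are additive across partitions; more sophisticated pattern-mining tasks need a second pass to reconcile locally frequent candidates.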

Relevance:

20.00%

Publisher:

Abstract:

Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers author-selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method’s authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym-based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the Synonym-based approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym-based method was compared to the original author keyphrases of the Reuters’ News Corpus. The findings show that authors of Reuters’ news articles provide good keyphrases, but that more often than not they do not provide any keyphrases.
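The two best-performing methods above, Term Frequency and Inverse Document Frequency, can be sketched in a few lines. This is a generic illustration of the scoring idea, not the paper's evaluated implementation (the C-Value and NC-Value methods, which use nested-term statistics, are omitted):

```python
import math
from collections import Counter

def tf_scores(doc_tokens):
    """Term Frequency: rank candidate terms by raw frequency in the document."""
    return Counter(doc_tokens)

def tfidf_scores(doc_tokens, corpus):
    """TF-IDF: down-weight terms that also appear in many other documents.

    `corpus` is a list of sets, one set of terms per document."""
    tf = Counter(doc_tokens)
    n_docs = len(corpus)
    scores = {}
    for term, f in tf.items():
        df = sum(1 for d in corpus if term in d)       # document frequency
        scores[term] = f * math.log(n_docs / (1 + df))  # smoothed IDF
    return scores
```

A term frequent in the target document but rare in the corpus scores highest, which is why IDF-style scores tend to surface topical keyphrases rather than common vocabulary.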

Relevance:

20.00%

Publisher:

Abstract:

Khartoum, like many cities in least developed countries (LDCs), still witnesses a huge influx of people. Accommodating the newcomers leads to encroachment on cultivated land, which in turn drives the sprawling expansion of Greater Khartoum. The city expanded in diameter from 16.8 km in 1955 to 802.5 km in 1998. Most of this horizontal expansion was residential. In 2008 Khartoum accommodated 29% of the urban population of Sudan. Today Khartoum is considered one of 43 major African cities that accommodate more than 1 million inhabitants. Most newcomers live in the outskirts of the city, e.g. the Dar El-Salam and Mayo neighbourhoods. The majority of them build their houses, especially the walls, from mud, wood, straw and sacks. Selection of building materials usually depends on price, regardless of environmental impact, quality, thermal performance and the life of the material. Most of the time this increases costs and produces a range of environmental impacts over the life of the building. Therefore, consideration of environmental, social and economic impacts is crucial in the selection of any building material; decreasing such impacts could lead to more sustainable housing. The sustainability of the wall building materials available for low-cost housing in Khartoum is compared using the life cycle assessment (LCA) technique. The purpose of this paper is to compare the most readily available local wall building materials for the urban poor of Khartoum from a sustainability point of view, covering the manufacture of the materials, their use, and their disposal once their life comes to an end. Findings reveal that traditional red brick cannot be considered a sustainable wall building material to shape the future of low-cost housing in Greater Khartoum. On the other hand, the results of the comparison draw attention to the wide range of soil techniques and to their potential as a promising sustainable wall material for urban low-cost housing in Khartoum.
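The cradle-to-grave comparison described above reduces, at its simplest, to summing a material's impacts across its life-cycle stages and ranking the totals. The sketch below illustrates only that structure; all impact figures are hypothetical placeholders, not the paper's measured LCA data:

```python
# Hypothetical, illustrative impact scores per life-cycle stage (NOT measured data).
LIFE_CYCLE_IMPACTS = {
    "red brick":  {"manufacture": 8.0, "use": 2.0, "disposal": 1.5},
    "mud":        {"manufacture": 1.0, "use": 3.0, "disposal": 0.5},
    "soil block": {"manufacture": 2.0, "use": 2.5, "disposal": 0.5},
}

def total_impact(material):
    """Sum a material's impact over all life-cycle stages (cradle to grave)."""
    return sum(LIFE_CYCLE_IMPACTS[material].values())

def rank_materials():
    """Order wall materials from lowest to highest total life-cycle impact."""
    return sorted(LIFE_CYCLE_IMPACTS, key=total_impact)
```

With these placeholder numbers, a fired material scores poorly because of its manufacture stage, mirroring the paper's point that red brick's kiln firing dominates its footprint.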

Relevance:

20.00%

Publisher:

Abstract:

In the past decade, the analysis of data has faced the challenge of dealing with very large and complex datasets and the real-time generation of data. Technologies to store and access these complex and large datasets are in place. However, robust and scalable analysis technologies are needed to extract meaningful information from them. The research field of Information Visualization and Visual Data Analytics addresses this need. Information visualization and data mining are often used to complement each other. Their common goal is the extraction of meaningful information from complex and possibly large data. However, whereas data mining relies on silicon hardware, visualization techniques also aim to harness the powerful image-processing capabilities of the human brain. This article highlights research on data visualization and visual analytics techniques. Furthermore, we highlight existing visual analytics techniques, systems, and applications, including a perspective on the field from the chemical process industry.

Relevance:

20.00%

Publisher:

Abstract:

Sea surface temperature (SST) can be estimated from day and night observations of the Spinning Enhanced Visible and Infra-Red Imager (SEVIRI) by optimal estimation (OE). We show that exploiting the 8.7 μm channel, in addition to the “traditional” wavelengths of 10.8 and 12.0 μm, improves OE SST retrieval statistics in validation. However, the main benefit is an improvement in the sensitivity of the SST estimate to variability in true SST. In a fair, single-pixel comparison, the 3-channel OE gives better results than the SST estimation technique presently operational within the Ocean and Sea Ice Satellite Application Facility. The operational technique applies SST retrieval coefficients, followed by a bias-correction step informed by radiative transfer simulation. However, the operational technique has an additional “atmospheric correction smoothing”, which improves its noise performance, and hitherto had no analogue within the OE framework. Here, we propose an analogue to atmospheric correction smoothing, based on the expectation that atmospheric total column water vapour has a longer spatial correlation length scale than SST features. The approach extends the observations input to the OE to include the averaged brightness temperatures (BTs) of nearby clear-sky pixels, in addition to the BTs of the pixel for which SST is being retrieved. The retrieved quantities are then the single-pixel SST and the clear-sky total column water vapour averaged over the vicinity of the pixel. This reduces the noise in the retrieved SST significantly. The robust standard deviation of the new OE SST compared to matched drifting buoys becomes 0.39 K for all data. The smoothed OE gives an SST sensitivity of 98% on average. This means that diurnal temperature variability and ocean frontal gradients are more faithfully estimated, and that the influence of the prior SST used is minimal (2%). This benefit is not available using traditional atmospheric correction smoothing.
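The core of the proposed extension is the construction of the observation vector: the pixel's own brightness temperatures are augmented with the per-channel mean BT over nearby clear-sky pixels. The sketch below shows only that assembly step, with list-of-lists inputs as an assumption; the full OE retrieval (prior, covariances, gain matrix) is not reproduced here:

```python
def extended_observation(pixel_bts, neighbour_bts):
    """Build the extended observation vector for the smoothed OE.

    pixel_bts     : BTs (one per channel) of the pixel being retrieved.
    neighbour_bts : list of BT vectors from nearby clear-sky pixels.

    Returns the pixel's own BTs followed by the neighbourhood mean in each
    channel; the averages carry low-noise total-column-water-vapour information
    because TCWV varies on longer length scales than SST."""
    n = len(neighbour_bts)
    means = [sum(bt[ch] for bt in neighbour_bts) / n
             for ch in range(len(pixel_bts))]
    return list(pixel_bts) + means
```

Averaging over N neighbours reduces the random noise on the appended channels by roughly a factor of √N, which is the mechanism by which the smoothed OE lowers the noise in the retrieved SST.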

Relevance:

20.00%

Publisher:

Abstract:

Cities, which are now inhabited by a majority of the world's population, are not only an important source of global environmental and resource depletion problems, but can also act as important centres of technological innovation and social learning in the continuing quest for a low carbon future. Planning and managing large-scale transitions in cities to deal with these pressures require an understanding of urban retrofitting at city scale. In this context performative techniques (such as backcasting and roadmapping) can provide valuable tools for helping cities develop a strategic view of the future. However, it is also important to identify ‘disruptive’ and ‘sustaining’ technologies which may contribute to city-based sustainability transitions. This paper presents research findings from the EPSRC Retrofit 2050 project, and explores the relationship between technology roadmaps and transition theory literature, highlighting the research gaps at urban/city level. The paper develops a research methodology to describe the development of three guiding visions for city-regional retrofit futures, and identifies key sustaining and disruptive technologies at city scale within these visions using foresight (horizon scanning) techniques. The implications of the research for city-based transition studies and related methodologies are discussed.

Relevance:

20.00%

Publisher:

Abstract:

We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data operations. The modified algorithm runs more than 50 times faster on the CELL’s Synergistic Processing Elements than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times as compared to the original code on the main CPU. Because the radiation code takes more than 60% of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
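The task-queue-plus-thread-pool scheduling described above can be sketched as follows. This is a structural illustration only (the real kernel is restructured Fortran-to-C code with SIMD packing, which is not reproduced): independent air columns are placed on a shared queue and worker threads pull them until the queue is empty.

```python
import queue
import threading

def compute_column(column):
    """Stand-in for the per-column radiation computation."""
    return sum(column)

def run_columns(columns, n_workers=4):
    """Schedule independent air columns on a pool of worker threads."""
    tasks = queue.Queue()
    results = {}
    for i, col in enumerate(columns):
        tasks.put((i, col))

    def worker():
        while True:
            try:
                i, col = tasks.get_nowait()   # grab the next column, if any
            except queue.Empty:
                return                        # queue drained: worker exits
            results[i] = compute_column(col)  # each index written exactly once

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return [results[i] for i in range(len(columns))]
```

Because each column is independent and results are keyed by index, the output is deterministic regardless of scheduling order, matching the paper's observation that the parallel version returns bit-wise identical results.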

Relevance:

20.00%

Publisher:

Abstract:

We present five new cloud detection algorithms over land based on dynamic threshold or Bayesian techniques, applicable to the Advanced Along Track Scanning Radiometer (AATSR) instrument, and compare these with the standard threshold-based SADIST cloud detection scheme. We use a manually classified dataset as a reference to assess algorithm performance and quantify the impact of each cloud detection scheme on land surface temperature (LST) retrieval. The use of probabilistic Bayesian cloud detection methods improves algorithm true skill scores by 8-9% over SADIST (maximum score of 77.93% compared to 69.27%). We present an assessment of the impact of imperfect cloud masking, in relation to the reference cloud mask, on the retrieved AATSR LST, imposing a 2 K tolerance over a 3x3 pixel domain. We find an increase of 5-7% in the observations falling within this tolerance when using Bayesian methods (maximum of 92.02% compared to 85.69%). We also demonstrate that the use of dynamic thresholds in the tests employed by SADIST can significantly improve performance, applicable to the cloud-test data to be provided by the Sea and Land Surface Temperature Radiometer (SLSTR) due to be launched on the Sentinel-3 mission (estimated 2014).
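The probabilistic core of a Bayesian cloud mask is a straightforward application of Bayes' rule: combine the likelihood of the observed radiances under the clear and cloudy hypotheses with a prior cloud fraction. The sketch below shows the two-hypothesis posterior only; the likelihood models themselves (radiative transfer simulations, empirical PDFs) are the substance of the paper and are not reproduced:

```python
def cloud_probability(obs_likelihood_clear, obs_likelihood_cloud, prior_cloud):
    """Posterior probability of cloud given the observation, via Bayes' rule.

    obs_likelihood_clear : P(observation | clear)
    obs_likelihood_cloud : P(observation | cloud)
    prior_cloud          : prior probability of cloud at this pixel."""
    p_cloud = obs_likelihood_cloud * prior_cloud
    p_clear = obs_likelihood_clear * (1.0 - prior_cloud)
    return p_cloud / (p_cloud + p_clear)

def cloud_mask(posterior, threshold=0.5):
    """Binary mask: flag the pixel as cloudy above a chosen posterior threshold."""
    return posterior > threshold
```

Unlike a fixed threshold test, the posterior varies smoothly with the evidence, so the masking threshold can be tuned to trade false alarms against missed cloud, which is one reason the Bayesian schemes achieve higher true skill scores.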

Relevance:

20.00%

Publisher:

Abstract:

Older adult computer users often lose track of the mouse cursor and so resort to methods such as shaking the mouse or searching the entire screen to find the cursor again. Hence, this paper describes how a standard optical mouse was modified to include a touch sensor, activated by releasing and touching the mouse, which automatically centers the mouse cursor to the screen, potentially making it easier to find a ‘lost’ cursor. Six older adult computer users and six younger computer users were asked to compare the touch sensitive mouse with cursor centering with two alternative techniques for locating the mouse cursor: manually shaking the mouse and using the Windows sonar facility. The time taken to click on a target after a distractor task was recorded, and results show that centering the mouse was the fastest to use, with a 35% improvement over shaking the mouse. Five out of six older participants ranked the touch sensitive mouse with cursor centering as the easiest to use.
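The recentering behaviour described above amounts to a small state machine: the cursor jumps to the screen centre only when the mouse is grasped after having been released. The sketch below models that logic with hypothetical event names (`on_touch`, `on_release`, `on_move`); the paper's actual device is a hardware-modified optical mouse, not software like this:

```python
class CenteringMouse:
    """Sketch of the touch-sensitive mouse: re-grasping recenters the cursor."""

    def __init__(self, screen_w, screen_h):
        self.screen_w, self.screen_h = screen_w, screen_h
        self.cursor = (0, 0)
        self.touched = True   # assume the user starts with a hand on the mouse

    def on_release(self):
        """Touch sensor reports the hand has left the mouse."""
        self.touched = False

    def on_touch(self):
        """Touch sensor reports contact; recenter only on a fresh grasp."""
        if not self.touched:
            self.cursor = (self.screen_w // 2, self.screen_h // 2)
        self.touched = True

    def on_move(self, dx, dy):
        """Ordinary relative mouse motion."""
        x, y = self.cursor
        self.cursor = (x + dx, y + dy)
```

Gating the jump on a release-then-touch transition is what prevents the cursor from recentering during normal use, while still guaranteeing a known cursor position the moment the user reaches for the mouse.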

Relevance:

20.00%

Publisher:

Abstract:

Time series of global and regional mean Surface Air Temperature (SAT) anomalies are a common metric used to estimate recent climate change. Various techniques can be used to create these time series from meteorological station data. The degree of difference arising from using five different techniques, based on existing temperature anomaly dataset techniques, to estimate Arctic SAT anomalies over land and sea ice were investigated using reanalysis data as a testbed. Techniques which interpolated anomalies were found to result in smaller errors than non-interpolating techniques relative to the reanalysis reference. Kriging techniques provided the smallest errors in estimates of Arctic anomalies and Simple Kriging was often the best kriging method in this study, especially over sea ice. A linear interpolation technique had, on average, Root Mean Square Errors (RMSEs) up to 0.55 K larger than the two kriging techniques tested. Non-interpolating techniques provided the least representative anomaly estimates. Nonetheless, they serve as useful checks for confirming whether estimates from interpolating techniques are reasonable. The interaction of meteorological station coverage with estimation techniques between 1850 and 2011 was simulated using an ensemble dataset comprising repeated individual years (1979-2011). All techniques were found to have larger RMSEs for earlier station coverages. This supports calls for increased data sharing and data rescue, especially in sparsely observed regions such as the Arctic.
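The contrast between interpolating and non-interpolating techniques can be illustrated with a toy comparison. The sketch below uses inverse-distance weighting as a simple stand-in for the kriging methods studied (kriging additionally models spatial covariance, which is omitted here), against a non-interpolating estimate that assigns the plain station mean everywhere:

```python
import math

def idw_estimate(stations, grid_points, power=2):
    """Interpolating technique: inverse-distance weighting of station anomalies.

    stations    : list of ((x, y), anomaly) pairs.
    grid_points : list of (x, y) locations to estimate."""
    out = []
    for gx, gy in grid_points:
        num = den = 0.0
        for (sx, sy), anom in stations:
            d = math.hypot(gx - sx, gy - sy)
            if d == 0:                     # grid point coincides with a station
                num, den = anom, 1.0
                break
            w = d ** -power
            num += w * anom
            den += w
        out.append(num / den)
    return out

def station_mean_estimate(stations, grid_points):
    """Non-interpolating technique: every grid point gets the station mean."""
    mean = sum(a for _, a in stations) / len(stations)
    return [mean] * len(grid_points)

def rmse(est, truth):
    """Root Mean Square Error of an estimate against a reference field."""
    return math.sqrt(sum((e - t) ** 2 for e, t in zip(est, truth)) / len(truth))
```

Comparing the two estimates against a reference field by RMSE mirrors the study's reanalysis-testbed methodology: wherever anomalies vary spatially, the interpolating estimate tracks the field while the flat station mean cannot.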

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the effect of drama techniques when employed to facilitate the teaching and learning of early-years science. The focus is a lesson intervention designed for a group of children aged between four and five years old. A number of different drama techniques, such as teacher in role, hot seating and miming, were employed for the teaching of the water cycle. The techniques were implemented based on their nature and on what they can offer to young children, considering their previous experiences. Before the intervention began, six children were randomly selected from the whole class and interviewed to identify their initial ideas regarding the water cycle. The same children were interviewed after the intervention to identify how their initial ideas had changed. The results appear to be promising in terms of facilitating children’s scientific understanding and show an improvement in the children’s use of vocabulary in relation to the specific topic.

Relevance:

20.00%

Publisher:

Abstract:

Wall plaster sequences from the Neolithic town of Çatalhöyük have been analysed and compared to three types of natural sediment found in the vicinity of the site, using a range of analytical techniques. Block samples containing the plaster sequences were removed from the walls of several different buildings on the East Mound. Sub-samples were examined by IR spectroscopy, X-ray diffraction and X-ray fluorescence to determine the overall mineralogical and elemental composition, whilst thin sections were studied using optical polarising microscopy, IR Microscopy and Environmental Scanning Electron Microscopy with Energy Dispersive X-ray analysis. The results of this study have shown that there are two types of wall plaster found in the sequences and that the sediments used to produce these were obtained from at least two distinct sources. In particular, the presence of clay, calcite and magnesian calcite in the foundation plasters suggested that these were prepared predominantly from a marl source. On the other hand, the finishing plasters were found to contain dolomite with a small amount of clay and no calcite, revealing that softlime was used in their preparation. Whilst marl is located directly below and around Çatalhöyük, the nearest source of softlime is 6.5 km away, an indication that the latter was important to the Neolithic people, possibly due to the whiter colour (5Y 8/1) of this sediment. Furthermore, the same two plaster types were found on each wall of Building 49, the main building studied in this research, and in all five buildings investigated, suggesting that the use of these sources was an established practice for the inhabitants of several different households across the site.
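The compositional distinction reported above can be expressed as a simple classification rule. The sketch below encodes only the paper's headline finding (dolomite without calcite indicates softlime; clay with calcite or magnesian calcite indicates marl) as an illustrative rule of thumb, not as the full multi-technique analytical workflow:

```python
def classify_plaster(minerals):
    """Classify a plaster layer from its detected mineral set.

    Rule of thumb from the reported compositions: dolomite with no calcite
    points to softlime (finishing plaster); calcite or magnesian calcite with
    clay points to marl (foundation plaster)."""
    if "dolomite" in minerals and "calcite" not in minerals:
        return "softlime (finishing plaster)"
    if "calcite" in minerals or "magnesian calcite" in minerals:
        return "marl (foundation plaster)"
    return "unknown"
```

In practice such a rule would sit downstream of the XRD and IR spectroscopy results, which supply the mineral identifications it consumes.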