86 results for "Visualization technique"

in CentAUR: Central Archive, University of Reading, UK


Relevance: 100.00%

Publisher:

Abstract:

We have studied the degradation of sebaceous fingerprints on brass surfaces using silver electroless deposition (SED) as a visualization technique. Fingerprints were stored on brass squares either (i) in a locked dark cupboard or (ii) in glass-filtered natural daylight for periods of 3 h, 24 h, 1 week, 3 weeks, and 6 weeks. We find that fingerprints on brass surfaces degrade much more rapidly when kept in the light than under dark conditions, with a much higher proportion of high-quality prints remaining after 3 or 6 weeks of ageing when stored in the dark. This effect is more marked than for similar fingerprints on black PVC surfaces. Identifiable prints can be obtained on brass surfaces using both SED and cyanoacrylate fuming (CFM). SED is quick and straightforward to perform. CFM is more time-consuming but is versatile and can be applied to a wider range of metal surfaces than SED, for example brass surfaces that have been coated with a lacquer.

Relevance: 70.00%

Publisher:

Abstract:

As terabyte datasets become the norm, the focus has shifted away from our ability to produce and store ever larger amounts of data and onto its utilization. It is becoming increasingly difficult to gain meaningful insights into the data produced, and many forms of the data we are currently producing do not fit easily into traditional visualization methods. This paper presents a novel visualization technique based on the concept of a Data Forest. Our Data Forest has been designed to use virtual reality (VR) as its presentation method. VR is a natural medium for investigating large datasets. Our approach can easily be adapted for use in a variety of different ways, from a stand-alone single-user environment to large multi-user collaborative environments. A test application using multi-dimensional data is presented to demonstrate the concepts involved.

Relevance: 60.00%

Publisher:

Abstract:

As we increase our ability to produce and store ever larger amounts of data, it is becoming increasingly difficult to understand what the data is telling us. Not all the data we are currently producing fits easily into traditional visualization methods. This paper presents a novel visualization technique based on the concept of a Data Forest. Our Data Forest has been developed for use with virtual reality (VR) systems. VR is a natural information medium. This approach can easily be adapted for use in collaborative environments. A test application has been developed to demonstrate the concepts involved, and a collaborative version has been tested.

Relevance: 30.00%

Publisher:

Abstract:

The identification and visualization of clusters formed by motor unit action potentials (MUAPs) is an essential step in investigations seeking to explain the control of the neuromuscular system. This work introduces the generative topographic mapping (GTM), a novel machine learning tool, for clustering of MUAPs, and extends the GTM technique to provide a way of visualizing MUAPs. The performance of GTM was compared to that of three other clustering methods: the self-organizing map (SOM), a Gaussian mixture model (GMM), and the neural-gas network (NGN). The results, based on the study of experimental MUAPs, showed that both GTM and SOM achieved higher success rates than GMM and NGN, and that GTM may in practice be used as a principled alternative to the SOM in the study of MUAPs. A visualization tool, which we call the GTM grid, was devised for visualizing MUAPs lying in a high-dimensional space. The visualization provided by the GTM grid was compared with that obtained from principal component analysis (PCA). (c) 2005 Elsevier Ireland Ltd. All rights reserved.
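The PCA baseline mentioned in the abstract amounts to projecting the high-dimensional MUAP waveforms onto their top two principal components. A minimal sketch, with synthetic stand-in data (the cluster offsets and dimensions are invented, purely for illustration):

```python
import numpy as np

def pca_project(X, n_components=2):
    """Project the rows of X onto the top principal components via SVD."""
    Xc = X - X.mean(axis=0)              # centre each feature
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T      # scores in the reduced space

# Synthetic stand-in for high-dimensional MUAP waveforms: two clusters
# of 64-sample "waveforms" separated by a constant offset.
rng = np.random.default_rng(0)
cluster_a = rng.normal(0.0, 0.1, size=(50, 64))
cluster_b = rng.normal(1.0, 0.1, size=(50, 64))
X = np.vstack([cluster_a, cluster_b])

Y = pca_project(X)                       # shape (100, 2), ready to plot
```

The first principal component picks up the between-cluster direction, so the two groups separate cleanly in the 2-D view, which is what a visualization such as the GTM grid is compared against.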

Relevance: 30.00%

Publisher:

Abstract:

The Self-Organizing Map (SOM) is a popular unsupervised neural network able to provide effective clustering and data visualization for multidimensional input datasets. In this paper, we present an application of the simulated annealing procedure to the SOM learning algorithm, with the aim of obtaining faster learning and better performance in terms of quantization error. The proposed learning algorithm, called the Fast Learning Self-Organized Map, does not compromise the simplicity of the standard SOM's learning algorithm. It also improves the quality of the resulting maps, providing better clustering quality and topology preservation of the input multi-dimensional data. Several experiments compare the proposed approach with the original algorithm and some of its modifications and speed-up techniques.
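The abstract does not give the paper's exact annealing schedule; the sketch below only illustrates the general idea of applying a cooling schedule to SOM training: a minimal 1-D SOM whose learning rate and neighbourhood width decay exponentially (all parameter values are invented):

```python
import numpy as np

def train_som(data, n_units=10, epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal 1-D SOM in which the learning rate and neighbourhood
    width are annealed (decayed exponentially) over training."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(data.min(), data.max(), size=(n_units, data.shape[1]))
    for t in range(epochs):
        cool = np.exp(-3.0 * t / epochs)                    # cooling factor
        lr, sigma = lr0 * cool, sigma0 * cool
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(W - x, axis=1))  # best-matching unit
            d = np.abs(np.arange(n_units) - bmu)            # distance on the grid
            h = np.exp(-(d ** 2) / (2.0 * sigma ** 2))      # neighbourhood kernel
            W += lr * h[:, None] * (x - W)
    return W

def quantization_error(data, W):
    """Mean distance from each sample to its best-matching unit."""
    return float(np.mean([np.linalg.norm(W - x, axis=1).min() for x in data]))

# Two well-separated synthetic clusters as toy input.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.0, 0.1, (40, 2)),
                  rng.normal(2.0, 0.1, (40, 2))])
W = train_som(data)
```

Early in training the map is plastic (large `lr`, wide neighbourhood); the cooling gradually freezes it, which is the mechanism a simulated-annealing-style schedule exploits to trade early exploration for a low final quantization error.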

Relevance: 30.00%

Publisher:

Abstract:

This paper introduces the Hilbert Analysis (HA), a novel digital signal processing technique, for the investigation of tremor. The HA comprises two complementary tools: the Empirical Mode Decomposition (EMD) and the Hilbert Spectrum (HS). In this work we show that the EMD can automatically detect and isolate tremulous and voluntary movements in experimental signals collected from 31 patients with different conditions. Our results also suggest that tremor may be described by a new class of mathematical functions defined in the HA framework. In a further study, the HS was employed to visualize the energy activities of signals. This tool introduces the concept of instantaneous frequency into the field of tremor and can provide, in a time-frequency-energy plot, a clear visualization of local activities of tremor energy over time. The HA proved very useful for objective measurement of any kind of tremor and can therefore be used for functional assessment.
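The instantaneous-frequency idea at the heart of the Hilbert Spectrum can be illustrated with a plain FFT-based analytic signal, a small stand-in for the full EMD+HS pipeline (which is far more involved): for a pure 5 Hz tone, the derivative of the unwrapped phase recovers 5 Hz.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (equivalent to adding i * Hilbert(x))."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

fs = 1000.0                                     # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)
x = np.cos(2 * np.pi * 5.0 * t)                 # a 5 Hz "tremor-like" tone

phase = np.unwrap(np.angle(analytic_signal(x)))
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency, Hz
```

For real tremor records, the EMD step would first decompose the signal into oscillatory modes, and this phase-derivative computation would be applied per mode to build the time-frequency-energy plot described above.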

Relevance: 20.00%

Publisher:

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts.
G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run.
(5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution, currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
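The incremental output-transfer behaviour described in point (4) can be sketched as a client loop: while the remote run is alive, poll it, copy back any new output files, and delete them remotely. The stub class below is invented for illustration and is not the real G-Rex API, which the abstract does not document:

```python
# Hypothetical sketch of the G-Rex output-transfer pattern. RemoteRunStub
# stands in for a remote model run; it is NOT the real G-Rex client API.

class RemoteRunStub:
    """Stand-in for a remote model run that emits one file per step."""
    def __init__(self, n_steps):
        self.n_steps = n_steps
        self.step = 0
        self.pending = []                # output files awaiting transfer

    def advance(self):                   # one "model year" completes
        if self.step < self.n_steps:
            self.step += 1
            self.pending.append(f"output_year_{self.step}.nc")

    def finished(self):
        return self.step >= self.n_steps

    def drain(self):
        """Hand over pending files and delete them from the remote side."""
        files, self.pending = self.pending, []
        return files

def run_with_streaming_transfer(run):
    """Client loop: transfer output incrementally, so nothing accumulates
    on the remote system and the user can monitor the run as it goes."""
    local = []
    while not run.finished():
        run.advance()
        local.extend(run.drain())        # download + remote delete
    local.extend(run.drain())            # catch anything left at the end
    return local

files = run_with_streaming_transfer(RemoteRunStub(n_steps=3))
```

The design point is that the local workflow script sees output files appear continuously, exactly as it would for a local run, which is what lets existing scripts work with only the "mpirun" to "GRexRun" substitution.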

Relevance: 20.00%

Publisher:

Abstract:

We describe a remote sensing method for measuring the internal interface height field in a rotating, two-layer annulus laboratory experiment. The method is non-invasive, avoiding the possibility of an interaction between the flow and the measurement device. The height fields retrieved are accurate and highly resolved in both space and time. The technique is based on a flow visualization method developed by previous workers, and relies upon the optical rotation properties of the working liquids. The previous methods returned only qualitative interface maps, however. In the present study, a technique is developed for deriving quantitative maps by calibrating height against the colour fields registered by a camera which views the flow from above. We use a layer-wise torque balance analysis to determine the equilibrium interface height field analytically, in order to derive the calibration curves. With the current system, viewing an annulus of outer radius 125 mm and depth 250 mm from a distance of 2 m, the inferred height fields have horizontal, vertical and temporal resolutions of up to 0.2 mm, 1 mm and 0.04 s, respectively.
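The calibration step described above, mapping a colour value registered by the camera to an interface height via the analytically derived curves, amounts to interpolation along a lookup table. A minimal sketch; the knot values below are invented for illustration, not the experiment's actual calibration data:

```python
import numpy as np

# Hypothetical calibration curve: colour (hue, degrees) against interface
# height (mm), as would be derived from the layer-wise torque balance.
hue_knots = np.array([0.0, 60.0, 120.0, 180.0, 240.0])
height_knots = np.array([0.0, 50.0, 100.0, 150.0, 200.0])

def height_from_hue(hue):
    """Linear interpolation along the calibration curve."""
    return np.interp(hue, hue_knots, height_knots)

h = height_from_hue(90.0)   # halfway between the 60- and 120-degree knots
```

Applied pixel-wise to the camera's colour field, this turns a qualitative interface map into the quantitative, highly resolved height field the method delivers.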

Relevance: 20.00%

Publisher:

Abstract:

GODIVA2 is a dynamic website that provides visual access to several terabytes of physically distributed, four-dimensional environmental data. It allows users to explore large datasets interactively without the need to install new software or download and understand complex data. Through the use of open international standards, GODIVA2 maintains a high level of interoperability with third-party systems, allowing diverse datasets to be mutually compared. Scientists can use the system to search for features in large datasets and to diagnose the output from numerical simulations and data processing algorithms. Data providers around Europe have adopted GODIVA2 as an INSPIRE-compliant dynamic quick-view system for providing visual access to their data.
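The open international standards behind GODIVA2's interoperability include the OGC Web Map Service, under which a map image is requested with a parameterised GetMap URL. The sketch below uses a hypothetical server address and layer name; only the parameter names follow the WMS 1.3.0 standard:

```python
from urllib.parse import urlencode

# Hypothetical endpoint and layer name; parameter names per WMS 1.3.0.
base = "http://example.org/godiva2/wms"
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "OCEAN/temperature",       # invented layer name
    "CRS": "CRS:84",
    "BBOX": "-180,-90,180,90",
    "WIDTH": "512",
    "HEIGHT": "256",
    "FORMAT": "image/png",
    "TIME": "2008-01-01T00:00:00Z",      # extra dimension: time
    "ELEVATION": "-5.0",                 # extra dimension: depth
}
url = base + "?" + urlencode(params)
```

Because the request is just a URL, any standards-compliant client can retrieve and overlay imagery from different providers, which is what allows diverse datasets to be mutually compared.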

Relevance: 20.00%

Publisher:

Abstract:

Sorghum (Sorghum bicolor) was grown for 40 days in a rhizocylinder (a growth container which permits access to rhizosphere and non-rhizosphere soil) in two soils of low P status. Soils were fertilized with different rates of ammonium and nitrate, supplemented with 40 mg phosphorus (P) kg(-1), and inoculated with either Glomus mosseae (Nicol. and Gerd.) or non-mycorrhizal root inoculum. N-Serve (2 mg kg(-1)) was added to prevent nitrification. At harvest, soil from around the roots was collected at distances of 0-5, 5-10, and 10-20 mm from the root core, which was 35 mm in diameter. Sorghum plants, with and without mycorrhiza, grew larger with NH4+ than with NO3- application. After soil pH was measured, suspensions of the same sample were titrated against 0.01 M HCl or 0.01 M NaOH until the soil pH reached the non-planted level. The acid or base requirement for each sample was calculated as mmol H+ or OH- kg(-1) soil. The magnitude of liberated acid or base depended on the form and rate of nitrogen and on soil type. Whether the plant root was uninfected or infected with mycorrhiza, soil pH changes extended up to 5 mm from the root core surface. In both soils, ammonium as an N source resulted in lower soil pH than nitrate; mycorrhizal (VAM) inoculation did not enhance this difference. In mycorrhizal-inoculated soil, P depletion extended up to 20 mm from the root surface; in non-VAM-inoculated soil it extended up to 10 mm and remained unchanged at greater distances. In the mycorrhizal-inoculated soils, the contribution of the 0-5 mm soil zone to P uptake was greater than that of the core soil, reflecting the hyphal contribution to P supply. Nitrogen (N) applications that caused acidification increased P uptake because of increased demand; there is no direct evidence that the increased uptake was due to acidity increasing the solubility of P, although this may have been a minor effect.
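The acid/base requirement reported above is a simple unit conversion: moles of titrant consumed (concentration times volume) expressed per kilogram of soil. A sketch, with the titrant volume and soil mass invented for the example:

```python
def requirement_mmol_per_kg(titrant_molarity, titrant_ml, soil_g):
    """mmol of H+ (or OH-) per kg of soil needed to restore the
    non-planted pH, from the volume of titrant consumed."""
    mmol = titrant_molarity * titrant_ml          # mol/L * mL = mmol
    return mmol / (soil_g / 1000.0)               # normalise per kg of soil

# e.g. 5 mL of 0.01 M HCl consumed by a suspension of 20 g of soil
# (hypothetical numbers): 0.05 mmol over 0.02 kg = 2.5 mmol H+ kg-1.
req = requirement_mmol_per_kg(0.01, 5.0, 20.0)
```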

Relevance: 20.00%

Publisher:

Abstract:

A range of archaeological samples have been examined using FT-IR spectroscopy. These include suspected coprolite samples from the Neolithic site of Catalhoyuk in Turkey, pottery samples from the Roman site of Silchester, UK and the Bronze Age site of Gatas, Spain, and unidentified black residues on pottery sherds from the Roman sites of Springhead and Cambourne, UK. For coprolite samples the aim of FT-IR analysis is identification. Identification of coprolites in the field is based on their distinct orange colour; however, such visual identifications can often be misleading due to their similarity with deposits such as ochre and clay. For pottery the aim is to screen those samples that might contain high levels of organic residues which would be suitable for GC-MS analysis. The experiments have shown coprolites to have distinctive spectra, containing strong peaks from calcite, phosphate and quartz; the presence of phosphorus may be confirmed by SEM-EDX analysis. Pottery containing organic residues of plant and animal origin has also been shown generally to display strong phosphate peaks. FT-IR has distinguished between organic resin and non-organic compositions for the black residues, with differences also being seen between organic samples that have the same physical appearance. Further analysis by GC-MS has confirmed the identification of the coprolites through the presence of coprostanol and bile acids, and shows that the majority of organic pottery residues are either fatty acids or mono- or di-acylglycerols from foodstuffs, or triterpenoid resin compounds exposed to high temperatures. One suspected resin sample was shown to contain no organic residues, and it is seen that resin samples with similar physical appearances have different chemical compositions. FT-IR is proposed as a quick and cheap method of screening archaeological samples before subjecting them to the more expensive and time-consuming method of GC-MS. This will eliminate inorganic samples such as clays and ochre from GC-MS analysis, and will screen those samples which are most likely to have a high concentration of preserved organic residues. (C) 2008 Elsevier B.V. All rights reserved.
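The screening logic described, flag spectra with strong absorbance at diagnostic band positions, can be sketched as below. The band positions are typical literature values for calcite, phosphate and quartz, not taken from the paper, and the coprolite-like spectrum is synthetic:

```python
import numpy as np

# Approximate diagnostic band positions (cm-1); typical literature values,
# assumed here for illustration rather than taken from the paper.
BANDS = {"calcite": [1420.0, 875.0], "phosphate": [1030.0], "quartz": [1080.0]}

def screen(wavenumbers, absorbance, threshold=0.3, window=15.0):
    """Return the components whose every diagnostic band exceeds threshold."""
    hits = []
    for name, bands in BANDS.items():
        strong = all(
            absorbance[np.abs(wavenumbers - b) <= window].max() > threshold
            for b in bands
        )
        if strong:
            hits.append(name)
    return hits

# Synthetic coprolite-like spectrum: strong phosphate and calcite bands,
# modelled as Gaussians on a 400-2000 cm-1 axis.
wn = np.linspace(400.0, 2000.0, 1601)
spec = np.zeros_like(wn)
for centre, height in [(1030.0, 1.0), (1420.0, 0.6), (875.0, 0.5)]:
    spec += height * np.exp(-((wn - centre) ** 2) / (2 * 10.0 ** 2))

found = screen(wn, spec)
```

A sample flagged this way (strong phosphate plus calcite) would go forward to GC-MS, while spectra dominated by clay or ochre bands would be eliminated, which is the cheap pre-screening role proposed for FT-IR.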