917 results for rapid thermal processing (RTP)


Relevance:

20.00%

Publisher:

Abstract:

The signal processing techniques developed for the diagnostics of mechanical components operating in stationary conditions are often not applicable, or lose effectiveness, when applied to signals measured in transient conditions. In this chapter, an original signal processing tool is developed that exploits data-adaptive techniques such as Empirical Mode Decomposition and Minimum Entropy Deconvolution, together with the analytical approach of the Hilbert transform. The tool has been developed to detect localized faults on bearings of the traction systems of high-speed trains, and it is more effective at detecting a fault in non-stationary conditions than signal processing tools based on envelope analysis or spectral kurtosis, which have until now been the benchmark for bearing diagnostics.
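As a concrete point of reference, the envelope-analysis baseline mentioned above can be sketched in a few lines: the Hilbert transform gives the analytic signal, whose magnitude (the envelope) is demodulated and inspected for the fault frequency. The signal, sampling rate and modulation frequency below are illustrative assumptions, not data from the chapter.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (the standard Hilbert-transform construction)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

# Synthetic vibration signal: a 5 Hz "fault" modulation on a 50 Hz carrier.
fs = 1000                                  # sampling rate (Hz), illustrative
t = np.arange(0, 2.0, 1 / fs)
x = (1 + 0.5 * np.cos(2 * np.pi * 5 * t)) * np.sin(2 * np.pi * 50 * t)

envelope = np.abs(analytic_signal(x))      # demodulated envelope
spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
fault_freq = freqs[np.argmax(spec)]        # peak = modulation (fault) frequency
print(fault_freq)  # 5.0
```

The chapter's tool additionally pre-processes the signal (e.g. with Empirical Mode Decomposition and Minimum Entropy Deconvolution) before any such demodulation, which is what makes it effective in non-stationary conditions.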

Efficient and effective feature detection and representation is an important consideration when processing videos, and a large number of applications such as motion analysis, 3D scene understanding and tracking depend on it. Amongst several feature description methods, local features are becoming increasingly popular for representing videos because of their simplicity and efficiency. While they achieve state-of-the-art performance with low computational complexity, their performance is still too limited for real-world applications. Furthermore, rapid increases in the uptake of mobile devices have increased the demand for algorithms that can run with reduced memory and computational requirements. In this paper we propose a semi-binary feature detector-descriptor based on the BRISK detector, which can detect and represent videos with significantly reduced computational requirements while achieving performance comparable to state-of-the-art spatio-temporal feature descriptors. First, the BRISK feature detector is applied on a frame-by-frame basis to detect interest points; the detected key points are then compared against consecutive frames for significant motion. Key points with significant motion are encoded with the BRISK descriptor in the spatial domain and the Motion Boundary Histogram in the temporal domain. This descriptor is not only lightweight but also has lower memory requirements because of the binary nature of the BRISK descriptor, opening up applications on handheld devices. We evaluate the combined detector-descriptor performance in the context of action classification with a standard, popular bag-of-features and SVM framework. Experiments are carried out on two popular datasets of varying complexity, and we demonstrate performance comparable to other descriptors at reduced computational complexity.
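The "significant motion" filtering step described above can be sketched as follows. The BRISK detection itself would come from a library such as OpenCV; the keypoint coordinates and the 2-pixel displacement threshold here are illustrative assumptions (the abstract does not specify a threshold), and matched keypoint pairs across consecutive frames are assumed as input.

```python
import numpy as np

def filter_moving_keypoints(pts_prev, pts_curr, min_disp=2.0):
    """Keep key points whose displacement between consecutive frames
    exceeds min_disp pixels (the 'significant motion' test; the
    threshold value is an illustrative assumption)."""
    disp = np.linalg.norm(pts_curr - pts_prev, axis=1)
    return pts_curr[disp >= min_disp], disp

# Hypothetical matched keypoint positions in two consecutive frames:
prev = np.array([[10.0, 10.0], [50.0, 80.0], [120.0, 40.0]])
curr = np.array([[10.5, 10.2], [55.0, 84.0], [120.1, 40.0]])
moving, disp = filter_moving_keypoints(prev, curr)
print(len(moving))  # 1: only the second key point moved noticeably
```

Only the surviving key points would then be encoded with the BRISK descriptor (spatial) and the Motion Boundary Histogram (temporal), which is what keeps the representation lightweight.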

Acoustic recordings of the environment are an important aid to ecologists monitoring biodiversity and environmental health. However, rapid advances in recording technology, storage and computing make it possible to accumulate thousands of hours of recordings, of which ecologists can listen to only a small fraction. The big-data challenge is to visualize the content of long-duration audio recordings on multiple scales, from hours and days to months and years. The visualization should facilitate navigation and yield ecologically meaningful information. Our approach is to extract, at one-minute resolution, acoustic indices which reflect content of ecological interest. An acoustic index is a statistic that summarizes some aspect of the distribution of acoustic energy in a recording. We combine indices to produce false-colour images that reveal acoustic content and facilitate navigation through recordings that are months or even years in duration.
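As an illustration of what an acoustic index is, the sketch below computes one simple candidate, normalized spectral entropy, which separates tonal sound (e.g. a bird call) from broadband sound (e.g. wind). The specific indices used in the work above are not reproduced here, and the signals are synthetic; in a false-colour image, three such per-minute indices would be mapped to the red, green and blue channels.

```python
import numpy as np

def spectral_entropy(frame):
    """One simple acoustic index: normalized entropy of the power
    spectrum (low for tonal sounds, high for broadband noise)."""
    power = np.abs(np.fft.rfft(frame)) ** 2
    p = power / power.sum()
    p = p[p > 0]                       # drop empty bins before taking logs
    return -np.sum(p * np.log2(p)) / np.log2(len(power))

fs = 8000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 440 * t)     # tonal signal: energy in one bin
rng = np.random.default_rng(0)
noise = rng.standard_normal(fs)        # broadband signal: energy spread out
print(spectral_entropy(tone) < spectral_entropy(noise))  # True
```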

A test of the useful field of view was introduced more than two decades ago and was designed to reflect the visual difficulties that older adults experience with everyday tasks. Importantly, the useful field of view is one of the most extensively researched and promising predictor tests for a range of driving outcome measures, including driving ability and crash risk, as well as for other everyday tasks. Currently available commercial versions of the test can be administered using personal computers and measure the speed of visual processing for rapid detection and localization of targets under conditions of divided visual attention and in the presence and absence of visual clutter. The test is believed to assess higher-order cognitive abilities, but performance also relies on visual sensory function, since targets must be visible in order to be attended to. The format of the useful field of view test has been modified over the years; the original version estimated the spatial extent of the useful field of view, while the latest versions measure visual processing speed. While deficits in the useful field of view are associated with functional impairments in everyday activities in older adults, there is also emerging evidence from several research groups that improvements in visual processing speed can be achieved through training. These improvements have been shown to reduce crash risk and to have a positive impact on health and functional well-being, with the potential to increase the mobility, and hence independence, of older adults.

Incorporating a learner's level of cognitive processing into Learning Analytics presents opportunities for obtaining rich data on the learning process. We propose a framework called COPA that provides a basis for mapping levels of cognitive operation into a learning analytics system. We utilise Bloom's taxonomy, a theoretically respected conceptualisation of cognitive processing, and apply it in a flexible structure that can be implemented incrementally and with varying degrees of complexity within an educational organisation. We outline how the framework is applied, together with its key benefits and limitations. Finally, we apply COPA to a university undergraduate unit, and demonstrate its utility in identifying key missing elements in the structure of the course.
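A minimal sketch of the underlying idea, mapping logged learner actions onto levels of cognitive operation and reporting which levels a course covers, is given below. The action names, the mapping and the coverage report are illustrative assumptions, not the COPA specification.

```python
# Hypothetical mapping of logged learner actions to Bloom's taxonomy
# levels, in the spirit of the COPA framework (names are assumptions).
BLOOM_LEVELS = ["remember", "understand", "apply", "analyse", "evaluate", "create"]

ACTION_TO_LEVEL = {
    "view_lecture": "remember",
    "post_summary": "understand",
    "submit_quiz": "apply",
    "peer_review": "evaluate",
    "upload_project": "create",
}

def level_coverage(event_log):
    """Report which cognitive levels a course's logged activity covers,
    exposing missing elements in the course structure."""
    seen = {ACTION_TO_LEVEL[a] for a in event_log if a in ACTION_TO_LEVEL}
    return {level: (level in seen) for level in BLOOM_LEVELS}

log = ["view_lecture", "submit_quiz", "view_lecture", "peer_review"]
coverage = level_coverage(log)
print([lvl for lvl, ok in coverage.items() if not ok])  # ['understand', 'analyse', 'create']
```

A report like this is one way the framework's "key missing elements" finding could surface in practice: levels of Bloom's taxonomy that no assessed activity ever touches.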

Sugar cane processing sites are characterised by high sugar/hemicellulose levels, available moisture and warm conditions, and are relatively unexplored, unique microbial environments. The PhyloChip microarray was used to investigate bacterial diversity and community composition in three Australian sugar cane processing plants. These ecosystems were highly complex and dominated by four main phyla: Firmicutes (the most dominant), followed by Proteobacteria, Bacteroidetes, and Chloroflexi. Significant variation (p < 0.05) in community structure occurred between samples collected from ‘floor dump sediment’, ‘cooling tower water’, and ‘bagasse leachate’. Many bacterial classes contributed to these differences; however, most were of low numerical abundance. Separation in community composition was also linked to classes of Firmicutes, particularly Bacillales, Lactobacillales and Clostridiales, whose dominance is likely to be linked to their physiology as ‘lactic acid bacteria’, capable of fermenting the sugars present. This process may help displace other bacterial taxa, providing a competitive advantage for Firmicutes bacteria.

Energy auditing is an effective but costly approach for reducing the long-term energy consumption of buildings. When well-executed, energy loss can be quickly identified in the building structure and its subsystems. This then presents opportunities for improving energy efficiency. We present a low-cost, portable technology called "HeatWave" which allows non-experts to generate detailed 3D surface temperature models for energy auditing. This handheld 3D thermography system consists of two commercially available imaging sensors and a set of software algorithms which can be run on a laptop. The 3D model can be visualized in real-time by the operator so that they can monitor their degree of coverage as the sensors are used to capture data. In addition, results can be analyzed offline using the proposed "Spectra" multispectral visualization toolbox. The presence of surface temperature data in the generated 3D model enables the operator to easily identify and measure thermal irregularities such as thermal bridges, insulation leaks, moisture build-up and HVAC faults. Moreover, 3D models generated from subsequent audits of the same environment can be automatically compared to detect temporal changes in conditions and energy use over time.

This thesis developed a method for real-time and handheld 3D temperature mapping using a combination of off-the-shelf devices and efficient computer algorithms. It contributes a new sensing and data processing framework to the science of 3D thermography, unlocking its potential for application areas such as building energy auditing and industrial monitoring. New techniques for the precise calibration of multi-sensor configurations were developed, along with several algorithms that ensure both accurate and comprehensive surface temperature estimates can be made for rich 3D models as they are generated by a non-expert user.

This research investigated strategies for motorway congestion management from a different angle: namely, how to quickly recover a motorway from congestion at the end of peak hours, given that congestion cannot be eliminated owing to excessive demand during today's long peak hours. The project developed a zone recovery strategy using ramp metering for rapid congestion recovery, and a series of traffic simulation investigations was conducted to evaluate the strategy. The results, from both microscopic and macroscopic simulation, demonstrated the effectiveness of the zone recovery strategy.

This paper presents a novel framework for the unsupervised alignment of an ensemble of temporal sequences. The approach draws inspiration from the axiom that an ensemble of temporal signals stemming from the same source/class should have lower rank when "aligned" rather than "misaligned". Our approach shares similarities with recent state-of-the-art methods for unsupervised image ensemble alignment (e.g. RASL), which break the problem into a set of image alignment problems with well-known solutions (i.e. the Lucas-Kanade algorithm). Similarly, we propose a strategy for decomposing the problem of temporal ensemble alignment into a similar set of independent sequence problems, which we claim can be solved reliably through Dynamic Time Warping (DTW). We demonstrate the utility of our method on the Cohn-Kanade+ dataset by aligning expression onset across multiple sequences, which allows us to automate the rapid discovery of event annotations.
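The per-pair alignment step the framework relies on, Dynamic Time Warping, can be sketched as follows. This is a textbook implementation on 1-D toy sequences, not the paper's code.

```python
import numpy as np

def dtw(x, y):
    """Plain Dynamic Time Warping distance between two 1-D sequences:
    cumulative cost of the cheapest monotone alignment path."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # extend the path by a match, an insertion, or a deletion
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

a = [0, 1, 2, 3, 2, 1]
b = [0, 0, 1, 2, 3, 2, 1]   # same shape, shifted in time
print(dtw(a, b))  # 0.0: DTW absorbs the temporal shift
```

In the ensemble setting described above, repeatedly solving such pairwise problems, while driving the rank of the stacked aligned signals down, plays the role that Lucas-Kanade image alignment plays in RASL.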

MapReduce frameworks such as Hadoop are well suited to handling large sets of data which can be processed separately and independently, with canonical applications in information retrieval and sales record analysis. Rapid advances in sequencing technology have ensured an explosion in the availability of genomic data, with a consequent rise in the importance of large-scale comparative genomics, often involving operations and data relationships which deviate from the classical MapReduce structure. This work examines the application of Hadoop to patterns of this nature, using as our focus a well-established workflow for identifying promoters (binding sites for regulatory proteins) across multiple gene regions and organisms, coupled with the unifying step of assembling these results into a consensus sequence. Our approach demonstrates the utility of Hadoop for problems of this nature, showing how the tyranny of the "dominant decomposition" can be at least partially overcome. It also demonstrates how load balance and the granularity of parallelism can be optimized by pre-processing that splits and reorganizes input files, allowing a wide range of related problems to be brought under the same computational umbrella.
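The "unifying step" of assembling per-region results into a consensus sequence can be sketched outside Hadoop as a simple per-column majority vote. The sequences below are illustrative stand-ins for aligned promoter hits; in a real deployment this would be expressed as a reduce step over the hits emitted by the mappers.

```python
from collections import Counter

def consensus(sequences):
    """Build a consensus sequence from equal-length aligned hits by
    taking the majority base at each position (a simplified stand-in
    for the unifying step of the promoter workflow)."""
    return "".join(
        Counter(column).most_common(1)[0][0]   # majority base per column
        for column in zip(*sequences)
    )

hits = ["TATAAT", "TATGAT", "TACAAT"]          # illustrative aligned hits
print(consensus(hits))  # TATAAT
```

It is this cross-cutting aggregation, relating results that the "dominant decomposition" by gene region keeps apart, that makes the workflow deviate from the classical MapReduce pattern.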

Using Gray and McNaughton's revised Reinforcement Sensitivity Theory (RST), this study investigated the extent to which the Behavioural Approach System (BAS) and the Fight-Flight-Freeze System (FFFS) influence the processing of gain-framed and loss-framed road safety messages and subsequent message acceptance. It was predicted that stronger BAS sensitivity and FFFS sensitivity would be associated with greater processing and acceptance of the gain-framed and loss-framed messages, respectively. Young drivers (N = 80, aged 17–25 years) viewed one of four road safety messages and completed a lexical decision task to assess message processing. Both self-report (e.g., the Corr-Cooper RST-PQ) and behavioural measures (i.e., the CARROT and Q-Task) were used to assess BAS and FFFS traits. Message acceptance was measured via self-report ratings of message effectiveness, behavioural intentions, attitudes and subsequent driving behaviour. The results are discussed in the context of the effect that differences in reward and punishment sensitivities may have on message processing and message acceptance.

The purpose of this review is to showcase the present capabilities of ambient sampling and ionisation technologies for the analysis of polymers and polymer additives by mass spectrometry (MS), while simultaneously highlighting their advantages and limitations in a critical fashion. To qualify as an ambient ionisation technique, a method must be able to probe the surface of solid or liquid samples while operating in an open environment, allowing a variety of sample sizes, shapes, and substrate materials to be analysed. The main sections of this review are guided by the underlying principle governing the desorption/extraction step of the analysis (liquid extraction, laser ablation, or thermal desorption) and by the major component investigated, either the polymer itself or exogenous compounds (additives and contaminants) present within or on the polymer substrate. The review concludes by summarising some of the challenges these technologies still face and possible directions that would further enhance the utility of ambient ionisation mass spectrometry as a tool for polymer analysis. © 2013 Elsevier B.V. All rights reserved.

Rationale: Both traditional electron ionization and electrospray ionization tandem mass spectrometry have demonstrated limitations in the unambiguous identification of fatty acids. In the former case, high electron energies lead to extensive dissociation of the radical cations, from which little specific structural information can be obtained. In the latter, conventional collision-induced dissociation (CID) of even-electron ions provides little intra-chain fragmentation and thus few structural diagnostics. New approaches that harness the desirable features of both methods, namely radical-driven dissociation with discrete energy deposition, are thus required. Methods: Herein we describe the derivatization of a structurally diverse suite of fatty acids as 4-iodobenzyl esters (FAIBE). Electrospray ionization of these derivatives in the presence of sodium acetate yields abundant [M+Na]+ ions that can be mass-selected and subjected to laser irradiation (λ = 266 nm) on a modified linear ion-trap mass spectrometer. Results: Photodissociation (PD) of the FAIBE derivatives yields abundant radical cations by loss of atomic iodine, and in several cases selective dissociation of activated carbon–carbon bonds (e.g., at allylic positions) is also observed. Subsequent CID of the [M+Na−I]•+ radical cations yields radical-directed dissociation (RDD) mass spectra that reveal extensive carbon–carbon bond dissociation without scrambling of molecular information. Conclusions: Both PD and RDD spectra obtained from derivatized fatty acids provide a wealth of structural information, including the position(s) of unsaturation, chain branching and hydroxylation. The structural information obtained by this approach, in particular the ability to rapidly differentiate isomeric lipids, represents a useful addition to the lipidomics toolbox. Copyright © 2013 John Wiley & Sons, Ltd.

Purpose. To establish a simple and rapid analytical method, based on direct insertion/electron ionization-mass spectrometry (DI/EI-MS), for measuring free cholesterol in tears from humans and rabbits. Methods. A stable-isotope dilution protocol employing DI/EI-MS in selected ion monitoring (SIM) mode was developed and validated, and used to quantify the free cholesterol content of human and rabbit tear extracts. Tears were collected from adult humans (n = 15) and rabbits (n = 10) and lipids extracted. Results. Screening full-scan (m/z 40–600) DI/EI-MS analysis of crude tear extracts showed that the diagnostic ions located in the mass range m/z 350–400 derived from free cholesterol, with no contribution from cholesterol esters. DI/EI-MS data acquired in SIM mode were analyzed for the abundance ratios of diagnostic ions relative to their stable isotope-labeled analogues arising from the D6-cholesterol internal standard. Standard curves of good linearity were produced, with an on-probe limit of detection of 3 ng (at 3:1 signal-to-noise) and a limit of quantification of 8 ng (at 10:1 signal-to-noise). The concentration of free cholesterol in human tears was 15 ± 6 μg/g, which was higher than in rabbit tears (10 ± 5 μg/g). Conclusions. A stable-isotope dilution DI/EI-SIM method for free cholesterol quantification without prior chromatographic separation was established. Using this method demonstrated that humans have higher free cholesterol levels in their tears than rabbits, in agreement with previous reports. This paper provides a rapid and reliable method to measure free cholesterol in small-volume clinical samples. © 2013 The Association for Research in Vision and Ophthalmology, Inc.
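The core of stable-isotope dilution quantification is a simple ratio calculation, sketched below under the simplifying assumption of equal instrument response for the analyte and the D6-labelled internal standard (i.e. a calibration slope of 1). The ion abundances and spike amount are illustrative, not values from the study.

```python
def quantify_by_isotope_dilution(analyte_area, istd_area, istd_amount_ng):
    """Stable-isotope dilution: the analyte amount equals the abundance
    ratio of the analyte's diagnostic ion to that of the labelled
    internal standard (here D6-cholesterol), multiplied by the spiked
    standard amount. Assumes a 1:1 response factor for simplicity."""
    return (analyte_area / istd_area) * istd_amount_ng

# e.g. analyte ion twice as abundant as the standard, 50 ng standard spiked:
print(quantify_by_isotope_dilution(2.4e6, 1.2e6, 50.0))  # 100.0 ng
```

In practice the standard curve described above corrects for any deviation from the 1:1 response assumed here.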