889 results for Method Evaluation
Abstract:
Current methods for estimating event-related potentials (ERPs) assume stationarity of the signal. Empirical Mode Decomposition (EMD) is a data-driven decomposition technique that does not assume stationarity. We evaluated an EMD-based method for estimating the ERP. On simulated data, EMD substantially reduced background EEG while retaining the ERP, and EMD-denoised single trials estimated the shape, amplitude, and latency of the ERP better than raw single trials. On experimental data, EMD-denoised trials revealed event-related differences between two conditions (A and B) more effectively than trials low-pass filtered at 40 Hz, and the differences revealed in both conditions were clearer and of longer duration than those obtained with low-pass filtering. Thus, EMD-based denoising is a promising data-driven, nonstationary method for estimating ERPs and should be investigated further.
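A minimal sketch of this kind of EMD-based denoising, assuming the third-party PyEMD package and an illustrative rule that discards the highest-frequency intrinsic mode functions; the actual IMF-selection criterion used in the study is not reproduced here:

```python
import numpy as np
from PyEMD import EMD  # third-party package; assumed available

def emd_denoise(trial, n_drop=2):
    """Decompose a single EEG trial into IMFs and reconstruct it without the
    first `n_drop` (highest-frequency) modes. The selection rule is illustrative only."""
    imfs = EMD()(trial)        # rows are IMFs, ordered roughly from high to low frequency
    kept = imfs[n_drop:]       # drop the fastest oscillations (background EEG)
    return kept.sum(axis=0)    # partial reconstruction approximates the ERP

# usage: denoised_trials = np.array([emd_denoise(t) for t in trials])
```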
Abstract:
The traditional plate count method and real-time PCR assays based on SYBR Green I and TaqMan chemistries, using a specific primer pair and probe for amplification of the iap gene, were used for quantitative detection of Listeria monocytogenes in seven decimal serial dilution series of nutrient broth and milk samples containing 1.58 to 1.58 × 10⁷ cfu/ml, and the real-time PCR methods were compared with the plate count method with respect to accuracy and sensitivity. In this study, the plate count method was performed by surface-plating 0.1 ml of each sample on Palcam agar. The lowest detectable level for this method was 1.58 × 10 cfu/ml for both nutrient broth and milk samples. Using purified DNA as a template for generation of standard curves, as few as four copies of the iap gene could be detected per reaction with both real-time PCR assays, indicating that they were highly sensitive. When these real-time PCR assays were applied to quantification of L. monocytogenes in the decimal serial dilution series of nutrient broth and milk samples, 3.16 × 10 to 3.16 × 10⁵ copies per reaction (equivalent to 1.58 × 10³ to 1.58 × 10⁷ cfu/ml of L. monocytogenes) were detectable. Expressed as logarithmic cycles, the quantitative results of the detectable dilution steps were similar to the inoculation levels for the plate count method and for both molecular assays.
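Quantification from a real-time PCR standard curve of this kind reduces to a linear fit of threshold cycle (Ct) against log10 copy number. The sketch below uses made-up calibration values and only illustrates the arithmetic, not the study's data:

```python
import numpy as np

# hypothetical calibration: serial dilutions of purified DNA (copies/reaction) and measured Ct
copies = np.array([4, 40, 400, 4e3, 4e4, 4e5])
ct     = np.array([36.1, 32.8, 29.4, 26.0, 22.7, 19.3])

# standard curve: Ct = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1          # amplification efficiency derived from the slope

def copies_from_ct(ct_unknown):
    """Interpolate an unknown sample's copy number from its Ct value."""
    return 10 ** ((ct_unknown - intercept) / slope)

print(f"efficiency = {efficiency:.2f}, unknown ~ {copies_from_ct(24.5):.0f} copies/reaction")
```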
Abstract:
The aim of this study was to investigate the antimicrobial properties of fifteen selected strains belonging to the Lactobacillus, Bifidobacterium, Lactococcus, Streptococcus and Bacillus genera against Gram-positive and Gram-negative pathogenic bacteria. In vitro antibacterial activity was initially investigated by an agar spot method. Results from the agar spot test showed that most of the selected strains were able to produce active compounds on solid media with antagonistic properties against Salmonella Typhimurium, Escherichia coli, Enterococcus faecalis, Staphylococcus aureus and Clostridium difficile. These results were confirmed when cell-free culture supernatants (CFCS) from the putative probiotics were used in an agar well diffusion assay. Neutralization of the culture supernatants with alkali reduced the antagonistic effects. These experiments confirm the capacity of potential probiotics to inhibit selected pathogens. One of the main inhibitory mechanisms may be the production of organic acids from glucose fermentation and the consequent lowering of culture pH. This was confirmed when the profile of organic acids was analysed, demonstrating that lactic and acetic acid were the principal end products of probiotic metabolism. Furthermore, haemolytic activity and the susceptibility of the strains to the most commonly used antimicrobials, considered basic safety aspects, were also assessed. The observed antimicrobial activity was mainly genus-specific; in addition, significant differences were observed among species.
Abstract:
We present a new approach to determine palaeotemperatures (mean annual surface temperatures) based on measurements of the liquid–vapour homogenisation temperature of fluid inclusions in stalagmites. The aim of this study is to explore the potential and the limitations of this new palaeothermometer and to develop a reliable methodology for routine applications in palaeoclimate research. Therefore, we have investigated recent fluid inclusions from the top part of actively growing stalagmites that have formed at temperatures close to the present-day cave air temperature. A precondition for measuring homogenisation temperatures of originally monophase inclusions is the nucleation of a vapour bubble by means of single ultra-short laser pulses. Based on the observed homogenisation temperatures (Th(obs)) and measurements of the vapour bubble diameter at a known temperature, we calculated stalagmite formation temperatures (Tf) by applying a thermodynamic model that takes into account the effect of surface tension on liquid–vapour homogenisation. Results from recent stalagmite samples demonstrate that calculated stalagmite formation temperatures match the present-day cave air temperature within ± 0.2 °C. To avoid artificially induced changes in fluid density, we defined specific requirements for the selection, handling and preparation of the stalagmite samples. Application of the method is restricted to stalagmites that formed at cave temperatures greater than ~ 9–11 °C.
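The surface-tension effect mentioned above reflects the excess (Laplace) pressure inside a small vapour bubble, 2σ/r, which shifts the observed homogenisation temperature above the bulk value. The sketch below only illustrates that pressure term with generic, assumed numbers; it is not the full thermodynamic model applied by the authors:

```python
def laplace_pressure(sigma_n_per_m, radius_m):
    """Excess pressure (Pa) inside a spherical vapour bubble of given radius,
    from the Young-Laplace relation dP = 2 * sigma / r."""
    return 2.0 * sigma_n_per_m / radius_m

# illustrative numbers only: water-like surface tension and a 1 micrometre bubble
print(laplace_pressure(0.072, 1e-6))  # ~144 kPa; smaller bubbles imply larger corrections
```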
Abstract:
Developments in high-throughput genotyping provide an opportunity to explore the application of marker technology in distinctness, uniformity and stability (DUS) testing of new varieties. We used a large set of molecular markers to assess the feasibility of a UPOV Model 2 approach: “Calibration of threshold levels for molecular characteristics against the minimum distance in traditional characteristics”. We examined 431 winter and spring barley varieties, with data from UK DUS trials comprising 28 characteristics, together with genotype data from 3072 SNP markers. Inter-varietal distances were calculated, and we found higher correlations between molecular and morphological distances than have previously been reported. When varieties were grouped by kinship, the phenotypic and genotypic distances of these groups correlated well. We estimated the minimum number of markers required and showed that there is a ceiling beyond which the correlations do not improve. To investigate the possibility of breaking through this ceiling, we attempted genomic prediction of phenotypes from genotypes, and higher correlations were achieved. We tested distinctness decisions made using either morphological or genotypic distances and found poor correspondence between the two methods.
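A sketch of this kind of distance comparison, assuming SNP genotypes coded 0/1/2 and numeric morphological scores (toy data; the study's exact distance measures and significance testing, e.g. a Mantel test, are not reproduced here):

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
snps  = rng.integers(0, 3, size=(50, 3072)).astype(float)   # 50 varieties x 3072 SNPs (toy data)
morph = rng.normal(size=(50, 28))                            # 28 DUS characteristics (toy data)

geno_dist  = pdist(snps, metric="hamming")     # proportion of differing markers per variety pair
pheno_dist = pdist(morph, metric="euclidean")  # morphological distance per variety pair

r, p = pearsonr(geno_dist, pheno_dist)         # correlation between the two sets of distances
print(f"r = {r:.2f} (p = {p:.3g})")
```

For pairwise distance matrices, a permutation-based Mantel test would give a more defensible p-value than the naive one shown here.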
Abstract:
A new technique for objective classification of boundary layers is applied to ground-based vertically pointing Doppler lidar and sonic anemometer data. The observed boundary layer has been classified into nine different types based on those in the Met Office ‘Lock’ scheme, using vertical velocity variance and skewness, along with attenuated backscatter coefficient and surface sensible heat flux. This new probabilistic method has been applied to three years of data from Chilbolton Observatory in southern England and a climatology of boundary-layer type has been created. A clear diurnal cycle is present in all seasons. The most common boundary-layer type is stable with no cloud (30.0% of the dataset). The most common unstable type is well mixed with no cloud (15.4%). Decoupled stratocumulus is the third most common boundary-layer type (10.3%) and cumulus under stratocumulus occurs 1.0% of the time. The occurrence of stable boundary-layer types is much higher in the winter than the summer and boundary-layer types capped with cumulus cloud are more prevalent in the warm seasons. The most common diurnal evolution of boundary-layer types, occurring on 52 days of our three-year dataset, is that of no cloud with the stability changing from stable to unstable during daylight hours. These results are based on 16,393 hours (62.4% of the three-year dataset) of diagnosed boundary-layer type. This new method is ideally suited to long-term evaluation of boundary-layer type parametrisations in weather forecast and climate models.
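As a caricature of the kind of rule-based assignment underlying such a classification (deterministic and with invented thresholds; the actual method is probabilistic and follows the Met Office scheme types):

```python
def classify_boundary_layer(sensible_heat_flux, w_variance, cloud_base_detected):
    """Toy decision rules using surface sensible heat flux (W m-2), vertical-velocity
    variance (m2 s-2) and lidar cloud detection. Thresholds are illustrative only."""
    stable = sensible_heat_flux < 0
    turbulent = w_variance > 0.1
    if stable:
        return "stable, no cloud" if not cloud_base_detected else "stable, cloud capped"
    if turbulent:
        return "well mixed, no cloud" if not cloud_base_detected else "cumulus capped"
    return "weakly turbulent / residual layer"
```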
Abstract:
Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers the author-selected keyphrases are not appropriate for the document they accompany: for instance, they may be classificatory rather than explanatory, or they may not have been updated when the focus of the paper changed. Automated methods for improving the use of keyphrases are therefore needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method's authors. This not only makes it difficult to incorporate the useful elements of these algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes work undertaken to compare five methods across a common set of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym-based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, followed by the Synonym approach. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters News Corpus. The findings show that authors of Reuters news articles provide good keyphrases when they provide them, but more often than not they provide none at all.
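The two best-performing baselines above, Term Frequency and Inverse Document Frequency, can be sketched in a few lines; the scoring below is a generic TF-IDF formulation, not the exact implementation compared in the paper:

```python
import math
from collections import Counter

def tf_idf_keyphrases(doc_tokens, corpus_token_sets, top_n=5):
    """Rank candidate terms of one document by TF * IDF against a corpus.
    `doc_tokens` is a list of tokens; `corpus_token_sets` is a list of token sets."""
    tf = Counter(doc_tokens)
    n_docs = len(corpus_token_sets)

    def idf(term):
        df = sum(term in d for d in corpus_token_sets)   # document frequency
        return math.log((n_docs + 1) / (df + 1)) + 1     # smoothed inverse document frequency

    scored = {t: tf[t] * idf(t) for t in tf}
    return sorted(scored, key=scored.get, reverse=True)[:top_n]
```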
Abstract:
A changing environment is now the main challenge for most organizations, since they must evaluate appropriate policies to adapt to it. In this paper, we propose a multi-agent simulation method for evaluating policies based on complex adaptive system theory. Furthermore, we propose a semiotic EDA (Epistemic, Deontic, Axiological) agent model to simulate agents' behavior in the system by incorporating the social norms that reflect the policy. A case study is also provided to validate our approach. Our approach presents better adaptability and validity than qualitative analysis and experimental approaches, and the semiotic agent model provides high credibility in simulating agents' behavior.
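A minimal sketch of what an EDA-style agent could look like in a multi-agent policy simulation; the attribute names and decision rule are assumptions made for illustration and are not the authors' model:

```python
from dataclasses import dataclass, field

@dataclass
class EDAAgent:
    """Toy agent holding epistemic (beliefs), deontic (norms/obligations)
    and axiological (values) state, in the spirit of the EDA model."""
    beliefs: dict = field(default_factory=dict)   # epistemic: what the agent takes to be true
    norms: list = field(default_factory=list)     # deontic: predicates encoding policy obligations
    values: dict = field(default_factory=dict)    # axiological: preferences over actions

    def act(self, available_actions):
        # choose the norm-permitted action that best matches the agent's values (illustrative)
        permitted = [a for a in available_actions if all(norm(a) for norm in self.norms)]
        return max(permitted, key=lambda a: self.values.get(a, 0), default=None)
```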
Abstract:
A new online method for analysing water isotopes of speleothem fluid inclusions using a wavelength-scanned cavity ring-down spectroscopy (WS-CRDS) instrument is presented. This novel technique allows us to measure hydrogen and oxygen isotopes simultaneously for a released aliquot of water. To do so, we designed a new, simple line that allows online water extraction and isotope analysis of speleothem samples. The specificity of the method lies in the fact that fluid inclusion release takes place on a standard water background, which mainly improves the robustness of δD. To saturate the line, a peristaltic pump continuously injects standard water into the line, which is permanently heated to 140 °C and flushed with dry nitrogen gas. This permits instantaneous and complete vaporisation of the standard water, resulting in an artificial water background with well-known δD and δ18O values. The speleothem sample is placed in a copper tube attached to the line and, after system stabilisation, is crushed using a simple hydraulic device to liberate the fluid inclusion water. The released water is carried by the nitrogen/standard-water gas stream directly to a Picarro L1102-i for isotope determination. To test the accuracy and reproducibility of the line and to measure standard water during speleothem measurements, a syringe injection unit was added to the line. Peak evaluation is done similarly to gas chromatography to obtain the δD and δ18O isotopic compositions of the measured water aliquots. Precision is better than 1.5 ‰ for δD and 0.4 ‰ for δ18O for water measurements over an extended range (−210 to 0 ‰ for δD and −27 to 0 ‰ for δ18O), depending primarily on the amount of water released from the speleothem fluid inclusions and secondarily on the isotopic composition of the sample. The results show that WS-CRDS technology is suitable for speleothem fluid inclusion measurements and gives results comparable to the isotope ratio mass spectrometry (IRMS) technique.
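Peak evaluation "as in gas chromatography" essentially means integrating the transient deviation of the measured signal above the stable standard-water background. A generic sketch of that step (not the authors' processing code, and omitting the isotope mass-balance against the background):

```python
import numpy as np

def peak_area(signal, baseline_window=200, dt=1.0):
    """Integrate a transient peak above a flat background estimated from the
    first `baseline_window` samples (illustrative only); dt is the sample spacing."""
    baseline = np.mean(signal[:baseline_window])
    return np.sum(signal - baseline) * dt
```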
Abstract:
The decision to close airspace in the event of a volcanic eruption is based on hazard maps of predicted ash extent. These are produced using output from volcanic ash transport and dispersion (VATD) models. In this paper an objective metric to evaluate the spatial accuracy of VATD simulations relative to satellite retrievals of volcanic ash is presented. The metric is based on the fractions skill score (FSS). This measure of skill provides more information than traditional point-by-point metrics, such as the success index and the Pearson correlation coefficient, as it takes into account the spatial scale over which skill is being assessed. The FSS determines the scale over which a simulation has skill and can differentiate between a "near miss" and a forecast that is badly misplaced. The idealised scenarios presented show that even simulations with considerable displacement errors have useful skill when evaluated over neighbourhood scales of 200–700 km². This method could be used to compare forecasts produced by different VATD models or using different model parameters, to assess the impact of assimilating satellite-retrieved ash data, and to evaluate VATD forecasts over a long time period.
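The fractions skill score compares neighbourhood fractions of ash presence in the forecast and the observation. A minimal sketch, assuming binary ash masks on a common grid (a generic FSS implementation, not the paper's code):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fss(forecast_mask, observed_mask, neighbourhood):
    """Fractions skill score for two binary fields on the same grid.
    `neighbourhood` is the square window size in grid points."""
    f = uniform_filter(forecast_mask.astype(float), size=neighbourhood, mode="constant")
    o = uniform_filter(observed_mask.astype(float), size=neighbourhood, mode="constant")
    mse = np.mean((f - o) ** 2)
    mse_ref = np.mean(f ** 2) + np.mean(o ** 2)   # reference (no-skill) mean squared error
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan
```

FSS ranges from 0 (no skill) to 1 (perfect), and typically increases with neighbourhood size, which is what allows it to distinguish a near miss from a badly misplaced forecast.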
Abstract:
Georeferencing is one of the major tasks of satellite-borne remote sensing. Compared with traditional indirect methods, direct georeferencing through a Global Positioning System/inertial navigation system requires fewer and simpler steps to obtain the exterior orientation parameters of remotely sensed images. However, the pixel shift caused by geographic positioning error, which generally derives from boresight angle as well as terrain topography variation, can have a great impact on the precision of georeferencing. The distribution of pixel shifts introduced by the positioning error on a satellite linear push-broom image is quantitatively analyzed. We use variation of the object-space coordinates to simulate different kinds of positioning errors and terrain topography. A total differential method is then applied to establish a rigorous sensor model and mathematically derive the relationship between pixel shift and positioning error. Finally, two simulation experiments are conducted using the imaging parameters of the Chang'E-1 satellite to evaluate two different kinds of positioning errors. The experimental results show that, with the experimental parameters, the maximum pixel shift can reach 1.74 pixels. The proposed approach can be extended to a generic application for imaging-error modeling in remote sensing with terrain variation.
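To first order, a ground positioning error maps into an image-space pixel shift through the imaging geometry. The simplified nadir-looking push-broom relation below is only meant to convey the scale of the effect; it is not the rigorous total-differential sensor model of the paper, and the numbers are assumptions rather than Chang'E-1 parameters:

```python
def pixel_shift(ground_error_m, altitude_m, focal_length_m, pixel_size_m):
    """Approximate pixel shift caused by a horizontal ground positioning error,
    assuming a nadir-looking camera: shift = error * f / (H * pixel_size)."""
    return ground_error_m * focal_length_m / (altitude_m * pixel_size_m)

# illustrative numbers only
print(pixel_shift(ground_error_m=100.0, altitude_m=200e3,
                  focal_length_m=0.2, pixel_size_m=14e-6))  # ~7 pixels
```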
Abstract:
While several privacy protection techniques are presented in the literature, they are not complemented by an established objective evaluation method for their assessment and comparison. This paper proposes an annotation-free evaluation method that assesses the two key aspects of privacy protection, namely privacy and utility. Unlike some existing methods, the proposed method does not rely on subjective judgements and does not assume a specific target type in the image data. The privacy aspect is quantified as an appearance similarity, and the utility aspect is measured as a structural similarity, between the original raw image data and the privacy-protected image data. We performed extensive experimentation using six challenging datasets (including two new ones) to demonstrate the effectiveness of the evaluation method by providing a performance comparison of four state-of-the-art privacy protection techniques.
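The utility side of such an evaluation, measured as a structural similarity between the raw and privacy-protected images, can be computed with standard tooling. A minimal sketch assuming grayscale arrays and scikit-image (the privacy side, appearance similarity, would need a separate measure and is not shown):

```python
from skimage.metrics import structural_similarity as ssim

def utility_score(raw_gray, protected_gray):
    """Structural similarity between the original and the privacy-protected image;
    values near 1 mean the protected image still preserves scene structure (utility)."""
    return ssim(raw_gray, protected_gray, data_range=raw_gray.max() - raw_gray.min())
```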
Abstract:
Designing surgical instruments for robotic-assisted minimally invasive surgery (RAMIS) is challenging due to constraints on the number and type of sensors imposed by considerations such as space or the need for sterilization. A new method for evaluating the usability of virtual teleoperated surgical instruments based on virtual sensors is presented. This method uses virtual prototyping of the surgical instrument with a dual physical interaction, which allows different sensor configurations to be tested in a real environment. Moreover, the proposed approach has been applied to the evaluation of prototypes of a two-finger grasper for lump detection by remote pinching. In this example, the usability of a set of five sensor configurations, each with a different number of force sensors, is evaluated in terms of quantitative and qualitative measures in clinical experiments with 23 volunteers. As a result, the smallest number of force sensors needed in the surgical instrument to ensure the usability of the device can be determined. The details of the experimental setup are also included.
Abstract:
Quantitative palaeoclimate reconstructions are widely used to evaluate climate model performance. Here, as part of an effort to provide such a data set for Australia, we examine the impact of analytical decisions and sampling assumptions on modern-analogue reconstructions using a continent-wide pollen data set. There is a high degree of correlation between temperature variables in the modern climate of Australia, but there is sufficient orthogonality in the variations of precipitation, summer and winter temperature, and plant-available moisture to allow independent reconstructions of these four variables to be made. The method of analogue selection does not affect the reconstructions, although bootstrap resampling provides a more reliable technique for obtaining robust measures of uncertainty. The number of analogues used affects the quality of the reconstructions: the most robust reconstructions are obtained using five analogues. The quality of reconstructions based on post-1850 CE pollen samples differs little from that of reconstructions using samples from between 1450 and 1849 CE, showing that post-European-settlement modification of vegetation has no impact on the fidelity of the reconstructions, although it substantially increases the availability of potential analogues. Reconstructions based on core-top samples are more realistic than those using surface samples, but using only core-top samples would substantially reduce the number of available analogues and therefore increase the uncertainty of the reconstructions. Spatial and/or temporal averaging of pollen assemblages prior to analysis negatively affects the subsequent reconstructions for some variables and increases the associated uncertainties. In addition, the quality of the reconstructions is affected by the degree of spatial smoothing of the original climate data, with the best reconstructions obtained using climate data from a 0.5° resolution grid, which corresponds to the typical size of the pollen catchment. This study provides a methodology that can be used to produce reliable palaeoclimate reconstructions for Australia, which will fill a major gap in the data sets used to evaluate climate models.
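The modern-analogue step itself can be sketched as a nearest-neighbour search on pollen assemblages using squared-chord distance, followed by averaging the climate of the selected analogues. The code below is a generic illustration with toy arrays, not the study's implementation; only the choice of five analogues echoes the abstract:

```python
import numpy as np

def squared_chord(a, b):
    """Squared-chord distance between two pollen assemblages given as proportions."""
    return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)

def reconstruct(fossil_sample, modern_assemblages, modern_climate, k=5):
    """Average the climate values of the k closest modern analogues."""
    d = np.array([squared_chord(fossil_sample, m) for m in modern_assemblages])
    nearest = np.argsort(d)[:k]
    return modern_climate[nearest].mean(axis=0), d[nearest]
```

Bootstrap resampling of the modern data set, as favoured in the abstract, would wrap this reconstruction in repeated draws to yield uncertainty estimates.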
Abstract:
Human body thermoregulation models have been widely used in the fields of human physiology and thermal comfort. However, there are few studies on evaluation methods for these models. This paper summarises the existing evaluation methods and critically analyses their flaws. On that basis, a method for evaluating the accuracy of human body thermoregulation models is proposed. The new evaluation method contributes to the development of human body thermoregulation models and validates their accuracy both statistically and empirically, and it allows the accuracy of different models to be compared. Furthermore, the new method is not only suitable for evaluating human body thermoregulation models but can, in principle, also be applied to evaluating the accuracy of population-based models in other research fields.