56 results for comparison method
Abstract:
Sensitive methods that are currently used to monitor proteolysis by plasmin in milk are limited due to their high cost and lack of standardisation for quality assurance in the various dairy laboratories. In this study, four methods, trinitrobenzene sulphonic acid (TNBS), reverse phase high pressure liquid chromatography (RP-HPLC), gel electrophoresis and fluorescamine, were selected to assess their suitability for the detection of proteolysis in milk by plasmin. Commercial UHT milk was incubated with plasmin at 37 °C for one week. Clarification was achieved by isoelectric precipitation (pH 4·6 soluble extracts) or 6% (final concentration) trichloroacetic acid (TCA). The pH 4·6 and 6% TCA soluble extracts of milk showed high correlations (R² > 0·93) by the TNBS, fluorescamine and RP-HPLC methods, confirming increased proteolysis during storage. For gel electrophoresis, extensive proteolysis was confirmed by the disappearance of α- and β-casein bands on the seventh day, which was most evident at the highest plasmin concentration. This was accompanied by the appearance of α- and β-casein proteolysis products with higher intensities than on previous days, implying that more products had been formed as a result of casein breakdown. The fluorescamine method had a lower detection limit compared with the other methods, whereas gel electrophoresis was the best qualitative method for monitoring β-casein proteolysis products. Although HPLC was the most sensitive, the TNBS method is recommended for routine laboratory analysis on the basis of its accuracy, reliability and simplicity.
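The cross-method agreement reported here rests on linear correlations (R² > 0·93) between measurements on the same extracts. A minimal sketch of that R² calculation, using hypothetical assay readings rather than the study's data:

```python
import numpy as np

# Hypothetical TNBS absorbances and RP-HPLC peak areas for the same set of
# stored-milk extracts (illustrative values only).
tnbs = np.array([0.12, 0.18, 0.25, 0.33, 0.41, 0.52, 0.60])
hplc = np.array([110, 165, 240, 310, 400, 505, 590])

# Least-squares fit and coefficient of determination (R^2), the statistic
# used to compare methods on the pH 4.6 and TCA soluble extracts.
slope, intercept = np.polyfit(tnbs, hplc, 1)
predicted = slope * tnbs + intercept
ss_res = np.sum((hplc - predicted) ** 2)
ss_tot = np.sum((hplc - hplc.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"R^2 between TNBS and RP-HPLC: {r_squared:.3f}")
```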
Abstract:
Integrated Arable Farming Systems (IAFS), which involve a reduction in the use of off-farm inputs, are attracting considerable research interest in the UK. The objectives of these systems experiments are to compare their financial performance with that of conventional or current farming practices. To date, this comparison has taken little account of any environmental benefits (or disbenefits) of the two systems. The objective of this paper is to review the assessment methodologies available for the analysis of environmental impacts. To illustrate the results of this exercise, the methodology and environmental indicators chosen are then applied to data from one of the LINK - Integrated Farming Systems experimental sites. Data from the Pathhead site in Southern Scotland are used to evaluate the use of invertebrates and nitrate loss as environmental indicators within IAFS. The results suggest that between 1992 and 1995 the biomass of earthworms fell by 28 kg per hectare on the integrated rotation and rose by 31 kg per hectare on the conventional system. This led to environmental costs ranging between £2.24 and £13.44 per hectare for the integrated system and gains of between £2.48 and £14.88 per hectare for the conventional system. In terms of nitrate, the integrated system had an estimated loss of £72.21 per hectare, compared with £149.40 per hectare on the conventional system. Conclusions are drawn about the advantages and disadvantages of this type of analytical framework.
Keywords: Farming systems; IAFS; Environmental valuation; Economics; Earthworms; Nitrates; Soil fauna
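The earthworm valuation can be checked from the figures quoted: £2.24–£13.44 for a 28 kg/ha loss implies a unit value of roughly £0.08–£0.48 per kg of biomass (the per-kg range is inferred from the abstract, not stated in it):

```python
# Worked example of the earthworm-biomass valuation implied by the abstract:
# a 28 kg/ha loss costed at GBP 2.24-13.44/ha corresponds to a unit value of
# GBP 0.08-0.48 per kg of earthworm biomass.
unit_value_low, unit_value_high = 0.08, 0.48  # GBP per kg (inferred)

def biomass_value(delta_kg_per_ha: float) -> tuple[float, float]:
    """Environmental cost (negative change) or gain (positive) in GBP/ha."""
    return delta_kg_per_ha * unit_value_low, delta_kg_per_ha * unit_value_high

print(biomass_value(-28))  # integrated rotation: (-2.24, -13.44) -> costs
print(biomass_value(+31))  # conventional system: (2.48, 14.88) -> gains
```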
Abstract:
A Bond Graph is a graphical modelling technique that represents the flow of energy between the components of a system. When Bond Graphs are used to model power electronic systems, elements must be incorporated to represent a switch. In this paper, three different methods of modelling switching devices are compared and contrasted: the Modulated Transformer with a binary modulation ratio (MTF), the ideal switch element, and the Switched Power Junction (SPJ) method. The three methods are used to model a DC-DC boost converter, and simulations are run in MATLAB/SIMULINK. To provide a reference against which to compare results, the converter is also simulated in PSPICE. Both quantitative and qualitative comparisons are made to determine the suitability of each of the three Bond Graph switch models for specific power electronics applications.
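For orientation, here is a minimal time-stepped simulation of a boost converter with an ideal switch. The component values, integration scheme and Python setting are illustrative stand-ins, not the paper's Bond Graph models or its MATLAB/SIMULINK and PSPICE set-ups:

```python
# Switched simulation of a DC-DC boost converter with an ideal switch,
# integrated by forward Euler. Component values are illustrative.
Vin, L, C, R = 12.0, 100e-6, 220e-6, 10.0   # source, inductor, capacitor, load
fs, duty = 50e3, 0.5                         # switching frequency and duty cycle
dt, t_end = 1e-7, 5e-3

iL, vC, t = 0.0, 0.0, 0.0
while t < t_end:
    s = 1.0 if (t * fs) % 1.0 < duty else 0.0  # ideal switch state (on/off)
    diL = (Vin - (1 - s) * vC) / L             # inductor sees Vin (on) or Vin - vC (off)
    dvC = ((1 - s) * iL - vC / R) / C          # capacitor charged only when switch is off
    iL += diL * dt
    vC += dvC * dt
    t += dt

print(f"output after {t_end*1e3:.0f} ms ~ {vC:.1f} V "
      f"(ideal boost: Vin/(1-D) = {Vin/(1-duty):.1f} V)")
```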
Abstract:
There are many published methods available for creating keyphrases for documents. Previous work in the field has shown that, in a significant proportion of cases, author-selected keyphrases are not appropriate for the document they accompany: often they are not updated when the focus of a paper changes, or they are more classificatory than explanatory. This motivates the use of automated methods to improve the use of keyphrases. The published methods are all evaluated using different corpora, typically one relevant to their field of study. This not only makes it difficult to incorporate the useful elements of algorithms in future work but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of six corpora. The methods chosen were term frequency, inverse document frequency, the C-Value, the NC-Value, and a synonym-based approach. These methods were compared to evaluate performance and quality of results, and to provide a future benchmark. It is shown that, with the comparison metric used for this study, term frequency and inverse document frequency were the best algorithms, with the synonym-based approach following them. Further work in the area is required to determine an appropriate (or more appropriate) comparison metric.
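As a rough sketch of the two strongest baselines, the snippet below ranks a document's terms by TF × IDF over a toy corpus; the corpus, tokenisation and candidate selection are placeholders rather than the paper's pipeline:

```python
import math
from collections import Counter

# Toy corpus standing in for the six evaluation corpora.
corpus = [
    "integrated farming systems reduce off-farm inputs",
    "farming systems experiments compare financial performance",
    "keyphrase extraction methods are evaluated on common corpora",
]

def tf_scores(doc: str) -> Counter:
    """Raw term frequency within a single document."""
    return Counter(doc.split())

def idf(term: str, docs: list[str]) -> float:
    """Inverse document frequency across the corpus."""
    df = sum(1 for d in docs if term in d.split())
    return math.log(len(docs) / df) if df else 0.0

doc = corpus[2]
ranked = sorted(tf_scores(doc).items(),
                key=lambda kv: kv[1] * idf(kv[0], corpus), reverse=True)
print(ranked[:5])  # top candidate keyphrases by TF * IDF
```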
Abstract:
Land surface albedo is dependent on atmospheric state and hence is difficult to validate. Over the UK, persistent cloud cover and land cover heterogeneity at moderate (km-scale) spatial resolution can also complicate comparison of field-measured albedo with that derived from instruments such as the Moderate Resolution Imaging Spectroradiometer (MODIS). A practical method of comparing moderate resolution satellite-derived albedo with ground-based measurements over an agricultural site in the UK is presented. Point measurements of albedo made on the ground are scaled up to the MODIS resolution (1 km) through reflectance data obtained at a range of spatial scales. The point measurements of albedo agreed in magnitude with MODIS values over the test site to within a few per cent, despite problems such as persistent cloud cover and the difficulties of comparing measurements made during different years. Albedo values derived from airborne and field-measured data were generally lower than the corresponding satellite-derived values. This is thought to be due to assumptions made regarding the ratio of direct to diffuse illumination used when calculating albedo from reflectance. Measurements of albedo calculated for specific times fitted closely to the trajectories of temporal albedo derived from both the Système pour l'Observation de la Terre (SPOT) Vegetation (VGT) and MODIS instruments.
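The direct/diffuse issue can be made concrete: MODIS-style products supply black-sky (direct-beam) and white-sky (diffuse) albedos, and an actual "blue-sky" albedo is commonly approximated by weighting the two with the diffuse fraction of illumination. A minimal sketch with illustrative values:

```python
# Blue-sky albedo as a mixture of black-sky and white-sky albedo, weighted
# by the diffuse fraction of illumination. Values are illustrative, not the
# study's measurements.
def blue_sky_albedo(black_sky: float, white_sky: float,
                    diffuse_fraction: float) -> float:
    return (1.0 - diffuse_fraction) * black_sky + diffuse_fraction * white_sky

bsa, wsa = 0.16, 0.19
for f in (0.2, 0.5, 0.8):   # clearer sky -> smaller diffuse fraction
    print(f"diffuse fraction {f}: albedo {blue_sky_albedo(bsa, wsa, f):.3f}")
```

Assuming too low a diffuse fraction when converting reflectance to albedo would bias the ground-derived values, in line with the explanation offered in the abstract.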
Abstract:
A specific traditional plate count method and real-time PCR systems based on SYBR Green I and TaqMan technologies, using a specific primer pair and probe for amplification of the iap-gene, were used for quantitative assay of Listeria monocytogenes in seven decimal serial dilution series of nutrient broth and milk samples containing 1.58 to 1.58×10⁷ cfu/ml, and the real-time PCR methods were compared with the plate count method with respect to accuracy and sensitivity. In this study, the plate count method was performed by surface-plating 0.1 ml of each sample on Palcam Agar. The lowest detectable level for this method was 1.58×10 cfu/ml for both nutrient broth and milk samples. Using purified DNA as a template for generation of standard curves, as few as four copies of the iap-gene could be detected per reaction with both real-time PCR assays, indicating that they were highly sensitive. When these real-time PCR assays were applied to quantification of L. monocytogenes in the decimal serial dilution series of nutrient broth and milk samples, 3.16×10 to 3.16×10⁵ copies per reaction (equivalent to 1.58×10³ to 1.58×10⁷ cfu/ml L. monocytogenes) were detectable. Expressed as logarithmic cycles, the quantitative results of the detectable steps for the plate count and both molecular assays were similar to the inoculation levels.
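The quantification step relies on a standard curve in which the cycle threshold (Ct) is linear in log₁₀ copy number, so unknowns are read off a regression fitted to the dilution series. A minimal sketch with hypothetical Ct values:

```python
import numpy as np

# Standard-curve quantification: Ct is linear in log10(copies/reaction).
# Ct values below are illustrative, not the study's data.
log_copies = np.array([1, 2, 3, 4, 5])           # 10^1 .. 10^5 copies/reaction
ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])    # hypothetical Ct values

slope, intercept = np.polyfit(log_copies, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0           # ~1.0 means 100% efficiency

def copies_from_ct(ct_value: float) -> float:
    """Estimate copies/reaction for an unknown sample from its Ct."""
    return 10 ** ((ct_value - intercept) / slope)

print(f"PCR efficiency ~ {efficiency:.2f}")
print(f"Ct 25.0 -> ~{copies_from_ct(25.0):.0f} copies/reaction")
```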
Abstract:
In this study, we compare two different cyclone-tracking algorithms to detect North Atlantic polar lows, which are very intense mesoscale cyclones. Both approaches include spatial filtering, detection, tracking and constraints specific to polar lows. The first method uses digital bandpass-filtered mean sea level pressure (MSLP) fields in the spatial range of 200–600 km and is especially designed for polar lows. The second method also uses a bandpass filter but is based on the discrete cosine transform (DCT) and can be applied to MSLP and vorticity fields. The latter was originally designed for cyclones in general and has been adapted to polar lows for this study. Both algorithms are applied to the same regional climate model output fields from October 1993 to September 1995, produced by dynamical downscaling of the NCEP/NCAR reanalysis data. Comparisons between the two methods show that different filters lead to different numbers and locations of tracks. The DCT is more precise in scale separation than the digital filter, and the results of this study suggest that it is better suited to the bandpass filtering of MSLP fields. The detection and tracking parts also influence the numbers of tracks, although less critically. After a selection process that applies criteria to identify tracks of potential polar lows, differences between the two methods are still visible, though the major systems are identified by both.
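The scale-separation step at the heart of the comparison can be sketched as a DCT bandpass: transform the field, retain only coefficients whose wavelengths lie in the polar-low range, and invert. The grid spacing, field and exact wavenumber convention below are illustrative assumptions:

```python
import numpy as np
from scipy.fft import dctn, idctn

# DCT-based spatial bandpass keeping wavelengths of 200-600 km.
dx = 25.0                                # grid spacing in km (assumed)
field = np.random.randn(128, 128)        # stand-in for an MSLP field

coeffs = dctn(field, norm="ortho")
ny, nx = field.shape
ky = np.arange(ny)[:, None] / (2 * ny * dx)   # cycles per km along y
kx = np.arange(nx)[None, :] / (2 * nx * dx)   # cycles per km along x
wavelength = 1.0 / np.maximum(np.hypot(ky, kx), 1e-12)

mask = (wavelength >= 200.0) & (wavelength <= 600.0)
filtered = idctn(coeffs * mask, norm="ortho")  # mesoscale part of the field
```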
Abstract:
If acid-sensitive drugs or cells are administered orally, there is often a reduction in efficacy associated with gastric passage. Formulation into a polymer matrix is a potential method to improve their stability. The visualization of pH within these materials may help better understand the action of these polymer systems and allow comparison of different formulations. We herein describe the development of a novel confocal laser-scanning microscopy (CLSM) method for visualizing pH changes within polymer matrices and demonstrate its applicability to an enteric formulation based on chitosan-coated alginate gels. The system in question is first shown to protect an acid-sensitive bacterial strain from low pH, before being studied by our technique. Prior to this study it had been claimed that protection by these materials is a result of buffering, but this had not been demonstrated. The visualization of pH within these matrices during exposure to a pH 2.0 simulated gastric solution showed an encroachment of acid from the periphery of the capsule, and a persistence of pH values above 2.0 within the matrix. This implies that the protective effect of the alginate-chitosan matrices is most likely due to a combination of buffering of acid as it enters the polymer matrix and the slowing of acid penetration.
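The proposed mechanism (buffering plus slowed penetration) can be illustrated with a toy one-dimensional diffusion-consumption model; every parameter below is hypothetical and the geometry is simplified to a slab:

```python
import numpy as np

# Toy model: acid diffuses in from the periphery while being consumed by
# the matrix's buffering capacity. All parameters are hypothetical.
n, dx, dt = 100, 1e-5, 1e-3          # grid points, spacing (m), time step (s)
D = 1e-10                            # effective H+ diffusivity in gel (m^2/s)
k_buffer = 0.5                       # first-order buffering sink (1/s)
h = np.zeros(n)                      # H+ concentration profile (mol/L)

for _ in range(20000):               # ~20 s of simulated gastric exposure
    h[0] = 1e-2                      # pH 2.0 boundary at the periphery
    lap = (np.roll(h, -1) - 2 * h + np.roll(h, 1)) / dx**2
    h[1:-1] += dt * (D * lap[1:-1] - k_buffer * h[1:-1])

pH = -np.log10(np.maximum(h, 1e-7))  # clip at neutral; autoionisation ignored
print("pH at 10% and 50% depth:", pH[n // 10], pH[n // 2])
```

Even in this crude sketch, the buffering sink keeps the interior pH well above that of the surrounding medium, consistent with the encroachment-and-persistence picture the CLSM images show.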
Abstract:
Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers author selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method’s authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the Synonym approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters’ News Corpus. The findings show that authors of Reuters’ news articles provide good keyphrases but that more often than not they do not provide any keyphrases.
Abstract:
We present an efficient graph-based algorithm for quantifying the similarity of household-level energy use profiles, using a notion of similarity that allows for small time shifts when comparing profiles. Experimental results on a real smart meter data set demonstrate that in cases of practical interest our technique is far faster than the existing method for computing the same similarity measure. Having a fast algorithm for measuring profile similarity improves the efficiency of tasks such as clustering of customers and cross-validation of forecasting methods using historical data. Furthermore, we apply a generalisation of our algorithm to produce substantially better household-level energy use forecasts from historical smart meter data.
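To convey the underlying notion of similarity (without reproducing the authors' graph-based algorithm), the sketch below scores each reading against its best match within a ±w-slot window in the other profile; the window size and data are illustrative:

```python
import numpy as np

# Shift-tolerant distance between two daily load profiles: each reading in
# one profile may match a reading in the other within +/- w time slots.
def shift_tolerant_distance(a: np.ndarray, b: np.ndarray, w: int = 2) -> float:
    n = len(a)
    total = 0.0
    for i in range(n):
        lo, hi = max(0, i - w), min(n, i + w + 1)
        total += np.min(np.abs(a[i] - b[lo:hi]))   # best match in the window
    return total / n

rng = np.random.default_rng(0)
base = rng.random(48)                  # 48 half-hour slots in a day
shifted = np.roll(base, 1)             # same habits, half an hour later
print(shift_tolerant_distance(base, shifted))      # ~0: shift tolerated
print(shift_tolerant_distance(base, rng.random(48)))  # larger: unrelated profile
```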
Abstract:
The performance of real estate investment markets is difficult to monitor because the constituent assets are heterogeneous, are traded infrequently and do not trade through a central exchange in which prices can be observed. To address this, appraisal-based indices have been developed that use the records of owners for whom buildings are regularly re-valued. These indices provide a practical solution to the measurement problem, but have been criticised for understating volatility and not capturing market turning points in a timely manner. This paper evaluates alternative ‘Transaction Linked Indices’ that are estimated using an extension of the hedonic method for index construction, applied to Investment Property Databank data. The two types of indices are compared over Q4 2001 to Q4 2012 in order to examine whether movements in these indices are consistent. The Transaction Linked Indices show stronger growth and sharper declines than their appraisal-based counterparts over the course of the cycle in different European markets, and they are typically two to four times more volatile. However, they have some limitations; for instance, only country-level indicators can be published in many cases owing to low trading volumes in the period studied.
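A minimal sketch of hedonic index construction, the family of methods the Transaction Linked Indices extend: regress log price on asset characteristics plus time dummies and exponentiate the time coefficients. The data below are simulated and the single characteristic is a placeholder:

```python
import numpy as np

# Hedonic index from simulated sales: log price = f(characteristics) + time
# effects; the exponentiated time coefficients trace the index.
rng = np.random.default_rng(1)
n_sales, n_quarters = 500, 8
size = rng.uniform(500, 5000, n_sales)               # floor area (placeholder)
quarter = rng.integers(0, n_quarters, n_sales)       # quarter of sale
true_index = np.linspace(0.0, 0.20, n_quarters)      # 20% log-price growth
log_price = (5 + 0.9 * np.log(size) + true_index[quarter]
             + rng.normal(0, 0.1, n_sales))

X = np.column_stack([np.ones(n_sales), np.log(size),
                     *(quarter == q for q in range(1, n_quarters))])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
index = np.exp(np.concatenate([[0.0], beta[2:]]))    # base quarter = 1.0
print(np.round(index, 3))
```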
Abstract:
Whole-genome transcriptome profiling is revealing how biological systems are regulated at the transcriptional level. This study reports the development of a robust method to profile and compare the transcriptomes of two nonmodel plant species, Thlaspi caerulescens, a zinc (Zn) hyperaccumulator, and Thlaspi arvense, a nonhyperaccumulator, using Affymetrix Arabidopsis thaliana ATH1-121501 GeneChip® arrays (Affymetrix, Santa Clara, CA, USA). Transcript abundance was quantified in the shoots of agar- and compost-grown plants of both species. Analyses were optimized using a genomic DNA (gDNA)-based probe-selection strategy based on the hybridization efficiency of Thlaspi gDNA with corresponding A. thaliana probes. In silico alignments of GeneChip® probes with Thlaspi gene sequences, and quantitative real-time PCR, confirmed the validity of this approach. Approximately 5000 genes were differentially expressed in the shoots of T. caerulescens compared with T. arvense, including genes involved in Zn transport and compartmentalization. Future functional analyses of genes identified as differentially expressed in the shoots of these closely related species will improve our understanding of the molecular mechanisms of Zn hyperaccumulation.
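The probe-selection strategy can be sketched as a simple mask: keep only probes whose gDNA hybridisation signal clears a threshold, then summarise each probe set over the surviving probes. Array shapes, signals and the threshold below are illustrative assumptions:

```python
import numpy as np

# gDNA-based probe selection: probes that hybridise well with Thlaspi gDNA
# are assumed informative; the rest are masked out before summarisation.
rng = np.random.default_rng(0)
n_probesets, probes_per_set = 4, 11
gdna_signal = rng.gamma(2.0, 50.0, (n_probesets, probes_per_set))
rna_signal = rng.gamma(2.0, 100.0, (n_probesets, probes_per_set))

threshold = 80.0                       # minimum acceptable gDNA signal (assumed)
mask = gdna_signal > threshold         # probes retained for each probe set

counts = mask.sum(axis=1)
totals = (rna_signal * mask).sum(axis=1)
expression = np.where(counts > 0, totals / np.maximum(counts, 1), np.nan)
print(expression)                      # per-probe-set expression estimates
```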
Abstract:
In this paper a modified algorithm is suggested for developing polynomial neural network (PNN) models. Optimal partial description (PD) modeling is introduced at each layer of the PNN expansion, a task accomplished using the orthogonal least squares (OLS) method. Based on the initial PD models determined by the polynomial order and the number of PD inputs, OLS selects the most significant regressor terms, reducing the output error variance. The method produces PNN models exhibiting a high level of accuracy and superior generalization capabilities. Additionally, parsimonious models are obtained, comprising a considerably smaller number of parameters than those generated by the conventional PNN algorithm. Three benchmark examples are elaborated, including modeling of the gas furnace process as well as the iris and wine classification problems. Extensive simulation results and comparison with other methods in the literature demonstrate the effectiveness of the suggested modeling approach.
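The selection step can be illustrated with a greedy forward search that adds, at each pass, the candidate polynomial term giving the largest reduction in residual sum of squares. This is a plain-RSS stand-in for the OLS error-reduction-ratio procedure, not the exact orthogonalised algorithm, and the data are synthetic:

```python
import numpy as np

# Greedy forward selection of polynomial regressor terms by RSS reduction.
rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=(2, 200))
y = 1.5 * x1 - 2.0 * x1 * x2 + 0.5 * x2**2 + rng.normal(0, 0.1, 200)

candidates = {"x1": x1, "x2": x2, "x1^2": x1**2, "x2^2": x2**2, "x1*x2": x1 * x2}
selected, cols = [], [np.ones(200)]          # start from the constant term

for _ in range(3):                           # pick the 3 most significant terms
    best_name, best_rss = None, np.inf
    for name, col in candidates.items():
        if name in selected:
            continue
        X = np.column_stack(cols + [col])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        if rss < best_rss:
            best_name, best_rss = name, rss
    selected.append(best_name)
    cols.append(candidates[best_name])

print(selected)   # expected to recover x1, x1*x2 and x2^2
```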
Abstract:
Tests of the new Rossby wave theories that have been developed over the past decade to account for discrepancies between theoretical wave speeds and those observed by satellite altimeters have focused primarily on the surface signature of such waves. It appears, however, that the surface signature of the waves acts only as a rather weak constraint, and that information on the vertical structure of the waves is required to better discriminate between competing theories. Due to the lack of 3-D observations, this paper uses high-resolution model data to construct realistic vertical structures of Rossby waves and compares these to structures predicted by theory. The meridional velocity of a section at 24° S in the Atlantic Ocean is pre-processed using the Radon transform to select the dominant westward signal. Normalized profiles are then constructed using three complementary methods based respectively on: (1) averaging vertical profiles of velocity, (2) diagnosing the amplitude of the Radon transform of the westward propagating signal at different depths, and (3) EOF analysis. These profiles are compared to profiles calculated using four different Rossby wave theories: standard linear theory (SLT), SLT plus mean flow, SLT plus topographic effects, and theory including mean flow and topographic effects. Our results support the classical theoretical assumption that westward propagating signals have a well-defined vertical modal structure associated with a phase speed independent of depth, in contrast with the conclusions of a recent study using the same model but for different locations in the North Atlantic. The model structures are in general surface intensified, with a sign reversal at depth in some regions, notably occurring at shallower depths in the East Atlantic. SLT provides a good fit to the model structures in the top 300 m, but grossly overestimates the sign reversal at depth. The addition of mean flow slightly improves the latter issue, but is too surface intensified. SLT plus topography rectifies the overestimation of the sign reversal, but overestimates the amplitude of the structure for much of the layer above the sign reversal. Combining the effects of mean flow and topography provided the best fit for the mean model profiles, although small errors at the surface and mid-depths are carried over from the individual effects of mean flow and topography respectively. Across the section the best fitting theory varies between SLT plus topography and topography with mean flow, with, in general, SLT plus topography performing better in the east where the sign reversal is less pronounced. None of the theories could accurately reproduce the deeper sign reversals in the west. All theories performed badly at the boundaries. The generalization of this method to other latitudes, oceans, models and baroclinic modes would provide greater insight into the variability in the ocean, while better observational data would allow verification of the model findings.
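Method (3) above can be sketched compactly: stack the meridional-velocity profiles into a depth × longitude matrix, remove the mean, and take the leading singular vector as the dominant vertical structure. The data below are synthetic:

```python
import numpy as np

# EOF analysis of vertical velocity profiles via SVD (synthetic data).
rng = np.random.default_rng(0)
n_depth, n_lon = 40, 120
depth_structure = np.exp(-np.linspace(0, 3, n_depth))     # surface intensified
v = (np.outer(depth_structure, rng.normal(size=n_lon))    # coherent signal
     + 0.1 * rng.normal(size=(n_depth, n_lon)))           # plus noise

anomaly = v - v.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(anomaly, full_matrices=False)
eof1 = U[:, 0] / np.max(np.abs(U[:, 0]))                  # normalised profile
explained = s[0] ** 2 / np.sum(s ** 2)
print(f"leading EOF explains {explained:.0%} of the variance")
```

The normalised leading EOF plays the role of the model-derived vertical structure against which the theoretical modal profiles are fitted.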
Abstract:
Two different TAMSAT (Tropical Applications of Meteorological Satellites) methods of rainfall estimation were developed for northern and southern Africa, based on Meteosat images. These two methods were used to make rainfall estimates for the southern rainy season from October 1995 to April 1996. Estimates produced by both TAMSAT methods and estimates produced by the CPC (Climate Prediction Center) method were then compared with kriged data from over 800 raingauges in southern Africa. This comparison shows that operational TAMSAT estimates are better over plateau regions, with 59% of estimates within one standard error (s.e.) of the kriged rainfall. Over mountainous regions the CPC approach is generally better, although all methods underestimate and give only 40% of estimates within 1 s.e. The two TAMSAT methods show little difference across a whole season, but examined in detail the northern method gives unsatisfactory calibrations. The CPC method does achieve significant overall improvements by incorporating real-time raingauge data, but only where sufficient raingauges are available.
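The headline statistic, the percentage of estimates within one standard error of the kriged rainfall, is straightforward to compute; the sketch below uses simulated gauge and satellite values rather than the study's data:

```python
import numpy as np

# Fraction of satellite estimates within one kriging standard error of the
# gauge-interpolated rainfall. All arrays are simulated for illustration.
def percent_within_1se(estimate: np.ndarray,
                       kriged: np.ndarray,
                       kriging_se: np.ndarray) -> float:
    within = np.abs(estimate - kriged) <= kriging_se
    return 100.0 * within.mean()

rng = np.random.default_rng(0)
kriged = rng.gamma(2.0, 30.0, 800)           # kriged seasonal rainfall (mm)
se = 0.25 * kriged + 5.0                     # hypothetical kriging standard error
satellite = kriged * rng.normal(1.0, 0.3, 800)  # hypothetical estimates
print(f"{percent_within_1se(satellite, kriged, se):.0f}% within 1 s.e.")
```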