28 results for QUALITY-CONTROL GUIDELINES


Relevance: 80.00%

Publisher:

Abstract:

Dielectric properties of 16 process cheeses were determined over the frequency range 0.3-3 GHz. The effect of temperature on the dielectric properties of the process cheeses was investigated at intervals of 10 °C between 5 and 85 °C. Results showed that the dielectric constant (ε′) decreased gradually with increasing frequency for all cheeses, while the dielectric loss factor (ε″) decreased from above 125 to below 12. ε′ was highest at 5 °C and generally decreased up to a temperature between 55 and 75 °C. ε″ generally increased with increasing temperature for the high and medium moisture/fat ratio cheeses; for the low moisture/fat ratio cheese, ε″ decreased with temperature between 5 and 55 °C and then increased. Partial least squares regression models indicated that ε′ and ε″ could be used in a quality-control screening application to measure the moisture content and inorganic salt content of process cheese, respectively.
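
A minimal sketch of the kind of partial least squares screening model described above, built with scikit-learn on a hypothetical set of dielectric spectra and reference moisture values (placeholder data, not the study's actual measurements or calibration):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Hypothetical data: one row per cheese sample, columns are the dielectric
# constant measured at a set of frequencies between 0.3 and 3 GHz.
rng = np.random.default_rng(0)
n_samples, n_freqs = 16, 50
X = rng.normal(size=(n_samples, n_freqs))      # dielectric spectra (placeholder values)
y = rng.uniform(40, 55, size=n_samples)        # reference moisture content (%)

# PLS regression relating the dielectric spectra to moisture content.
pls = PLSRegression(n_components=3)
scores = cross_val_score(pls, X, y, cv=4, scoring="r2")
print("cross-validated R^2:", scores.mean())

# Fit on all samples and predict moisture for a new spectrum.
pls.fit(X, y)
new_spectrum = rng.normal(size=(1, n_freqs))
print("predicted moisture (%):", pls.predict(new_spectrum).ravel()[0])
```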

Relevance: 80.00%

Publisher:

Abstract:

In this paper we explore classification techniques for ill-posed problems. Two classes are linearly separable in some Hilbert space X if they can be separated by a hyperplane. We investigate stable separability, i.e. the case where there is a positive distance between two separating hyperplanes. When the data in the space Y are generated by a compact operator A applied to the system states in X, we show that in general stable separability is not obtained in Y even if the problem in X is stably separable. In particular, we show this for the case where a nonlinear classification is generated from a non-convergent family of linear classes in X. We apply our results to the problem of quality control of fuel cells, where fuel cells are classified according to their efficiency. A fuel cell can potentially be classified using either an externally measured magnetic field or some internal current; however, the current cannot be measured directly, since the fuel cell cannot be accessed while in operation. The first possibility is to apply discrimination techniques directly to the measured magnetic fields. The second approach first reconstructs the currents and then carries out the classification on the current distributions. We show that both approaches need regularization and that the regularized classifications are not equivalent in general. Finally, we investigate a widely used linear classification algorithm, Fisher's linear discriminant, with respect to its ill-posedness when applied to data generated via a compact integral operator. We show that the method cannot remain stable as the number of measurement points becomes large.
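
As an illustration of the regularization issue discussed, the sketch below contrasts an unregularized Fisher discriminant with a shrinkage-regularized one on synthetic states passed through a smoothing (compact-operator-like) map; the data, the smoothing kernel and the use of scikit-learn's shrinkage LDA are assumptions for illustration, not the paper's fuel-cell setting:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)

# Hypothetical states in X: two classes separated along the first coordinate.
n, d = 200, 60
labels = rng.integers(0, 2, size=n)
X_states = rng.normal(size=(n, d))
X_states[:, 0] += 3.0 * labels

# A smoothing map A: states -> indirect, noisy measurements in Y.
A = np.array([[np.exp(-0.5 * abs(i - j)) for j in range(d)] for i in range(d)])
Y_meas = X_states @ A.T + 0.01 * rng.normal(size=(n, d))

# Unregularized vs shrinkage-regularized Fisher discriminant on the measurements.
plain = LinearDiscriminantAnalysis(solver="lsqr").fit(Y_meas, labels)
shrunk = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto").fit(Y_meas, labels)
print("plain LDA training accuracy:    ", plain.score(Y_meas, labels))
print("shrinkage LDA training accuracy:", shrunk.score(Y_meas, labels))
```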

Relevance: 80.00%

Publisher:

Abstract:

Data assimilation algorithms are a crucial part of operational systems in numerical weather prediction, hydrology and climate science, but are also important for dynamical reconstruction in medical applications and quality control for manufacturing processes. Usually, a variety of diverse measurement data are employed to determine the state of the atmosphere or of a wider system including land and oceans. Modern data assimilation systems use more and more remote sensing data, in particular radiances measured by satellites, radar data and integrated water vapor measurements via GPS/GNSS signals. The inversion of some of these measurements is ill-posed in the classical sense, i.e. the inverse of the operator H which maps the state onto the data is unbounded. In this case, the use of such data can lead to significant instabilities in data assimilation algorithms. The goal of this work is to provide a rigorous mathematical analysis of the instability of well-known data assimilation methods. Here, we restrict our attention to particular linear systems in which the instability can be analysed explicitly. We investigate three-dimensional and four-dimensional variational assimilation. A theory for the instability is developed using the classical theory of ill-posed problems in a Banach space framework. Further, we demonstrate by numerical examples that instabilities can and will occur, including an example from dynamic magnetic tomography.
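
For orientation, a minimal sketch of the three-dimensional variational (3D-Var) analysis step for a small linear system; the background and observation error covariances and the operator H below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Toy 3D-Var: minimise J(x) = (x - xb)^T B^-1 (x - xb) + (y - Hx)^T R^-1 (y - Hx).
# For a linear observation operator H the minimiser has the closed form
#   xa = xb + K (y - H xb),   K = B H^T (H B H^T + R)^-1   (the gain matrix).
n_state, n_obs = 5, 3
xb = np.zeros(n_state)                          # background state
B = 0.5 * np.eye(n_state)                       # background error covariance (assumed)
R = 0.1 * np.eye(n_obs)                         # observation error covariance (assumed)
H = np.random.default_rng(2).normal(size=(n_obs, n_state))  # linear observation operator
y = np.array([1.0, -0.5, 0.3])                  # observations

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)    # gain matrix
xa = xb + K @ (y - H @ xb)                      # analysis state
print("analysis increment:", xa - xb)
```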

Relevance: 80.00%

Publisher:

Abstract:

The formulation and performance of the Met Office visibility analysis and prediction system are described. The visibility diagnostic within the limited-area Unified Model is a function of humidity and a prognostic aerosol content. The aerosol model includes advection, industrial and general urban sources, plus boundary-layer mixing and removal by rain. The assimilation is a three-dimensional variational scheme in which the visibility observation operator is a highly nonlinear function of humidity, aerosol and temperature. A quality control scheme for visibility data is included. Visibility observations can give rise to humidity increments of significant magnitude compared with the direct impact of humidity observations. We present the results of sensitivity studies which show the contribution of different components of the system to improved skill in visibility forecasts. Visibility assimilation is most important within the first 6-12 hours of the forecast and for visibilities below 1 km, while modelling of aerosol sources and advection is important for slightly higher visibilities (1-5 km) and remains significant at longer forecast times.
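
To illustrate why a visibility observation operator is strongly nonlinear in humidity and aerosol, the sketch below uses a Koschmieder-type relation in which extinction grows with aerosol content and with hygroscopic swelling near saturation; the coefficients and growth law are assumptions for illustration only, not the Unified Model diagnostic:

```python
import numpy as np

def visibility(aerosol_mass, rel_humidity, c=1.0e-4, contrast=0.02):
    """Illustrative visibility diagnostic (not the Met Office formulation).

    aerosol_mass : aerosol content (arbitrary mass units)
    rel_humidity : relative humidity, 0-1
    Extinction grows with aerosol and with hygroscopic swelling of the
    particles as humidity approaches saturation; visibility then follows
    the Koschmieder relation vis = -ln(contrast) / extinction.
    """
    rh = np.clip(rel_humidity, 0.0, 0.99)
    growth = (1.0 - rh) ** -0.6                  # assumed hygroscopic growth exponent
    extinction = c * aerosol_mass * growth       # per metre
    return -np.log(contrast) / extinction        # metres

# Visibility drops sharply as humidity approaches saturation for fixed aerosol.
for rh in (0.5, 0.9, 0.98):
    vis = visibility(aerosol_mass=10.0, rel_humidity=rh)
    print(f"RH = {rh:.2f}: visibility ≈ {vis:.0f} m")
```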

Relevance: 80.00%

Publisher:

Abstract:

A recently developed capillary electrophoresis (CE)-negative-ionisation mass spectrometry (MS) method was used to profile anionic metabolites in a microbial-host co-metabolism study. Urine samples from rats receiving antibiotics (penicillin G and streptomycin sulfate) for 0, 4, or 8 days were analysed. A quality control sample was measured repeatedly to monitor the performance of the applied CE-MS method. After peak alignment, relative standard deviations (RSDs) for migration time of five representative compounds were below 0.4%, whereas RSDs for peak area were 7.9–13.5%. Using univariate and principal component analysis of the obtained urinary metabolic profiles, groups of rats receiving different antibiotic treatments could be distinguished on the basis of 17 discriminatory compounds, of which 15 were downregulated and 2 were upregulated upon treatment. Eleven compounds remained down- or upregulated after discontinuation of antibiotic administration, whereas a recovery effect was observed for the others. Based on accurate mass, nine compounds were putatively identified; these included the microbial-mammalian co-metabolites hippuric acid and indoxyl sulfate. Some discriminatory compounds were also observed by other analytical techniques, but CE-MS uniquely revealed ten metabolites modulated by antibiotic exposure, including aconitic acid and an oxocholic acid. This clearly demonstrates the added value of CE-MS for nontargeted profiling of small anionic metabolites in biological samples.
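
A small sketch of the two quality metrics used here, computed on made-up peak tables rather than the study's data: relative standard deviations of repeated quality control injections, and a principal component analysis of the aligned profiles:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)

# Hypothetical QC injections: rows = repeated QC runs, columns = aligned peaks.
qc_areas = rng.normal(loc=1000, scale=80, size=(8, 5))
rsd_percent = 100 * qc_areas.std(axis=0, ddof=1) / qc_areas.mean(axis=0)
print("peak-area RSDs (%):", np.round(rsd_percent, 1))

# Hypothetical urinary profiles: rows = urine samples, columns = metabolites.
profiles = rng.normal(size=(24, 40))
profiles[:8, :5] += 2.0        # pretend the first 8 samples are antibiotic-treated

scores = PCA(n_components=2).fit_transform(profiles)
print("PC1/PC2 scores of the first three samples:\n", np.round(scores[:3], 2))
```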

Relevance: 80.00%

Publisher:

Abstract:

Anthropogenic emissions of heat and exhaust gases play an important role in the atmospheric boundary layer, altering air quality, greenhouse gas concentrations and the transport of heat and moisture at various scales. This is particularly evident in urban areas, where emission sources are integrated in the highly heterogeneous urban canopy layer and directly linked to human activities, which exhibit significant temporal variability. It is common practice to use eddy covariance observations to estimate turbulent surface fluxes of latent heat, sensible heat and carbon dioxide, which can be attributed to a local-scale source area. This study provides a method to assess the influence of micro-scale anthropogenic emissions on heat, moisture and carbon dioxide exchange in a highly urbanised environment for two sites in central London, UK. A new algorithm for the Identification of Micro-scale Anthropogenic Sources (IMAS) is presented, with two aims. Firstly, IMAS filters out the influence of micro-scale emissions and allows for the analysis of turbulent fluxes representative of the local-scale source area. Secondly, it is used to give a first-order estimate of the anthropogenic heat flux and carbon dioxide flux representative of the building scale. The algorithm is evaluated using directional and temporal analysis, and is then applied at a second site that was not used in its development. The spatial and temporal local-scale patterns, as well as the micro-scale fluxes, appear physically reasonable and can be incorporated in the analysis of long-term eddy covariance measurements at the sites in central London. In addition to the new IMAS technique, further steps in the quality control and quality assurance used for the flux processing are presented. The methods and results have implications for urban flux measurements in dense urbanised settings with significant sources of heat and greenhouse gases.
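
For context on the underlying measurement, a minimal sketch of how a turbulent flux is obtained from high-frequency eddy covariance data; the series below are synthetic, and the IMAS filtering itself is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic 30-minute block of 10 Hz data: vertical wind w (m s-1) and a scalar c.
n = 30 * 60 * 10
w = rng.normal(0.0, 0.3, n)
c = 15.0 + 0.5 * w + rng.normal(0.0, 0.2, n)     # correlated with w -> upward flux

# Eddy covariance flux: mean covariance of the fluctuations about the block mean.
w_prime = w - w.mean()
c_prime = c - c.mean()
flux = np.mean(w_prime * c_prime)
print("block-averaged turbulent flux:", round(flux, 4))
```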

Relevance: 80.00%

Publisher:

Abstract:

We describe the approach to be adopted for a major new initiative to derive a homogeneous record of sea surface temperature for 1991–2007 from the observations of the series of three along-track scanning radiometers (ATSRs). This initiative is called (A)RC: (Advanced) ATSR Re-analysis for Climate. The main objectives are to reduce regional biases in retrieved sea surface temperature (SST) to less than 0.1 K for all global oceans, while creating a very homogeneous record that is stable in time to within 0.05 K decade−1, with maximum independence of the record from existing analyses of SST used in climate change research. If these stringent targets are achieved, this record will enable significantly improved estimates of surface temperature trends and variability, of sufficient quality to advance questions of climate change attribution, climate sensitivity and historical reconstruction of surface temperature changes. The approach includes the development of new, consistent estimators of SST for each of the ATSRs, and detailed analysis of the overlap periods. Novel aspects of the approach include the generation of multiple versions of the record using alternative channel sets and cloud detection techniques, to assess for the first time the effect of such choices. There will be extensive effort in quality control, validation and analysis of the impact on climate SST data sets. Evidence for the plausibility of the 0.1 K target for systematic error is reviewed, as is the need for alternative cloud screening methods in this context.
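
The 0.05 K decade−1 stability target can be checked against an independent record with a simple trend fit on the difference series; the sketch below uses a hypothetical monthly time series of SST differences, not (A)RC data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical monthly differences (K) between a satellite SST record and an
# independent reference over 1991-2007 (17 years).
months = np.arange(17 * 12)
diff = 0.002 * (months / 120.0) + rng.normal(0.0, 0.05, months.size)

# Linear trend of the difference series, converted to K per decade.
slope_per_month, _ = np.polyfit(months, diff, 1)
trend_per_decade = slope_per_month * 120
verdict = "within" if abs(trend_per_decade) < 0.05 else "outside"
print(f"relative trend: {trend_per_decade:+.3f} K/decade ({verdict} the 0.05 K/decade target)")
```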

Relevance: 80.00%

Publisher:

Abstract:

BACKGROUND: Single nucleotide polymorphisms (SNPs) in genes encoding the components involved in the hypothalamic pathway may influence weight gain, and dietary factors may modify their effects. AIM: We conducted a case-cohort study to investigate the associations of SNPs in candidate genes with weight change during an average of 6.8 years of follow-up, and to examine potential effect modification by glycemic index (GI) and protein intake. METHODS AND FINDINGS: Participants, aged 20-60 years at baseline, came from five European countries. Cases ('weight gainers') were selected from the total eligible cohort (n = 50,293) as those with the greatest unexplained annual weight gain (n = 5,584). A random subcohort (n = 6,566) was drawn with the intention of obtaining an equal number of cases and noncases (n = 5,507). We genotyped 134 SNPs that captured all common genetic variation across the 15 candidate genes; 123 met the quality control criteria. Each SNP was tested for association with the risk of being a 'weight gainer' (logistic regression models) in the case-noncase data and with weight gain (linear regression models) in the random subcohort data. After accounting for multiple testing, none of the SNPs was significantly associated with weight change. Furthermore, we observed no significant effect modification by dietary factors, except for SNP rs7180849 in the neuromedin β gene (NMB): carriers of the minor allele had a more pronounced weight gain at a higher GI (P = 2 x 10⁻⁷). CONCLUSIONS: We found no evidence of an association between SNPs in the studied hypothalamic genes and weight change. The interaction between GI and NMB SNP rs7180849 needs further confirmation.
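
A minimal sketch of the per-SNP association test described (logistic regression of case status on genotype, followed by a Bonferroni-style multiple-testing adjustment); the genotypes and outcomes below are simulated, not the cohort data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)

# Simulated data: 123 SNPs coded as minor-allele counts (0/1/2) for 2,000 subjects,
# and a binary 'weight gainer' outcome.
n_subjects, n_snps = 2000, 123
genotypes = rng.integers(0, 3, size=(n_subjects, n_snps))
case = rng.integers(0, 2, size=n_subjects)

p_values = []
for j in range(n_snps):
    X = sm.add_constant(genotypes[:, j].astype(float))
    fit = sm.Logit(case, X).fit(disp=0)
    p_values.append(fit.pvalues[1])              # p-value for the genotype term

# Bonferroni correction for the number of SNPs tested.
alpha = 0.05 / n_snps
significant = [j for j, p in enumerate(p_values) if p < alpha]
print("SNPs significant after correction:", significant)
```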

Relevance: 80.00%

Publisher:

Abstract:

Global syntheses of palaeoenvironmental data are required to test climate models under conditions different from the present. Data sets for this purpose contain data from spatially extensive networks of sites. The data are either directly comparable to model output or readily interpretable in terms of modelled climate variables. Data sets must contain sufficient documentation to distinguish between raw (primary) and interpreted (secondary, tertiary) data, to evaluate the assumptions involved in interpretation of the data, to exercise quality control, and to select data appropriate for specific goals. Four data bases for the Late Quaternary, documenting changes in lake levels since 30 kyr BP (the Global Lake Status Data Base), vegetation distribution at 18 kyr and 6 kyr BP (BIOME 6000), aeolian accumulation rates during the last glacial-interglacial cycle (DIRTMAP), and tropical terrestrial climates at the Last Glacial Maximum (the LGM Tropical Terrestrial Data Synthesis) are summarised. Each has been used to evaluate simulations of Last Glacial Maximum (LGM: 21 calendar kyr BP) and/or mid-Holocene (6 cal. kyr BP) environments. Comparisons have demonstrated that changes in radiative forcing and orography due to orbital and ice-sheet variations explain the first-order, broad-scale (in space and time) features of global climate change since the LGM. However, atmospheric models forced by 6 cal. kyr BP orbital changes with unchanged surface conditions fail to capture quantitative aspects of the observed climate, including the greatly increased magnitude and northward shift of the African monsoon during the early to mid-Holocene. Similarly, comparisons with palaeoenvironmental datasets show that atmospheric models have underestimated the magnitude of cooling and drying of much of the land surface at the LGM. The inclusion of feedbacks due to changes in ocean- and land-surface conditions at both times, and atmospheric dust loading at the LGM, appears to be required in order to produce a better simulation of these past climates. The development of Earth system models incorporating the dynamic interactions among ocean, atmosphere, and vegetation is therefore mandated by Quaternary science results as well as climatological principles. For greatest scientific benefit, this development must be paralleled by continued advances in palaeodata analysis and synthesis, which in turn will help to define questions that call for new focused data collection efforts.

Relevance: 80.00%

Publisher:

Abstract:

Simulated intestinal fluids (SIFs) used to assay the solubility of orally administered drugs are typically based on a single bile salt: sodium taurocholate (STC). The aim of this study was to develop mimetic intestinal fluids with a closer similarity to physiological fluids than those reported to date, using a mixed bile salt (MBS) system (STC, sodium glycodeoxycholate, sodium deoxycholate; 60:39:1) with different concentrations of lecithin, the preponderant intestinal phospholipid. Hydrocortisone and progesterone were used as model drugs to systematically evaluate the influence of SIF composition on solubility. Increasing the total bile salt concentration from 0 to 30 mM increased hydrocortisone and progesterone solubility by 2- and ∼25-fold, respectively. Accordingly, higher solubilities were measured in the fed-state than in the fasted-state SIFs. Progesterone showed the greater increase in solubility in the STC and MBS systems (2-7-fold) as lecithin concentration was increased, compared with hydrocortisone (no significant change; P>0.05). Overall, the MBS systems gave solubility profiles similar to those of STC. In conclusion, the addition of MBS and lecithin was found to be secondary to the influence of total bile salt concentration. These data provide a foundation for the design of more bio-similar media for pivotal decision-guiding assays in drug development and quality control settings.
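
Fold-change comparisons like those above are often complemented by fitting solubility against total bile salt concentration; the sketch below fits an illustrative linear solubilization model to made-up values, not the measured data from this study:

```python
import numpy as np

# Made-up solubility measurements (mg/mL) of a model drug at increasing
# total bile salt concentrations (mM) in a simulated intestinal fluid.
bile_salt_mM = np.array([0.0, 3.0, 5.0, 15.0, 30.0])
solubility = np.array([0.010, 0.035, 0.055, 0.140, 0.260])

# Linear solubilization model: S = S0 + k * C (intrinsic solubility plus a
# term proportional to the micellar bile salt concentration).
k, S0 = np.polyfit(bile_salt_mM, solubility, 1)
print(f"intrinsic solubility S0 ≈ {S0:.3f} mg/mL, slope k ≈ {k:.4f} mg/mL per mM")
print(f"fold increase at 30 mM: {(S0 + k * 30) / S0:.1f}x")
```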

Relevance: 80.00%

Publisher:

Abstract:

Skillful and timely streamflow forecasts are critically important to water managers and emergency protection services. To provide these forecasts, hydrologists must predict the behavior of complex coupled human–natural systems using incomplete and uncertain information and imperfect models. Moreover, operational predictions often integrate anecdotal information and unmodeled factors. Forecasting agencies face four key challenges: 1) making the most of available data, 2) making accurate predictions using models, 3) turning hydrometeorological forecasts into effective warnings, and 4) administering an operational service. Each challenge presents a variety of research opportunities, including the development of automated quality-control algorithms for the myriad of data used in operational streamflow forecasts, data assimilation, and ensemble forecasting techniques that allow for forecaster input, methods for using human-generated weather forecasts quantitatively, and quantification of human interference in the hydrologic cycle. Furthermore, much can be done to improve the communication of probabilistic forecasts and to design a forecasting paradigm that effectively combines increasingly sophisticated forecasting technology with subjective forecaster expertise. These areas are described in detail to share a real-world perspective and focus for ongoing research endeavors.
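
As one concrete example of the first challenge, a minimal sketch of automated quality-control checks (range and spike tests) that could be applied to an incoming streamflow series; the thresholds and data are placeholders, not an operational agency's procedure:

```python
import numpy as np

def qc_flags(flow, max_flow=5000.0, spike_factor=4.0):
    """Flag suspect values in a streamflow series (m3 s-1).

    Placeholder checks: values outside a plausible physical range, and
    sudden spikes relative to the two neighbouring observations.
    """
    flow = np.asarray(flow, dtype=float)
    flags = np.zeros(flow.size, dtype=bool)
    flags |= (flow < 0) | (flow > max_flow)                  # range check
    neighbours = 0.5 * (np.roll(flow, 1) + np.roll(flow, -1))
    spike = np.abs(flow - neighbours) > spike_factor * np.abs(neighbours)
    flags[1:-1] |= spike[1:-1]                               # spike check (interior points)
    return flags

series = [12.0, 13.1, 12.8, 250.0, 13.0, 12.7, -1.0, 12.9]
print(qc_flags(series))
```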

Relevance: 80.00%

Publisher:

Abstract:

This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. Firstly, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem involving spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for the classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required in a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for the establishment of complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets.
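
A minimal sketch of the extreme learning machine idea on which the classifier is based: a random hidden layer followed by a least-squares fit of the output weights. For simplicity this version is real-valued and treats concatenated amplitude and phase values as features, rather than the complex-valued formulation used in the paper; the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "spectra": amplitude and phase features concatenated per sample.
n_train, n_feat, n_hidden = 300, 64, 200
X = rng.normal(size=(n_train, n_feat))
y = (X[:, :4].sum(axis=1) > 0).astype(float)     # synthetic binary labels

# Extreme learning machine: random input weights, analytic output weights.
W = rng.normal(size=(n_feat, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                           # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)     # least-squares output weights

predictions = (np.tanh(X @ W + b) @ beta > 0.5).astype(float)
print("training accuracy:", (predictions == y).mean())
```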

Relevance: 80.00%

Publisher:

Abstract:

Existing urban meteorological networks have an important role to play as test beds for inexpensive and more sustainable measurement techniques that are now becoming possible in our increasingly smart cities. The Birmingham Urban Climate Laboratory (BUCL) is a near-real-time, high-resolution urban meteorological network (UMN) of automatic weather stations and inexpensive, nonstandard air temperature sensors. The network has recently been implemented with an initial focus on monitoring urban heat, infrastructure, and health applications. A number of UMNs exist worldwide; however, BUCL is novel in its density, the low-cost nature of the sensors, and the use of proprietary Wi-Fi networks. This paper provides an overview of the logistical aspects of implementing a UMN test bed at such a density, including selecting appropriate urban sites; testing and calibrating low-cost, nonstandard equipment; implementing strict quality-assurance/quality-control mechanisms (including metadata); and utilizing preexisting Wi-Fi networks to transmit data. Visualizations of data collected by the network are also included, among them data from the July 2013 U.K. heatwave, and potential applications are highlighted. The paper is an open invitation to use the facility as a test bed for evaluating models and/or other nonstandard observation techniques, such as those generated via crowdsourcing.
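
One of the quality-control mechanisms mentioned can be illustrated with a simple spatial consistency check: each low-cost sensor is compared against the network median and large departures are flagged. The threshold and station readings below are assumptions for illustration, not BUCL's actual procedure:

```python
import numpy as np

def spatial_consistency_flags(temps, max_departure=5.0):
    """Flag sensors whose reading departs strongly from the network median.

    temps : air temperatures (deg C) reported by the network at one time step.
    max_departure : allowed departure (deg C) from the median of all sensors.
    """
    temps = np.asarray(temps, dtype=float)
    departure = np.abs(temps - np.median(temps))
    return departure > max_departure

# One time step from a hypothetical 8-station network; station 6 looks suspect.
readings = [21.3, 22.0, 21.7, 22.4, 21.9, 35.2, 21.5, 21.8]
print(spatial_consistency_flags(readings))
```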