946 results for weld quality control
Abstract:
A recently developed capillary electrophoresis (CE)-negative-ionisation mass spectrometry (MS) method was used to profile anionic metabolites in a microbial-host co-metabolism study. Urine samples from rats receiving antibiotics (penicillin G and streptomycin sulfate) for 0, 4, or 8 days were analysed. A quality control sample was measured repeatedly to monitor the performance of the applied CE-MS method. After peak alignment, relative standard deviations (RSDs) for the migration time of five representative compounds were below 0.4%, whereas RSDs for peak area were 7.9–13.5%. Using univariate and principal component analysis of the obtained urinary metabolic profiles, groups of rats receiving different antibiotic treatments could be distinguished based on 17 discriminatory compounds, of which 15 were downregulated and 2 were upregulated upon treatment. Eleven compounds remained down- or upregulated after discontinuation of antibiotic administration, whereas a recovery effect was observed for the others. Based on accurate mass, nine compounds were putatively identified; these included the microbial-mammalian co-metabolites hippuric acid and indoxyl sulfate. Some discriminatory compounds were also observed by other analytical techniques, but CE-MS uniquely revealed ten metabolites modulated by antibiotic exposure, including aconitic acid and an oxocholic acid. This clearly demonstrates the added value of CE-MS for nontargeted profiling of small anionic metabolites in biological samples.
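As a rough illustration of the precision metrics quoted above, the sketch below computes migration-time and peak-area RSDs from repeated quality-control injections. This is not the authors' code; the array values, shapes, and compound count are hypothetical.

```python
# Illustrative sketch: RSDs from repeated QC injections.
import numpy as np

# rows = repeated QC runs, columns = representative compounds (made-up values)
migration_times = np.array([
    [12.01, 14.52, 16.30, 18.11, 21.04],
    [12.03, 14.55, 16.28, 18.14, 21.08],
    [12.02, 14.50, 16.31, 18.10, 21.05],
])
peak_areas = np.array([
    [1.05e5, 2.31e5, 8.90e4, 4.12e5, 1.77e5],
    [1.18e5, 2.10e5, 9.60e4, 3.80e5, 1.95e5],
    [0.98e5, 2.45e5, 8.20e4, 4.40e5, 1.70e5],
])

def rsd_percent(x, axis=0):
    """RSD (%) = 100 * sample standard deviation / mean, per compound."""
    return 100.0 * x.std(axis=axis, ddof=1) / x.mean(axis=axis)

print("migration-time RSDs (%):", np.round(rsd_percent(migration_times), 2))
print("peak-area RSDs (%):     ", np.round(rsd_percent(peak_areas), 1))
```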
Abstract:
Anthropogenic emissions of heat and exhaust gases play an important role in the atmospheric boundary layer, altering air quality, greenhouse gas concentrations and the transport of heat and moisture at various scales. This is particularly evident in urban areas, where emission sources are integrated into the highly heterogeneous urban canopy layer and directly linked to human activities, which exhibit significant temporal variability. It is common practice to use eddy covariance observations to estimate turbulent surface fluxes of latent heat, sensible heat and carbon dioxide, which can be attributed to a local-scale source area. This study provides a method to assess the influence of micro-scale anthropogenic emissions on heat, moisture and carbon dioxide exchange in a highly urbanized environment for two sites in central London, UK. A new algorithm for the Identification of Micro-scale Anthropogenic Sources (IMAS) is presented, with two aims. Firstly, IMAS filters out the influence of micro-scale emissions and allows for the analysis of the turbulent fluxes representative of the local-scale source area. Secondly, it is used to give a first-order estimate of the anthropogenic heat flux and carbon dioxide flux representative of the building scale. The algorithm is evaluated using directional and temporal analysis and is then applied at a second site that was not used in its development. The spatial and temporal local-scale patterns, as well as the micro-scale fluxes, appear physically reasonable and can be incorporated in the analysis of long-term eddy covariance measurements at the sites in central London. In addition to the new IMAS technique, further steps in quality control and quality assurance used for the flux processing are presented. The methods and results have implications for urban flux measurements in dense urbanised settings with significant sources of heat and greenhouse gases.
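The IMAS algorithm itself is not reproduced in the abstract; the sketch below only illustrates the general idea of filtering micro-scale source influence from eddy covariance fluxes by wind sector. The sector bounds, threshold, and data are invented for illustration.

```python
# Minimal sketch of wind-sector-based flux filtering (the real IMAS
# algorithm is more elaborate); all numbers are hypothetical.
import numpy as np

wind_dir = np.array([10., 95., 100., 210., 98., 300.])  # degrees
fc = np.array([8., 55., 60., 9., 58., 7.])               # CO2 flux, umol m-2 s-1

# Hypothetical micro-scale source (e.g., a building exhaust vent) seen in
# the 90-110 degree wind sector.
in_sector = (wind_dir >= 90.) & (wind_dir <= 110.)
background = np.median(fc[~in_sector])           # local-scale estimate
anomalous = in_sector & (fc > 3. * background)   # micro-scale influenced

local_scale_flux = np.where(anomalous, np.nan, fc)        # filtered series
micro_scale_excess = np.where(anomalous, fc - background, 0.0)
print(local_scale_flux, micro_scale_excess)
```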
Abstract:
We describe the approach to be adopted for a major new initiative to derive a homogeneous record of sea surface temperature for 1991–2007 from the observations of the series of three along-track scanning radiometers (ATSRs). This initiative is called (A)RC: (Advanced) ATSR Re-analysis for Climate. The main objectives are to reduce regional biases in retrieved sea surface temperature (SST) to less than 0.1 K for all global oceans, while creating a very homogeneous record that is stable in time to within 0.05 K decade⁻¹, with maximum independence of the record from existing analyses of SST used in climate change research. If these stringent targets are achieved, this record will enable significantly improved estimates of surface temperature trends and variability of sufficient quality to address questions of climate change attribution, climate sensitivity and historical reconstruction of surface temperature changes. The approach includes development of new, consistent estimators for SST for each of the ATSRs, and detailed analysis of overlap periods. Novel aspects of the approach include generation of multiple versions of the record using alternative channel sets and cloud detection techniques, to assess for the first time the effect of such choices. There will be extensive effort in quality control, validation and analysis of the impact on climate SST data sets. Evidence for the plausibility of the 0.1 K target for systematic error is reviewed, as is the need for alternative cloud screening methods in this context.
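A minimal sketch of how the two stated targets could be checked against a reference record, assuming monthly satellite-minus-in-situ SST differences are available; the data below are synthetic, not (A)RC results.

```python
# Checking the bias (< 0.1 K) and stability (< 0.05 K/decade) targets on a
# synthetic difference series.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1991, 2008) + 0.5
# Hypothetical annual-mean satellite-minus-buoy SST differences (K)
diff = 0.02 + 0.001 * (years - years.mean()) + rng.normal(0, 0.03, years.size)

slope, intercept = np.polyfit(years, diff, 1)   # least-squares trend
print(f"mean bias: {diff.mean():+.3f} K (target |bias| < 0.1 K)")
print(f"stability: {10 * slope:+.3f} K/decade (target |trend| < 0.05 K/decade)")
```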
Abstract:
With the growing number and significance of urban meteorological networks (UMNs) across the world, it is becoming critical to establish a standard metadata protocol. Indeed, a review of existing UMNs indicates large variations in the quality, quantity, and availability of metadata containing technical information (e.g., equipment, communication methods) and network practices (e.g., quality assurance/quality control and data management procedures). Without such metadata, the utility of UMNs is greatly compromised. There is a need to bring together the currently disparate sets of guidelines to ensure informed and well-documented future deployments. This should significantly improve the quality, and therefore the applicability, of the high-resolution data available from such networks. Here, the first metadata protocol for UMNs is proposed, drawing on current recommendations for urban climate stations and identified best practice in existing networks.
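As a hedged illustration of what a machine-readable UMN metadata record might look like, the sketch below defines one possible station record; the field names and values are ours, not taken from the proposed protocol.

```python
# Hypothetical station metadata record; fields are illustrative only.
from dataclasses import dataclass, field

@dataclass
class StationMetadata:
    station_id: str
    latitude: float
    longitude: float
    height_agl_m: float          # sensor height above ground level
    instrument_model: str
    communication: str           # e.g. "Wi-Fi", "GPRS"
    qa_qc_procedure: str         # reference to the QA/QC document
    calibration_dates: list = field(default_factory=list)

site = StationMetadata("BHX-042", 52.48, -1.89, 3.0,
                       "Vaisala WXT536", "Wi-Fi",
                       "network QC v1.2", ["2013-05-01"])
print(site)
```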
Abstract:
BACKGROUND: Single nucleotide polymorphisms (SNPs) in genes encoding the components involved in the hypothalamic pathway may influence weight gain, and dietary factors may modify their effects. AIM: We conducted a case-cohort study to investigate the associations of SNPs in candidate genes with weight change during an average of 6.8 years of follow-up and to examine the potential effect modification by glycemic index (GI) and protein intake. METHODS AND FINDINGS: Participants, aged 20-60 years at baseline, came from five European countries. Cases ('weight gainers') were selected from the total eligible cohort (n = 50,293) as those with the greatest unexplained annual weight gain (n = 5,584). A random subcohort (n = 6,566) was drawn with the intention of obtaining an equal number of cases and noncases (n = 5,507). We genotyped 134 SNPs that captured all common genetic variation across the 15 candidate genes; 123 met the quality control criteria. Each SNP was tested for association with the risk of being a 'weight gainer' (logistic regression models) in the case-noncase data and with weight gain (linear regression models) in the random subcohort data. After accounting for multiple testing, none of the SNPs was significantly associated with weight change. Furthermore, we observed no significant effect modification by dietary factors, except for SNP rs7180849 in the neuromedin β gene (NMB). Carriers of the minor allele had a more pronounced weight gain at a higher GI (P = 2 × 10⁻⁷). CONCLUSIONS: We found no evidence of an association between SNPs in the studied hypothalamic genes and weight change. The interaction between GI and NMB SNP rs7180849 needs further confirmation.
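A sketch of the per-SNP association scan described above: logistic regression of case status on genotype dosage, with a Bonferroni correction for the 123 SNPs tested. This is illustrative only; the genotypes and phenotypes are simulated, the authors' models may have included covariates not shown here, and the statsmodels package is assumed.

```python
# Per-SNP logistic regression scan with Bonferroni correction (toy data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, n_snps = 500, 123
genotypes = rng.integers(0, 3, size=(n, n_snps)).astype(float)  # minor-allele counts
case = rng.integers(0, 2, size=n)                               # 1 = 'weight gainer'

pvals = []
for j in range(n_snps):
    X = sm.add_constant(genotypes[:, j])
    pvals.append(sm.Logit(case, X).fit(disp=0).pvalues[1])

alpha = 0.05 / n_snps  # Bonferroni threshold
hits = [j for j, p in enumerate(pvals) if p < alpha]
print(f"significant SNPs after correction: {hits}")
```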
Abstract:
Global syntheses of palaeoenvironmental data are required to test climate models under conditions different from the present. Data sets for this purpose contain data from spatially extensive networks of sites. The data are either directly comparable to model output or readily interpretable in terms of modelled climate variables. Data sets must contain sufficient documentation to distinguish between raw (primary) and interpreted (secondary, tertiary) data, to evaluate the assumptions involved in interpretation of the data, to exercise quality control, and to select data appropriate for specific goals. Four data bases for the Late Quaternary, documenting changes in lake levels since 30 kyr BP (the Global Lake Status Data Base), vegetation distribution at 18 kyr and 6 kyr BP (BIOME 6000), aeolian accumulation rates during the last glacial-interglacial cycle (DIRTMAP), and tropical terrestrial climates at the Last Glacial Maximum (the LGM Tropical Terrestrial Data Synthesis) are summarised. Each has been used to evaluate simulations of Last Glacial Maximum (LGM: 21 calendar kyr BP) and/or mid-Holocene (6 cal. kyr BP) environments. Comparisons have demonstrated that changes in radiative forcing and orography due to orbital and ice-sheet variations explain the first-order, broad-scale (in space and time) features of global climate change since the LGM. However, atmospheric models forced by 6 cal. kyr BP orbital changes with unchanged surface conditions fail to capture quantitative aspects of the observed climate, including the greatly increased magnitude and northward shift of the African monsoon during the early to mid-Holocene. Similarly, comparisons with palaeoenvironmental datasets show that atmospheric models have underestimated the magnitude of cooling and drying of much of the land surface at the LGM. The inclusion of feedbacks due to changes in ocean- and land-surface conditions at both times, and atmospheric dust loading at the LGM, appears to be required in order to produce a better simulation of these past climates. The development of Earth system models incorporating the dynamic interactions among ocean, atmosphere, and vegetation is therefore mandated by Quaternary science results as well as climatological principles. For greatest scientific benefit, this development must be paralleled by continued advances in palaeodata analysis and synthesis, which in turn will help to define questions that call for new focused data collection efforts.
Abstract:
Simulated intestinal fluids (SIFs) used to assay the solubility of orally administered drugs are typically based on a single bile salt, sodium taurocholate (STC). The aim of this study was to develop mimetic intestinal fluids with a closer similarity to physiological fluids than those reported to date by developing a mixed bile salt (MBS) system (STC, sodium glycodeoxycholate and sodium deoxycholate; 60:39:1) with different concentrations of lecithin, the preponderant intestinal phospholipid. Hydrocortisone and progesterone were used as model drugs to evaluate systematically the influence of SIF composition on solubility. Increasing the total bile salt concentration from 0 to 30 mM increased hydrocortisone and progesterone solubility by 2- and ∼25-fold, respectively. Accordingly, higher solubilities were measured in the fed-state than in the fasted-state SIFs. Progesterone showed the greatest increases in solubility in the STC and MBS systems (2- to 7-fold) compared with hydrocortisone (no significant change; P>0.05) as the lecithin concentration was increased. Overall, the MBS systems gave solubility profiles similar to those of STC. In conclusion, the addition of MBS and lecithin was found to be secondary to the influence of total bile salt concentration. These data provide a foundation for the design of more biorelevant media for pivotal decision-guiding assays in drug development and quality control settings.
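For concreteness, a small calculator for the component concentrations implied by the 60:39:1 MBS ratio at a given total bile salt concentration; this is a sketch under our reading of the ratio, not the authors' preparation protocol.

```python
# Component concentrations of the mixed bile salt (MBS) system at a given
# total bile salt concentration (illustrative).
ratio = {"STC": 60, "Na glycodeoxycholate": 39, "Na deoxycholate": 1}
total_mM = 30.0  # fed-state-like total bile salt concentration

parts = sum(ratio.values())
composition = {salt: total_mM * share / parts for salt, share in ratio.items()}
for salt, conc in composition.items():
    print(f"{salt}: {conc:.1f} mM")
```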
Abstract:
Skillful and timely streamflow forecasts are critically important to water managers and emergency protection services. To provide these forecasts, hydrologists must predict the behavior of complex coupled human–natural systems using incomplete and uncertain information and imperfect models. Moreover, operational predictions often integrate anecdotal information and unmodeled factors. Forecasting agencies face four key challenges: 1) making the most of available data, 2) making accurate predictions using models, 3) turning hydrometeorological forecasts into effective warnings, and 4) administering an operational service. Each challenge presents a variety of research opportunities, including the development of automated quality-control algorithms for the myriad data used in operational streamflow forecasts; data assimilation and ensemble forecasting techniques that allow for forecaster input; methods for using human-generated weather forecasts quantitatively; and quantification of human interference in the hydrologic cycle. Furthermore, much can be done to improve the communication of probabilistic forecasts and to design a forecasting paradigm that effectively combines increasingly sophisticated forecasting technology with subjective forecaster expertise. These areas are described in detail to share a real-world perspective and focus for ongoing research endeavors.
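As one hedged example of the automated quality-control algorithms mentioned above, the sketch below applies simple range and rate-of-change checks to a synthetic discharge series; operational systems use far more sophisticated tests, and the thresholds here are invented.

```python
# Naive range and spike checks on a discharge series (illustrative only;
# a real check would also handle the return-to-normal step after a spike).
import numpy as np

q = np.array([12.1, 12.3, 12.2, 250.0, 12.4, 12.5, -1.0, 12.6])  # m3/s

range_ok = (q >= 0.0) & (q <= 100.0)     # physical-plausibility check
jump = np.abs(np.diff(q, prepend=q[0]))
spike_ok = jump < 20.0                    # rate-of-change check

flags = np.where(range_ok & spike_ok, "pass", "suspect")
print(list(zip(q, flags)))
```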
Abstract:
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. Firstly, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem of spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required within a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for the establishment of complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets.
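The abstract does not include the classifier's equations, but a complex-valued extreme learning machine can be sketched as follows: a fixed random complex hidden layer followed by a least-squares output layer solved with the pseudo-inverse. This is a generic sketch, not the paper's exact formulation, and the data are random stand-ins for THz amplitude/phase feature vectors.

```python
# Minimal complex-valued extreme learning machine (ELM) sketch.
import numpy as np

rng = np.random.default_rng(2)
n, d, hidden, classes = 200, 64, 100, 2

# Complex feature vectors standing in for amplitude and phase of THz spectra
X = rng.normal(size=(n, d)) + 1j * rng.normal(size=(n, d))
y = rng.integers(0, classes, size=n)
T = np.eye(classes)[y]                   # one-hot targets

W = rng.normal(size=(d, hidden)) + 1j * rng.normal(size=(d, hidden))
b = rng.normal(size=hidden) + 1j * rng.normal(size=hidden)
H = np.tanh(X @ W + b)                   # random complex hidden activations
beta = np.linalg.pinv(H) @ T             # output weights in closed form

pred = np.argmax((H @ beta).real, axis=1)
print("training accuracy:", (pred == y).mean())
```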
Abstract:
Existing urban meteorological networks have an important role to play as test beds for inexpensive and more sustainable measurement techniques that are now becoming possible in our increasingly smart cities. The Birmingham Urban Climate Laboratory (BUCL) is a near-real-time, high-resolution urban meteorological network (UMN) of automatic weather stations and inexpensive, nonstandard air temperature sensors. The network has recently been implemented with an initial focus on monitoring urban heat, infrastructure, and health applications. A number of UMNs exist worldwide; however, BUCL is novel in its density, the low-cost nature of its sensors, and its use of proprietary Wi-Fi networks. This paper provides an overview of the logistical aspects of implementing a UMN test bed at such a density, including selecting appropriate urban sites; testing and calibrating low-cost, nonstandard equipment; implementing strict quality-assurance/quality-control mechanisms (including metadata); and utilizing preexisting Wi-Fi networks to transmit data. Also included are visualizations of data collected by the network, including data from the July 2013 U.K. heatwave, highlighting potential applications. The paper is an open invitation to use the facility as a test bed for evaluating models and/or other nonstandard observation techniques, such as those based on crowdsourcing.
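One QA/QC step of the kind described, sketched with invented numbers: deriving a linear calibration for a low-cost air temperature sensor against a reference instrument.

```python
# Linear calibration of a low-cost temperature sensor (illustrative data).
import numpy as np

t_ref = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])  # reference (deg C)
t_raw = np.array([5.6, 10.4, 15.9, 20.7, 26.1, 30.8])  # low-cost sensor

gain, offset = np.polyfit(t_raw, t_ref, 1)             # least-squares fit
t_cal = gain * t_raw + offset
rmse = np.sqrt(np.mean((t_cal - t_ref) ** 2))
print(f"T_cal = {gain:.3f} * T_raw + {offset:+.3f};  RMSE = {rmse:.2f} deg C")
```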
Abstract:
A method for the determination of volatile organic compounds (VOCs) in recycled polyethylene terephthalate (PET) and high-density polyethylene (HDPE) using headspace sampling by solid-phase microextraction and gas chromatography coupled to mass spectrometric detection is presented. This method was used to evaluate the efficiency of cleaning processes for VOC removal from recycled PET. In addition, the method was also employed to evaluate the level of VOC contamination in multilayer packaging material containing recycled HDPE. The extraction procedure for volatile compounds was optimised, and the best extraction conditions were obtained using a 75 μm carboxen-polydimethylsiloxane (CAR-PDMS) fibre for 20 min at 60 °C. The validation parameters for the established method were linear range, linearity, sensitivity, precision (repeatability), accuracy (recovery), and the detection and quantification limits. The results indicated that the method could easily be used in quality control for the production of recycled PET and HDPE.
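The abstract does not state which convention was used for the detection and quantification limits; the sketch below uses the common ICH-style formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the calibration line and S its slope, applied to invented calibration data.

```python
# LOD/LOQ from a calibration line (illustrative values).
import numpy as np

conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])              # spiked VOC level (ug/kg)
area = np.array([1.1e3, 5.4e3, 1.05e4, 2.1e4, 5.2e4])   # GC-MS peak area

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                            # n-2 dof for a line

print(f"LOD = {3.3 * sigma / slope:.3f} ug/kg, LOQ = {10 * sigma / slope:.3f} ug/kg")
```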
Abstract:
Objective: To investigate whether submicroscopic copy number variants (CNVs) on the X chromosome can be identified in women with primary ovarian insufficiency (POI), defined as spontaneous secondary amenorrhea before 40 years of age accompanied by follicle-stimulating hormone levels above 40 IU/L on at least two occasions. Design: Analysis of intensity data of single nucleotide polymorphism (SNP) probes generated by genomewide Illumina 370k CNV BeadChips, followed by validation of identified loci using a custom-designed ultra-high-density comparative genomic hybridization array containing 48,325 probes evenly distributed over the X chromosome. Setting: Multicenter genetic cohort study in the Netherlands. Patient(s): 108 Dutch Caucasian women with POI (97 of whom passed quality control) with a normal karyogram and no fragile X premutation, and 235 healthy Dutch Caucasian women as controls. Intervention(s): None. Main Outcome Measure(s): Number and locus of X chromosomal microdeletions or duplications. Result(s): Intensity differences between SNP probes identify microdeletions and duplications. The initial analysis identified an overrepresentation of deletions in POI patients. Moreover, CNVs in two genes at the Xq21.3 locus (i.e., PCDH11X and TGIF2LX) were statistically significantly associated with the POI phenotype. The mean size of the identified CNVs was 262 kb. However, in the validation study the samples with putative Xq21.3 deletions did not show intensity deviations across consecutive probes. Conclusion(s): X chromosomal submicroscopic CNVs do not play a major role in Caucasian POI patients. We provide guidelines on how submicroscopic cytogenetic POI research should be conducted.
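A toy sketch of intensity-based deletion calling of the kind described above; real pipelines typically use hidden Markov model callers, and the log R ratio data and thresholds here are simulated and illustrative.

```python
# Sliding-window deletion calling on simulated log R ratio data.
import numpy as np

rng = np.random.default_rng(3)
lrr = rng.normal(0.0, 0.15, 1000)       # log R ratio along the X chromosome
lrr[400:430] -= 0.6                      # simulated heterozygous deletion

window = 20
means = np.convolve(lrr, np.ones(window) / window, mode="valid")
calls = np.flatnonzero(means < -0.3)     # require consecutive-probe evidence
if calls.size:
    print(f"putative deletion near probes {calls[0]}..{calls[-1] + window - 1}")
```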
Abstract:
In eukaryotes, pre-rRNA processing depends on a large number of nonribosomal trans-acting factors that form intriguingly organized complexes. One of the early stages of pre-rRNA processing includes the formation of the two intermediate complexes pre-40S and pre-60S, which then form the mature ribosome subunits. Each of these complexes contains specific pre-rRNAs, ribosomal proteins and processing factors. The yeast nucleolar protein Nop53p has previously been identified in the pre-60S complex and shown to affect pre-rRNA processing by directly binding 5.8S rRNA and by interacting with Nop17p and Nip7p, which are also involved in this process. Here we show that Nop53p binds 5.8S rRNA co-transcriptionally through its N-terminal region, and that this portion of the protein can also partially complement growth of the conditional mutant strain Δnop53/GAL:NOP53. Nop53p interacts with Rrp6p and activates the exosome in vitro. These results indicate that Nop53p may recruit the exosome to the 7S pre-rRNA for processing. Consistent with this observation, and similar to what is observed in exosome mutants, depletion of Nop53p leads to accumulation of polyadenylated pre-rRNAs.
Abstract:
Considering the Wald, score, and likelihood ratio asymptotic test statistics, we analyze a multivariate null-intercept errors-in-variables regression model in which both the explanatory and the response variables are subject to measurement error, and a possible dependency structure between measurements taken on the same individual is incorporated, representing a longitudinal structure. This model was proposed by Aoki et al. (2003b) and analyzed under the Bayesian approach. In this article, taking the classical approach, we analyze the asymptotic test statistics and present a simulation study comparing the behavior of the three test statistics for different sample sizes, parameter values, and nominal levels of the test. Closed-form expressions for the score function and the Fisher information matrix are also presented. We consider two real numerical illustrations: the odontological data set from Hadgu and Koch (1999) and a quality control data set.
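To make the trio of statistics concrete, the sketch below computes the Wald, score, and likelihood ratio statistics in the simplest one-parameter setting, a binomial proportion; the paper itself applies them to a multivariate errors-in-variables model, for which the expressions are considerably more involved.

```python
# Wald, score, and likelihood ratio tests of H0: p = p0 for a binomial
# proportion; all three statistics are asymptotically chi-square with 1 df.
import numpy as np
from scipy import stats

n, x, p0 = 100, 62, 0.5
p_hat = x / n

def loglik(p):
    return x * np.log(p) + (n - x) * np.log(1 - p)

wald = (p_hat - p0) ** 2 / (p_hat * (1 - p_hat) / n)   # curvature at the MLE
score = (p_hat - p0) ** 2 / (p0 * (1 - p0) / n)        # curvature at the null
lrt = 2 * (loglik(p_hat) - loglik(p0))                 # log-likelihood drop

for name, stat in [("Wald", wald), ("score", score), ("LR", lrt)]:
    print(f"{name}: {stat:.3f}, p = {stats.chi2.sf(stat, df=1):.4f}")
```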
Abstract:
Quality control optimization of medical processes that use ionizing radiation in the treatment of diseases such as cancer is a key element of patient safety and treatment success. The major medical application of radiation is radiotherapy, i.e. the delivery of prescribed dose levels to well-defined target tissues of a patient with the purpose of eliminating a disease. The need for accurate tumour-edge definition, so that the healthy surrounding tissue is preserved, demands rigorous radiation treatment planning. Dosimetric methods are used to map the dose distribution in the region of interest to ensure that the prescribed dose and the irradiated region are correct. The Fricke gel (FXG) is the main dosimeter that provides visualization of the three-dimensional (3D) dose distribution. In this work, the dosimetric characteristics of the modified Fricke dosimeter produced at the Radiation Metrology Centre of the Institute of Energetic and Nuclear Research (IPEN), such as the dependence of dose response on gel concentration, the influence of xylenol orange addition, the dose response between 5 and 50 Gy, and signal stability, were evaluated by magnetic resonance imaging (MRI). Using the same gel solution, breast simulators (phantoms) were shaped and absorbed dose distributions were imaged by MRI at the Nuclear Resonance Laboratory of the Physics Institute of the University of São Paulo.
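Fricke gel dosimeters read out by MRI are commonly characterized by an approximately linear relation between the longitudinal relaxation rate R1 = 1/T1 and absorbed dose; the sketch below fits such a response over the 5-50 Gy range using invented values, not IPEN's measurements.

```python
# Linear dose-response fit for a Fricke gel dosimeter (illustrative data).
import numpy as np

dose = np.array([5., 10., 20., 30., 40., 50.])        # absorbed dose (Gy)
r1 = np.array([1.31, 1.52, 1.98, 2.41, 2.87, 3.30])   # relaxation rate (1/s)

sens, r1_0 = np.polyfit(dose, r1, 1)                  # slope = dose sensitivity
print(f"R1 = {r1_0:.2f} + {sens:.4f} * D  (sensitivity in s^-1 Gy^-1)")
```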