855 results for Six sigma (Quality control standard)
Abstract:
Misalignment is a very important index for evaluating the quality of laser tailor-welded blanks, and controlling the misalignment of thin-sheet components is a difficult problem in laser tailored-blank welding. For the first fully automatic laser tailored-blank welding line in China, the generation of misalignment and methods for controlling it were studied in depth. Extensive experiments identified the main factors affecting the magnitude of misalignment: physical deformation of the sheet itself, the magnitude and uniformity of the clamping force, deformation of the clamping beam, flatness error of the supporting base plate, and welding deformation. By analysing the influence of these factors on misalignment and the interrelations among them, a mathematical model for misalignment prediction was established. Experiments verified the correctness of the model, providing effective theoretical guidance for determining the causes of misalignment and improving welding quality.
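A minimal sketch of how such a factor-based misalignment prediction might look in code; the abstract does not give the model's actual form, so the additive structure, factor names and coefficients below are all hypothetical placeholders:

```python
# Illustrative sketch only: the abstract does not state the model's real form.
# A simple additive model where each factor contributes to the predicted
# misalignment; all weights and factor names are hypothetical placeholders.

FACTOR_WEIGHTS = {
    "plate_deformation": 0.8,            # hypothetical dimensionless weight
    "clamping_force_nonuniformity": 0.5,
    "beam_deflection": 0.6,
    "baseplate_flatness_error": 0.9,
    "welding_distortion": 0.4,
}

def predict_misalignment(factors: dict) -> float:
    """Predict edge misalignment (mm) as a weighted sum of factor magnitudes (mm)."""
    return sum(FACTOR_WEIGHTS[name] * value for name, value in factors.items())

print(predict_misalignment({
    "plate_deformation": 0.05,
    "clamping_force_nonuniformity": 0.02,
    "beam_deflection": 0.01,
    "baseplate_flatness_error": 0.03,
    "welding_distortion": 0.02,
}))
```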
Abstract:
Through extensive study and design, a technical plan for establishing the exploration database centre was drawn up, combining imported and self-developed techniques. Through research and repeated experiment, a modern database centre has been set up whose hardware and network offer advanced performance, whose system is well configured, whose data storage and management are complete, and whose data support is fast and direct. Through study of decision theory, methods and models, an exploration decision-support scheme was designed, and one decision plan, the well-location decision support system, was evaluated and put into action.

1. Establishment of the Shengli exploration database centre. Research covered the hardware configuration of the database centre, including its workstations and all connected hardware and systems. The hardware comprises workstations, microcomputer workstations, disk arrays, and the equipment used for seismic processing and interpretation, connected together. Research on data storage and management included analysis of the contents to be managed, data flow, data standards, data quality control, backup and restore policy, and optimization of the database system. Reasonable data-management regulations and workflows were drawn up and a scientific exploration data management system was created. Data loading was carried out to a schedule; in the end more than 200 seismic survey projects, amounting to 25 TB, were loaded.

2. Exploration work support system and its application. The seismic data processing support system offers automatic extraction of seismic attributes, GIS navigation, data ordering, extraction of data cubes of any size, a pseudo huge-capacity disk array, standard output exchange formats, and so on. Prestack data can be accessed directly by the processing system, or transferred to other processing systems through a standard exchange format. Support for seismic interpretation includes automatic scanning and storage of interpretation results and internal data quality control; the interpretation system is connected directly to the database centre for real-time access to seismic, formation and well data. Comprehensive geological study is supported through the intranet, with the ability to query or display data graphically on the navigation system under geological constraints. The production management support system is mainly used to collect, analyse and display production data; its core technologies are controlled data collection and the creation of multiple standard forms.

3. Exploration decision support system design. By classifying the workflow and data flow of all exploration stages, and studying decision theory and methods, the target of each decision step, decision models and requirements, three conceptual models were formed for the Shengli exploration decision support system: the exploration distribution support system, the well-location support system and the production management support system. The well-location decision support system has passed evaluation and been put into action.

4. Technical advances. Hardware and software are matched for high performance. By combining a parallel computer system, database server, huge-capacity automated tape library (ATL), disk array, network and firewall, the first exploration database centre in China was created, with a reasonable configuration, high performance and the ability to manage the complete exploration data set. A technology for managing huge volumes of exploration data has been formed, in which exploration data standards and management regulations guarantee data quality, safety and security. A multifunction query and support system provides comprehensive exploration information support, covering geological study, seismic processing and interpretation, and production management; many new database and computer technologies are used to provide real-time information support for exploration work. Finally, the Shengli exploration decision support system was designed.

5. Application and benefit. Data storage has reached 25 TB, and thousands of users in the Shengli oil field access the data, improving work efficiency several times over. The technology has also been applied by many other units of SINOPEC. Providing data to the project "Exploration Achievements and Evaluation of Favorable Targets in the Hekou Area" shortened the data preparation period from 30 days to 2 days and enriched data abundance by 15 percent, with excellent information support from the database centre. Providing former processing results to the project "Pre-stack Depth Migration in the Guxi Fracture Zone" reduced repeated processing, shortened the work period by one month, improved processing precision and quality, and saved 30 million yuan of data-processing investment. Automatic provision of a project database for "Geological and Seismic Study of the Southern Slope Zone of Dongying Sag" shortened data preparation time, giving researchers more time for research and thus improving interpretation precision and quality.
Abstract:
The extract of Adinandra nitida leaves, known in China as Shiyacha, was studied by high-performance liquid chromatography (HPLC) with ultraviolet detection and electrospray ionisation (ESI) tandem mass spectrometry (MS). Under the optimized conditions, the analysis could be finished in 45 min on a Hypersil C18 column, combined with negative-ion detection using the information-dependent acquisition (IDA) mode of a Q TRAP™ instrument. Six flavonoids were identified: epicatechin, rhoifolin, apigenin, quercitrin, camellianin A, and camellianin B, among which rhoifolin was found in Shiyacha for the first time. The fragmentation pathways of these flavonoids were elucidated. Furthermore, with epicatechin, rhoifolin, and apigenin as markers, a quality control method for Shiyacha and its related products was established for the first time. Calibration linearity was good (R² > 0.9992) over a three to four orders of magnitude concentration range, with a detection limit of 2.5 ng at S/N = 3.
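For illustration, a short Python sketch of the calibration step described above: fitting a linear calibration curve, checking R², and deriving an S/N = 3 detection limit. The calibration amounts, peak areas and noise level are invented for the example:

```python
import numpy as np

# Hypothetical calibration data for one marker flavonoid (e.g. apigenin):
# injected amount (ng) vs. chromatographic peak area (arbitrary units).
amount = np.array([5, 10, 50, 100, 500, 1000, 5000], dtype=float)
area = np.array([1.1e3, 2.0e3, 1.05e4, 2.1e4, 1.0e5, 2.05e5, 1.01e6])

slope, intercept = np.polyfit(amount, area, 1)
pred = slope * amount + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

# S/N = 3 detection limit: the amount whose signal equals 3x the baseline noise,
# assuming the linear response holds down to the detection limit.
baseline_noise = 250.0             # hypothetical noise amplitude (area units)
lod = 3 * baseline_noise / slope   # ng

print(f"R^2 = {r2:.4f}, LOD (S/N=3) = {lod:.2f} ng")
```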
Abstract:
Oxidized carbon nanotubes are tested as a matrix for analysis of small molecules by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS). Compared with nonoxidized carbon nanotubes, oxidized carbon nanotubes facilitate sample preparation because of their higher solubility in water. The matrix layer of oxidized carbon nanotubes is much more homogeneous and compact than that of nonoxidized carbon nanotubes. The efficiency of desorption/ionization for analytes and the reproducibility of peak intensities within and between sample spots are greatly enhanced on the surface of oxidized carbon nanotubes. The advantage of the oxidized carbon nanotubes in comparison with α-cyano-4-hydroxycinnamic acid (CCA) and carbon nanotubes is demonstrated by MALDI-TOF-MS analysis of an amino acid mixture. The matrix is successfully used for analysis of synthetic hydroxypropyl β-cyclodextrin, suggesting a great potential for monitoring reactions and for product quality control. Reliable quantitative analysis of jatrorrhizine and palmatine with a wide linear range (1-100 ng/mL) and good reproducibility of relative peak areas (RSD less than 10%) is achieved using this matrix. Concentrations of jatrorrhizine (8.65 mg/mL) and palmatine (10.4 mg/mL) in an extract of Coptis chinensis Franch are determined simultaneously using the matrix and a standard addition method.
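The standard addition method mentioned above can be sketched as follows; the spiked concentrations and signals are hypothetical stand-ins for the relative peak areas measured by MALDI-TOF-MS:

```python
import numpy as np

# Hypothetical standard-addition data for jatrorrhizine: known amounts of
# standard (ng/mL) spiked into aliquots of the diluted extract, vs. relative
# peak area (arbitrary units, invented values).
added = np.array([0.0, 10.0, 20.0, 40.0])      # spiked concentration, ng/mL
signal = np.array([0.86, 1.87, 2.84, 4.90])    # relative peak area

slope, intercept = np.polyfit(added, signal, 1)

# In standard addition, the unspiked concentration is the magnitude of the
# x-intercept of the regression line: c0 = intercept / slope.
c0 = intercept / slope
print(f"Estimated concentration in the diluted extract: {c0:.2f} ng/mL")
```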
Abstract:
L. Blot, A. Davis, M. Holubinka, R. Marti and R. Zwiggelaar, 'Automated quality assurance applied to mammographic imaging', EURASIP Journal on Applied Signal Processing, 2002(7), 736-745 (2002)
Abstract:
BACKGROUND: Serologic methods have been used widely to test for celiac disease and have gained importance in diagnostic definition and in new epidemiologic findings. However, there is no standardization, and there are no reference protocols and materials. METHODS: The European working group on Serological Screening for Celiac Disease has defined robust noncommercial test protocols for immunoglobulin (Ig)G and IgA gliadin antibodies and for IgA autoantibodies against endomysium and tissue transglutaminase. Standard curves were linear in the decisive range, and intra-assay variation coefficients were less than 5% to 10%. Calibration was performed with a group reference serum. Joint cutoff limits were used. Seven laboratories took part in the final collaborative study on 252 randomized sera classified by histology (103 pediatric and adult patients with active celiac disease, 89 disease control subjects, and 60 blood donors). RESULTS: IgA autoantibodies against endomysium and tissue transglutaminase rendered superior sensitivity (90% and 93%, respectively) and specificity (99% and 95%, respectively) over IgA and IgG gliadin antibodies. Tissue transglutaminase antibody testing showed superior receiver operating characteristic performance compared with gliadin antibodies. The kappa values for interlaboratory reproducibility showed superiority for IgA endomysium antibodies (0.93) in comparison with tissue transglutaminase antibodies (0.83) and gliadin antibodies (0.82 for IgG, 0.62 for IgA). CONCLUSIONS: Basic criteria of standardization and quality assessment must be fulfilled by any given test protocol proposed for serologic investigation of celiac disease. The working group has produced robust test protocols and reference materials available for standardization to further improve the reliability of serologic testing for celiac disease.
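A minimal sketch of the performance measures used above (sensitivity, specificity, and interlaboratory kappa); all counts and laboratory calls below are invented for illustration:

```python
import numpy as np

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 table of test result vs. histology."""
    return tp / (tp + fn), tn / (tn + fp)

def cohen_kappa(a, b):
    """Interlaboratory agreement (Cohen's kappa) for two labs' binary calls."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                         # observed agreement
    p_pos = np.mean(a) * np.mean(b)              # chance agreement on positives
    p_neg = (1 - np.mean(a)) * (1 - np.mean(b))  # chance agreement on negatives
    pe = p_pos + p_neg
    return (po - pe) / (1 - pe)

# Hypothetical counts against histology: 103 celiac cases, 149 non-celiac sera.
print(sens_spec(tp=96, fn=7, tn=141, fp=8))   # ~0.93 sensitivity, ~0.95 specificity

lab1 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]   # invented positive/negative calls
lab2 = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
print(f"kappa = {cohen_kappa(lab1, lab2):.2f}")
```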
Abstract:
The increasing complexity of new manufacturing processes and the continuously growing range of fabrication options mean that critical decisions about the insertion of new technologies must be made as early as possible in the design process. Mitigating technology risks under limited knowledge is a key factor and a major requirement for securing the successful development of new technologies. To address this challenge, a risk mitigation methodology that incorporates both qualitative and quantitative analysis is required. This paper outlines the methodology being developed under a major UK grand challenge project, 3D-Mintegration. The main focus is on identifying risks by identifying the product key characteristics using a product breakdown approach. The assessment of the identified risks uses quantification and prioritisation techniques to evaluate and rank them. Traditional statistical process control based on process capability and six sigma concepts is applied to measure the process capability given the risks that have been identified. This paper also details a numerical approach that can be used to undertake risk analysis, based on a computational framework in which modelling and statistical techniques are integrated. An example of the modelling and simulation technique is also given for focused ion beam machining, one of the manufacturing processes investigated in the project.
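As a sketch of the process capability calculation underlying the six sigma assessment described above (Cp and Cpk against specification limits); the measurements and specification limits are hypothetical:

```python
import numpy as np

def process_capability(samples, lsl, usl):
    """Cp and Cpk for a process measured against lower/upper spec limits.

    Cp  = (USL - LSL) / (6 * sigma)              -- potential capability
    Cpk = min(USL - mu, mu - LSL) / (3 * sigma)  -- capability allowing for centring
    A 'six sigma' process corresponds to Cp = 2.0.
    """
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical measurements of a product key characteristic (e.g. a feature
# dimension in micrometres) with specification limits 9.0-11.0 um.
rng = np.random.default_rng(0)
data = rng.normal(loc=10.1, scale=0.15, size=50)
cp, cpk = process_capability(data, lsl=9.0, usl=11.0)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```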
Abstract:
A well-documented, publicly available, global data set of surface ocean carbon dioxide (CO2) parameters has been called for by international groups for nearly two decades. The Surface Ocean CO2 Atlas (SOCAT) project was initiated by the international marine carbon science community in 2007 with the aim of providing a comprehensive, publicly available, regularly updated, global data set of marine surface CO2, which had been subject to quality control (QC). Many additional CO2 data, not yet made public via the Carbon Dioxide Information Analysis Center (CDIAC), were retrieved from data originators, public websites and other data centres. All data were put in a uniform format following a strict protocol. Quality control was carried out according to clearly defined criteria. Regional specialists performed the quality control, using state-of-the-art web-based tools, specially developed for accomplishing this global team effort. SOCAT version 1.5 was made public in September 2011 and holds 6.3 million quality controlled surface CO2 data points from the global oceans and coastal seas, spanning four decades (1968–2007). Three types of data products are available: individual cruise files, a merged complete data set and gridded products. With the rapid expansion of marine CO2 data collection and the importance of quantifying net global oceanic CO2 uptake and its changes, sustained data synthesis and data access are priorities.
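A minimal sketch of the kind of gridding used to build such gridded products, averaging quality-controlled observations into 1° cells; this is illustrative only and not the SOCAT processing code, and the observations are invented:

```python
import numpy as np
from collections import defaultdict

# Illustrative only: average surface-CO2 observations (lat, lon, fCO2 in
# microatmospheres; invented values) into 1-degree grid cells.
obs = [
    (56.2, -12.7, 361.4),
    (56.8, -12.1, 358.9),
    (-3.4, 145.6, 402.2),
]

cells = defaultdict(list)
for lat, lon, fco2 in obs:
    # Key each observation by the 1-degree cell containing it.
    cells[(int(np.floor(lat)), int(np.floor(lon)))].append(fco2)

gridded = {cell: float(np.mean(values)) for cell, values in cells.items()}
print(gridded)
```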
Abstract:
Raman spectroscopy has been used to predict the abundance of the fatty acids (FA) in clarified butterfat obtained from dairy cows fed a range of levels of rapeseed oil in their diet. Partial least squares regression of the Raman spectra against FA compositions obtained by GC showed good prediction for the five major (abundance >5%) FA, with R² = 0.74-0.92 and a root mean square error of prediction (RMSEP) that was 5-7% of the mean. In general, the prediction accuracy fell with decreasing abundance in the sample, but the RMSEP was 1.25%. The Raman method has the best prediction ability for unsaturated FA (R² = 0.85-0.92), and in particular trans unsaturated FA (the best-predicted FA was 18:1 trans-Δ9). This enhancement was attributed to the isolation of the unsaturated modes from the saturated modes and the significantly higher spectral response of unsaturated bonds compared with saturated bonds. Raman spectra of the melted butter samples could also be used to predict bulk parameters calculated from standard analyses, such as iodine value (R² = 0.80) and solid fat content at low temperature (R² = 0.87). For solid fat contents determined at higher temperatures, the prediction ability was significantly reduced (R² = 0.42); this decrease in performance was attributed to the smaller range of solid fat content values at the higher temperatures. Finally, although the prediction errors for the abundances of the individual FA in a given sample are much larger with Raman than with full GC analysis, the accuracy is acceptably high for quality control applications. This, combined with the fact that Raman spectra can be obtained with no sample preparation and 60-s data collection times, means that high-throughput, on-line Raman analysis of butter samples should be possible.
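A sketch of the chemometric step described above: partial least squares regression with R² and RMSEP evaluated on held-out samples, using scikit-learn's PLSRegression. The spectra and abundances here are synthetic stand-ins, not the study's data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Synthetic stand-ins for Raman spectra and a GC-determined FA abundance (%).
rng = np.random.default_rng(1)
n_samples, n_wavenumbers = 60, 400
spectra = rng.normal(size=(n_samples, n_wavenumbers))
loading = rng.normal(size=n_wavenumbers)
abundance = spectra @ loading * 0.01 + 25 + rng.normal(scale=0.5, size=n_samples)

# Train on 45 samples, evaluate prediction on the remaining 15.
train, test = slice(0, 45), slice(45, 60)
pls = PLSRegression(n_components=5).fit(spectra[train], abundance[train])
pred = pls.predict(spectra[test]).ravel()

rmsep = np.sqrt(np.mean((pred - abundance[test]) ** 2))
ss_res = np.sum((abundance[test] - pred) ** 2)
ss_tot = np.sum((abundance[test] - abundance[test].mean()) ** 2)
print(f"R^2 = {1 - ss_res / ss_tot:.2f}, RMSEP = {rmsep:.2f} (abundance %)")
```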
Abstract:
A recent genome-wide association study reported association between schizophrenia and the ZNF804A gene on chromosome 2q32.1. We attempted to replicate these findings in our Irish Case-Control Study of Schizophrenia (ICCSS) sample (N=1021 cases, 626 controls). Following consultation with the original investigators, we genotyped three of the most promising single-nucleotide polymorphisms (SNPs) from the Cardiff study. We replicate association with rs1344706 (trend test one-tailed P=0.0113 with the previously associated A allele) in ZNF804A. We detect no evidence of association with rs6490121 in NOS1 (one-tailed P=0.21), and only a trend with rs9922369 in RPGRIP1L (one-tailed P=0.0515). On the basis of these results, we completed genotyping of 11 additional linkage disequilibrium-tagging SNPs in ZNF804A. Of the 12 SNPs genotyped, 11 pass quality control criteria and 4 are nominally associated, with our most significant evidence of association at rs7597593 (P=0.0013), followed by rs1344706. We observe no evidence of differential association in ZNF804A on the basis of family history or sex of case. The associated SNP rs1344706 lies in approximately 30 bp of conserved mammalian sequence, and the associated A allele is predicted to maintain binding sites for the brain-expressed transcription factors MYT1L and POU3F1/OCT-6. In controls, expression is significantly increased from the A allele of rs1344706 compared with the C allele. Expression is increased in schizophrenic cases compared with controls, but this difference does not achieve statistical significance. This study replicates the originally reported association of ZNF804A with schizophrenia and suggests a consistent link between the A allele of rs1344706, increased expression of ZNF804A and risk for schizophrenia.
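The allelic trend test reported above is typically a Cochran-Armitage test; a sketch follows, with hypothetical genotype counts chosen only to match the stated sample sizes (1021 cases, 626 controls):

```python
import numpy as np
from scipy.stats import norm

def trend_test_one_tailed(cases, controls, scores=(0, 1, 2)):
    """One-tailed Cochran-Armitage trend test on a 2x3 genotype table.

    cases / controls: counts of individuals carrying 0, 1, 2 copies of the
    tested allele. Returns the one-tailed p-value for a positive trend.
    """
    cases = np.asarray(cases, float)
    controls = np.asarray(controls, float)
    t = np.asarray(scores, float)
    r1, r2 = cases.sum(), controls.sum()
    n = r1 + r2
    c = cases + controls                 # per-genotype column totals
    stat = np.sum(t * (cases * r2 - controls * r1))
    var = (r1 * r2 / n) * (
        np.sum(t ** 2 * c * (n - c))
        - 2 * sum(t[i] * t[j] * c[i] * c[j]
                  for i in range(len(t)) for j in range(i + 1, len(t)))
    )
    return norm.sf(stat / np.sqrt(var))

# Hypothetical counts of 0, 1, 2 copies of the A allele of rs1344706.
print(trend_test_one_tailed(cases=[211, 500, 310], controls=[146, 320, 160]))
```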
Abstract:
BACKGROUND: The genetic basis for developing asthma has been extensively studied. However, association studies to date have mostly focused on mild to moderate disease, and genetic risk factors for severe asthma remain unclear. OBJECTIVE: To identify common genetic variants affecting susceptibility to severe asthma. METHODS: A genome-wide association study was undertaken in 933 European-ancestry individuals with severe asthma, based on Global Initiative for Asthma (GINA) criteria 3 or above, and 3346 clean controls. After standard quality control measures, the association of 480,889 genotyped single nucleotide polymorphisms (SNPs) was tested. To improve the resolution of the association signals identified, non-genotyped SNPs were imputed in these regions using a dense reference panel of SNP genotypes from the 1000 Genomes Project. Replication of SNPs of interest was then undertaken in a further 231 cases and 1345 controls, and a meta-analysis was performed to combine the results across studies. RESULTS: An association was confirmed in subjects with severe asthma at loci previously identified for association with mild to moderate asthma. The strongest evidence was seen for the ORMDL3/GSDMB locus on chromosome 17q12-21 (rs4794820, p=1.03×10⁻⁸ following meta-analysis), meeting genome-wide significance. Strong evidence was also found for the IL1RL1/IL18R1 locus on 2q12 (rs9807989, p=5.59×10⁻⁸ following meta-analysis), just below this threshold. No novel loci for susceptibility to severe asthma met strict criteria for genome-wide significance. CONCLUSIONS: The largest genome-wide association study of severe asthma to date was carried out, and strong evidence was found for the association of two previously identified asthma susceptibility loci in patients with severe disease. A number of novel regions with suggestive evidence were also identified and warrant further study.
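A sketch of the meta-analysis step, combining discovery and replication results by inverse-variance weighting (a standard fixed-effect approach; the abstract does not specify the exact method, and the effect sizes and standard errors below are invented):

```python
import numpy as np
from scipy.stats import norm

def fixed_effect_meta(betas, ses):
    """Inverse-variance-weighted fixed-effect meta-analysis of per-study
    log-odds-ratio estimates (betas) and their standard errors (ses)."""
    betas, ses = np.asarray(betas, float), np.asarray(ses, float)
    w = 1.0 / ses ** 2                    # inverse-variance weights
    beta = np.sum(w * betas) / np.sum(w)  # pooled effect
    se = np.sqrt(1.0 / np.sum(w))         # pooled standard error
    p = 2 * norm.sf(abs(beta / se))       # two-tailed p-value
    return beta, se, p

# Hypothetical discovery + replication estimates for one SNP.
beta, se, p = fixed_effect_meta(betas=[0.32, 0.25], ses=[0.055, 0.11])
print(f"pooled beta = {beta:.3f}, SE = {se:.3f}, p = {p:.2e}")
```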
Abstract:
NuGO, the European Nutrigenomics Organization, utilizes 31 powerful computers for, e.g., data storage and analysis. These so-called black boxes (NBXses) are located at the sites of different partners. NuGO decided to use GenePattern as the preferred genomic analysis tool on each NBX. To handle the custom-made Affymetrix NuGO arrays, new NuGO modules were added to GenePattern. These NuGO modules execute the latest Bioconductor version, ensuring up-to-date annotations and access to the latest scientific developments. The following GenePattern modules are provided by NuGO: NuGOArrayQualityAnalysis for comprehensive quality control, NuGOExpressionFileCreator for import and normalization of data, LimmaAnalysis for identification of differentially expressed genes, TopGoAnalysis for calculation of GO enrichment, and GetResultForGo for retrieval of information on genes associated with specific GO terms. Altogether, these NuGO modules allow comprehensive, up-to-date, and user-friendly analysis of Affymetrix data. A special feature of the NuGO modules is that they allow analysis with either the standard Affymetrix or the MBNI custom CDF files, which remap probes based on current knowledge. In both cases a .chip file is created to enable GSEA analysis. The NuGO GenePattern installations are distributed as binary Ubuntu (.deb) packages via the NuGO repository.
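As an illustration of the statistic behind GO-enrichment modules such as TopGoAnalysis (which, per the abstract, runs on Bioconductor in R), a one-sided hypergeometric over-representation test; the counts below are invented and this Python sketch only shows the underlying calculation:

```python
from scipy.stats import hypergeom

# Hypothetical counts for one GO term:
M = 20000   # genes on the array
n = 150     # genes annotated to the GO term
N = 400     # differentially expressed genes
k = 12      # differentially expressed genes annotated to the term

# One-sided over-representation test: P(X >= k) under the hypergeometric null.
p = hypergeom.sf(k - 1, M, n, N)
print(f"enrichment p-value = {p:.3g}")
```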
Abstract:
The commonly used British Standard constant head triaxial permeability (BS) test for permeability testing of fine-grained soils is known to have a relatively long test duration. Consequently, a reduction in the time required for a permeability test offers potential cost savings to the construction industry, specifically during Construction Quality Assurance (CQA) of landfill mineral liners. The purpose of this article is to investigate and evaluate alternative short-duration testing methods for measuring the permeability of fine-grained soils.
As part of the investigation, the feasibility of an existing short-duration permeability test, known as the Accelerated Permeability (AP) test, was assessed and compared with permeability measured using the British Standard (BS) method and the Ramp Accelerated Permeability (RAP) test. Four different fine-grained materials with a variety of physical properties were compacted at various moisture contents to produce analogous samples for testing with the three methodologies. Fabric analysis was carried out on specimens derived from post-test samples using Mercury Intrusion Porosimetry (MIP) and scanning electron microscopy (SEM) to assess the effects of testing methodology on soil structure. Results showed that AP testing in general under-predicts permeability values derived from the BS test, owing to large changes in soil structure caused by the AP test methodology, as confirmed by the MIP and SEM observations. RAP testing in general improves on the AP test but still under-predicts permeability values. The potential savings in test duration are shown to be relatively minimal for both the AP and RAP tests.
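For context, the quantity all three tests estimate is the coefficient of permeability from Darcy's law, k = QL / (Ah); a minimal constant-head calculation is sketched below with hypothetical specimen values:

```python
import math

# Hypothetical constant-head test readings for a fine-grained specimen.
Q = 2.4e-10   # steady flow rate, m^3/s
L = 0.076     # specimen length, m
d = 0.038     # specimen diameter, m
h = 7.0       # constant head difference, m of water

A = math.pi * (d / 2) ** 2   # cross-sectional area, m^2
k = Q * L / (A * h)          # Darcy's law: k = Q * L / (A * h)
print(f"coefficient of permeability k = {k:.2e} m/s")
```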