963 results for Six sigma (Quality control standard)


Relevance:

100.00%

Publisher:

Abstract:

Medical microbiology and virology laboratories use nucleic acid tests (NAT) to detect genomic material of infectious organisms in clinical samples. Laboratories choose to perform assembled (or in-house) NAT when commercial assays are not available or when assembled NAT are more economical or accurate. One reason commercial assays are more expensive is that extensive validation is necessary before a kit is marketed, as manufacturers must accept liability for the performance of their assays provided their instructions are followed. It is, by contrast, the individual laboratory's responsibility to validate an assembled NAT before using it to test and report results on human samples. Few guidelines for the validation of assembled NAT have been published. This document details one procedure that laboratories can use to establish a validation process for an assay. Before validating a method, laboratories must optimise it and then document the protocol. All instruments must be calibrated and maintained throughout the testing process. The validation process involves a series of steps: (i) testing dilution series of positive samples to determine the assay's limits of detection and, for quantitative NAT, its linearity over the concentrations to be measured; (ii) establishing the day-to-day variation in the assay's performance; (iii) evaluating the sensitivity and specificity of the assay as far as practicable, along with the extent of cross-reactivity with other genomic material; and (iv) assuring the quality of assembled assays with quality control procedures that monitor the performance of reagent batches before new lots are introduced for testing.
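A minimal sketch of steps (i) and (ii) for a quantitative NAT, assuming replicate results from a serial dilution panel and a daily QC sample; all values below are illustrative, not drawn from the document:

    import numpy as np

    # (i) Linearity: regress measured vs expected log10 copies/mL over the dilution series
    expected = np.array([6.0, 5.0, 4.0, 3.0, 2.0])   # nominal dilution panel
    measured = np.array([5.9, 5.1, 4.0, 2.9, 2.1])   # assay results
    slope, intercept = np.polyfit(expected, measured, 1)
    r2 = np.corrcoef(expected, measured)[0, 1] ** 2
    print(f"slope={slope:.2f}, intercept={intercept:.2f}, r2={r2:.4f}")

    # (ii) Day-to-day variation: inter-day coefficient of variation of a QC sample
    daily_qc = np.array([4.02, 3.95, 4.10, 3.98, 4.05])  # log10 copies/mL, one run per day
    cv_pct = 100 * daily_qc.std(ddof=1) / daily_qc.mean()
    print(f"inter-day CV = {cv_pct:.1f}%")

The lowest dilution detected in all replicates approximates the limit of detection; a probit fit across replicate detection rates gives a more formal 95% LoD.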

Relevance:

100.00%

Publisher:

Abstract:

We examined the feasibility of a low-cost, store-and-forward teledermatology service for general practitioners (GPs) in regional Queensland. Digital pictures and a brief case history were transmitted by email. A service coordinator carried out quality control checks and then forwarded the email messages to a consultant dermatologist. On receiving a clinical response from the dermatologist, the service coordinator returned the message to the referring GP. The aim was to provide advice to rural GPs within one working day. Over six months, 63 referrals were processed by the teledermatology service, covering a wide range of dermatological conditions. In the majority of cases the referring doctors were able to treat the condition after receiving email advice from the dermatologist; however, in 10 cases (16%) additional images or biopsy results were requested because the image quality was inadequate. The average time between receipt of a referral and provision of clinical advice to the referring GP was 46 hours. The referral rate in the present study, 1.05 per month per site, was similar to that reported in other primary care studies. While the use of low-cost digital cameras and public email is feasible, other issues, such as remuneration, may militate against the widespread introduction of primary care teledermatology in Australia.

Relevance:

100.00%

Publisher:

Abstract:

Objectives: Cyclosporin is an immunosuppressant drug with a narrow therapeutic window. Trough and 2-h post-dose blood samples are currently used for therapeutic drug monitoring in solid organ transplant recipients. The aim of the current study was to develop a rapid HPLC-tandem mass spectrometry (HPLC-MS) method for the measurement of cyclosporin in whole blood that was not only suitable for the clinical setting but could also be considered a reference method. Methods: Blood samples (50 µL) were prepared by protein precipitation followed by C18 solid-phase extraction, using d12-cyclosporin as the internal standard. Mass spectrometric detection was by selected reaction monitoring with an electrospray interface in positive ionization mode. Results: The assay was linear from 10 to 2000 µg/L (r² > 0.996, n = 9). Inter-day analytical recovery and imprecision using whole blood quality control samples at 10, 30, 400, 1500, and 2000 µg/L were 94.9-103.5% and
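A minimal sketch of how analytical recovery and imprecision (CV) are conventionally computed from replicate QC measurements; the concentrations and results below are illustrative placeholders, not the study's data:

    import numpy as np

    nominal = 400.0                                             # µg/L, nominal QC level
    replicates = np.array([392.0, 405.0, 398.0, 410.0, 401.0])  # measured values, µg/L

    recovery_pct = 100 * replicates.mean() / nominal            # analytical recovery
    cv_pct = 100 * replicates.std(ddof=1) / replicates.mean()   # imprecision (CV)
    print(f"recovery = {recovery_pct:.1f}%, CV = {cv_pct:.1f}%")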

Relevance:

100.00%

Publisher:

Abstract:

We report here a validated method for the quantification of a new immunosuppressant drug, FTY720, using HPLC-tandem mass spectrometry. Whole blood samples (500 µL) were subjected to liquid-liquid extraction in the presence of an internal standard (Y-32919). Mass spectrometric detection was by selected reaction monitoring with an atmospheric pressure chemical ionization source in positive ionization mode (FTY720: m/z 308.3 → 255.3). The assay was linear from 0.2 to 25 µg/L (r² > 0.997, n = 5). The inter- and intra-day analytical recovery and imprecision for quality control samples (0.5, 7 and 15 µg/L) were 95.8-103.2% and < 5.5%, respectively. At the lower limit of quantification (0.2 µg/L) the inter- and intra-day analytical recovery was 99.0-102.8% with imprecision of < 7.6% (n = 5). The assay had a mean relative recovery of 100.5 ± 5.8% (n = 15). Extracted samples were stable for 16 h. FTY720 quality control samples were stable at room temperature for 16 h, at 4 °C for at least 8 days, and when taken through at least three freeze-thaw cycles. In conclusion, the method described displays analytical performance characteristics that are suitable for pharmacokinetic studies in humans.

Relevance:

100.00%

Publisher:

Abstract:

Therapeutic monitoring with dosage individualization of sirolimus drug therapy is standard clinical practice for organ transplant recipients. For several years sirolimus monitoring was restricted by the lack of an immunoassay. The recent reintroduction of the microparticle enzyme immunoassay (MEIA®) for sirolimus on the IMx® analyser has the potential to address this situation. This study, using patient samples, compared the MEIA® sirolimus method with an established HPLC-tandem mass spectrometry (HPLC-MS/MS) method. An established HPLC-UV assay was used for independent cross-validation. For quality control materials (5, 11, 22 µg/L), the MEIA® showed acceptable validation criteria based on intra- and inter-run precision (CV) and accuracy (bias) of < 8% and < 13%, respectively. The lower limit of quantitation was approximately 3 µg/L. The performance of the immunoassay was compared with HPLC-MS/MS using EDTA whole-blood samples obtained from various types of organ transplant recipients (n = 116). The resultant Deming regression line was: MEIA = 1.3 × HPLC-MS/MS + 1.3 (r = 0.967, s(y/x) = 1), with a mean bias of 49.2% ± 23.1% (range, -2.4% to 128%; P < 0.001). The reason for the large and variable bias was not explored in this study, but sirolimus-metabolite cross-reactivity with the MEIA® antibody could be a substantive contributing factor. While the MEIA® sirolimus method may be an adjunct to sirolimus dosage individualization in transplant recipients, users must consider the implications of the substantial and variable bias when interpreting results. In selected patients where difficult clinical issues arise, reference to a specific chromatographic method may be required.
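A minimal sketch of a Deming fit of the kind used for this method comparison, assuming equal analytical error variances for the two methods (lam = 1); the paired patient results below are placeholders:

    import numpy as np

    def deming(x, y, lam=1.0):
        # Deming regression: allows for measurement error in both methods;
        # lam is the ratio of y-error variance to x-error variance.
        mx, my = x.mean(), y.mean()
        sxx = ((x - mx) ** 2).sum()
        syy = ((y - my) ** 2).sum()
        sxy = ((x - mx) * (y - my)).sum()
        slope = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2
                 + 4 * lam * sxy ** 2)) / (2 * sxy)
        return slope, my - slope * mx

    hplc = np.array([4.2, 6.8, 9.5, 12.1, 15.4])    # HPLC-MS/MS, µg/L (illustrative)
    meia = np.array([6.9, 10.2, 13.6, 17.3, 21.2])  # MEIA, µg/L (illustrative)
    slope, intercept = deming(hplc, meia)
    bias_pct = 100 * (meia - hplc) / hplc           # per-sample relative bias
    print(slope, intercept, bias_pct.mean())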

Relevance:

100.00%

Publisher:

Abstract:

Background and purpose: Survey data quality is a combination of the representativeness of the sample, the accuracy and precision of measurements, and data processing and management, with several subcomponents in each. The purpose of this paper is to show how, in the final risk factor surveys of the WHO MONICA Project, information on data quality was obtained, quantified, and used in the analysis. Methods and results: In the WHO MONICA (Multinational MONItoring of trends and determinants in CArdiovascular disease) Project, information about the data quality components was documented in retrospective quality assessment reports. On the basis of the documented information and the survey data, the quality of each data component was assessed and summarized using quality scores. The quality scores were used in sensitivity testing of the results, both by excluding populations with low quality scores and by weighting the data by their quality scores. Conclusions: Detailed documentation of all survey procedures with standardized protocols, training, and quality control are steps towards optimizing data quality. Quantifying data quality is a further step. The methods used in the WHO MONICA Project could be adopted to improve quality in other health surveys.
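A minimal sketch of the two sensitivity tests described, assuming each population contributes one estimate and one quality score in [0, 1]; all numbers are illustrative:

    import numpy as np

    rates = np.array([4.1, 3.6, 5.0, 4.4])     # per-population estimates
    quality = np.array([0.9, 0.5, 0.8, 0.3])   # quality scores

    pooled = rates.mean()                           # unweighted pooled estimate
    weighted = np.average(rates, weights=quality)   # weighted by quality score
    trimmed = rates[quality >= 0.5].mean()          # low-quality populations excluded
    print(pooled, weighted, trimmed)

If the three estimates agree closely, the conclusions are robust to data-quality differences between populations.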

Relevance:

100.00%

Publisher:

Abstract:

With the advent of globalisation, companies all around the world must improve their performance in order to survive. Threats come from everywhere and in different forms, such as low-cost products, high-quality products, new technologies, and new products. Companies in different countries use various techniques and quality criteria items to strive for excellence, and continuous improvement techniques enable them to improve their operations. Companies therefore use techniques such as TQM, Kaizen, Six-Sigma, and Lean Manufacturing, and quality award criteria items such as Customer Focus, Human Resources, Information & Analysis, and Process Management. The purpose of this paper is to compare the use of these techniques and criteria items in two countries, Mexico and the United Kingdom, which differ in culture and industrial structure. In terms of continuous improvement tools and techniques, Mexico formally began by creating its National Quality Award soon after the Americans began the Malcolm Baldrige National Quality Award. The United Kingdom formally started by using the European Quality Award (EQA), later modified and renamed as the EFQM Excellence Model. The methodology used in this study was to undertake a literature review of the subject matter and to study some general applications around the world. A questionnaire survey was then designed and undertaken based on the same scale, about the same sample size, and about the same industrial sector in the two countries. The survey presents a brief definition of each of the constructs to facilitate understanding of the questions. The analysis of the data was then conducted with the assistance of a statistical software package. The survey results indicate both similarities and differences in the strengths and weaknesses of the companies in the two countries. One outcome of the analysis is that it enables the companies to use the results to benchmark themselves and thus act to reinforce their strengths and to reduce their weaknesses.

Relevance:

100.00%

Publisher:

Abstract:

This thesis reports on the principles and usefulness of Performance Rating as developed by the writer over a number of years. In Part One, a brief analysis is made of the Quality scene and its development up to the present, exposing the need for Performance Rating as a tool for all areas of management.* At the same time, a system of Quality control is described which the writer has further developed under the title of 'Operator Control'. This system is based on the integration of all Quality control functions with the creative functions required for Quality achievement. The discussion focuses mainly on the general philosophy of Quality, its creation and control, and that part of Operator Control which affects Performance Rating. While it is shown that the combination of Operator Control and Performance Rating is both economically and technically advantageous, Performance Rating can also usefully be applied under inspection control conditions. Part Two describes the principles of Area Performance Rating. From these a summation expression is derived which gives the key for grouping areas with similar Performance Rating (P). A model is devised on which the theory is demonstrated. Relevant case studies, carried out in factories, are quoted in Part Two, Chapter 4, one written by the Quality manager of the factory concerned. Particular stress is laid in the final conclusions on management's function in the Quality field and on how greatly this function is eased and improved through the introduction of Area Performance Rating. (*The need for, and the advantages of, Performance Rating are particularly demonstrated in Case Study No. 1.)

Relevance:

100.00%

Publisher:

Abstract:

Previously, specifications for the mechanical properties of casting alloys were based on separately cast test bars. This practice provided consistently reproducible results; thus, any change in conditions was reflected in changes in the mechanical properties of the test coupons. These test specimens, however, did not necessarily reflect the actual mechanical properties of the castings they were supposed to represent. Factors such as section thickness and casting configuration affect the solidification rate and soundness of the casting, thereby raising or lowering its mechanical properties in comparison with separately cast test specimens. In the work now reported, casting shapes were developed to investigate the effects of section thickness, chemical analysis and heat treatment on the mechanical properties of a high-strength aluminium alloy under varying chilling conditions. In addition, an insight was sought into the behaviour of chills under more practical conditions. Finally, it was demonstrated that additional information could be derived from the radiographs which form an essential part of the quality control of premium quality castings. As a result of the work, it is now possible to select the analysis and chilling conditions that optimize the as-cast and heat-treated mechanical properties of aluminium-7% silicon-0.3% magnesium alloy.

Relevance:

100.00%

Publisher:

Abstract:

Background - Delivery of high-quality, evidence-based health care to deprived sectors of the community is a major goal for society. We investigated the effectiveness of a culturally sensitive, enhanced care package in UK general practices for improvement of cardiovascular risk factors in patients of south Asian origin with type 2 diabetes. Methods - In this cluster randomised controlled trial, 21 inner-city practices in the UK were assigned by simple randomisation to intervention (enhanced care including additional time with practice nurse and support from a link worker and diabetes-specialist nurse [nine practices; n=868]) or control (standard care [12 practices; n=618]) groups. All adult patients of south Asian origin with type 2 diabetes were eligible. Prescribing algorithms with clearly defined targets were provided for all practices. Primary outcomes were changes in blood pressure, total cholesterol, and glycaemic control (haemoglobin A1c) after 2 years. Analysis was by intention to treat. This trial is registered, number ISRCTN38297969. Findings - We recorded significant differences between treatment groups in diastolic blood pressure (-1·91 [95% CI -2·88 to -0·94] mm Hg, p=0·0001) and mean arterial pressure (-1·36 [-2·49 to -0·23] mm Hg, p=0·0180), after adjustment for confounders and clustering. We noted no significant differences between groups for total cholesterol (0·03 [-0·04 to 0·11] mmol/L), systolic blood pressure (-0·33 [-2·41 to 1·75] mm Hg), or HbA1c (-0·15% [-0·33 to 0·03]). Economic analysis suggests that the nurse-led intervention was not cost effective (incremental cost-effectiveness ratio £28 933 per QALY gained). Across the whole study population over the 2 years of the trial, systolic blood pressure, diastolic blood pressure, and cholesterol decreased significantly by 4·9 (95% CI 4·0-5·9) mm Hg, 3·8 (3·2-4·4) mm Hg, and 0·45 (0·40-0·51) mmol/L, respectively, and we recorded a small and non-significant increase in haemoglobin A1c (0·04% [-0·04 to 0·13], p=0·290). Interpretation - We recorded additional, although small, benefits from our culturally tailored care package that were greater than the secular changes achieved in the UK in recent years. Stricter targets in general practice and further measures to motivate patients are needed to achieve the best possible health-care outcomes in south Asian patients with diabetes. Funding - Pfizer, Sanofi-Aventis, Servier Laboratories UK, Merck Sharp & Dohme/Schering-Plough, Takeda UK, Roche, Merck Pharma, Daiichi-Sankyo UK, Boehringer Ingelheim, Eli Lilly, Novo Nordisk, Bristol-Myers Squibb, Solvay Health Care, and Assurance Medical Society UK.
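The cost-effectiveness figure is an incremental cost-effectiveness ratio: the extra cost of the intervention divided by the extra QALYs it produces. A minimal sketch with illustrative numbers only; the trial's underlying costs and QALYs are not given in the abstract:

    # ICER = (cost_intervention - cost_control) / (QALY_intervention - QALY_control)
    def icer(cost_int, cost_ctrl, qaly_int, qaly_ctrl):
        return (cost_int - cost_ctrl) / (qaly_int - qaly_ctrl)

    # e.g. an extra £750 per patient for an extra 0.026 QALYs ≈ £28 846 per QALY
    print(icer(1950.0, 1200.0, 10.263, 10.237))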

Relevance:

100.00%

Publisher:

Abstract:

Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1 and 2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli — they produced recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days, and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane, and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field.
The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth to facilitate crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations of the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction of the complexity of NMR spectra and allows dynamic processes to be investigated even in very large proteins and even ribosomes. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins.
Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snapshots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological functions are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, to coax challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.

Relevance:

100.00%

Publisher:

Abstract:

The service-producing industries experienced quality problems in the 1980s because of intense competition. The author discusses how these problems have been compounded in the fast-food industry and how quality control can lead to success.