Abstract:
OBJECTIVE: In order to improve the quality of our Emergency Medical Services (EMS), to raise bystander cardiopulmonary resuscitation rates and thereby meet what is becoming a universal standard in terms of quality of emergency services, we decided to implement systematic dispatcher-assisted or telephone-CPR (T-CPR) in our medical dispatch center, a non-Advanced Medical Priority Dispatch System. The aim of this article is to describe the implementation process, costs and results following the introduction of this new "quality" procedure. METHODS: This was a prospective study. Over an 8-week period, our EMS dispatchers were given new procedures to provide T-CPR. We then collected data on all non-traumatic cardiac arrests within our state (Vaud, Switzerland) for the following 12 months. For each event, the dispatchers had to record in writing the reason they either ruled out cardiac arrest (CA) or did not propose T-CPR in the event they did suspect CA. All emergency call recordings were reviewed by the medical director of the EMS. The analysis of the recordings and the dispatchers' written explanations were then compared. RESULTS: During the 12-month study period, a total of 497 patients (both adults and children) were identified as having a non-traumatic cardiac arrest. Of this total, 203 cases were excluded and 294 cases were eligible for T-CPR. Of these eligible cases, dispatchers proposed T-CPR on 202 occasions (69% of eligible cases). They also erroneously proposed T-CPR on 17 occasions when a CA was wrongly identified (false positives). This represents 7.8% of all T-CPR. No costs were incurred to implement our study protocol and procedures. CONCLUSIONS: This study demonstrates that it is possible, using a brief sensitization campaign but without any specific training, to implement systematic dispatcher-assisted cardiopulmonary resuscitation in a non-Advanced Medical Priority Dispatch System such as our EMS, which had no prior experience with systematic T-CPR. The results in terms of T-CPR delivery rate and false positives are similar to those found in previous studies. We found our results satisfying given the short time frame of this study. Our results demonstrate that it is possible to improve the quality of emergency services at moderate or even no additional cost, and this should be of interest to all EMS that do not presently benefit from using T-CPR procedures. EMS that currently do not offer T-CPR should consider implementing this technique as soon as possible, and we expect our experience may provide answers to those planning to incorporate T-CPR into their daily practice.
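The reported proportions follow directly from the counts quoted above; the short sketch below (plain Python, using only the figures given in the abstract) shows how the 69% delivery rate and the 7.8% false-positive share are obtained, including the denominator for the latter.

```python
# Reproducing the proportions quoted in the abstract.
eligible = 294          # eligible non-traumatic cardiac arrests
tcpr_proposed = 202     # T-CPR proposed among eligible cases
false_positives = 17    # T-CPR proposed when CA was wrongly suspected

delivery_rate = tcpr_proposed / eligible                        # 202/294 ≈ 0.69
fp_share = false_positives / (tcpr_proposed + false_positives)  # 17/219 ≈ 0.078

print(f"T-CPR delivery rate: {delivery_rate:.1%}")         # -> 68.7%
print(f"False positives among all T-CPR: {fp_share:.1%}")  # -> 7.8%
```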
Abstract:
The mission of the Encyclopedia of DNA Elements (ENCODE) Project is to enable the scientific and medical communities to interpret the human genome sequence and apply it to understand human biology and improve health. The ENCODE Consortium is integrating multiple technologies and approaches in a collective effort to discover and define the functional elements encoded in the human genome, including genes, transcripts, and transcriptional regulatory regions, together with their attendant chromatin states and DNA methylation patterns. In the process, standards to ensure high-quality data have been implemented, and novel algorithms have been developed to facilitate analysis. Data and derived results are made available through a freely accessible database. Here we provide an overview of the project and the resources it is generating and illustrate the application of ENCODE data to interpret the human genome.
Abstract:
This study represents the most extensive analysis of batch-to-batch variations in spray paint samples to date. The survey was performed as a collaborative project of the ENFSI (European Network of Forensic Science Institutes) Paint and Glass Working Group (EPG) and involved 11 laboratories. Several studies have already shown that paint samples of similar color but from different manufacturers can usually be differentiated using an appropriate analytical sequence. The discrimination of paints from the same manufacturer and color (batch-to-batch variations) is of great interest, and such data are seldom found in the literature. This survey concerns the analysis of batches from different color groups (white, papaya (a special shade of orange), red and black) with a wide range of analytical techniques and leads to the following conclusions. Colored batch samples are more likely to be differentiated since their pigment composition is more complex (pigment mixtures, added pigments) and therefore subject to variations. These variations may occur during paint production but may also occur when checking the paint shade in quality control processes. For these samples, techniques aimed at color/pigment(s) characterization (optical microscopy, microspectrophotometry (MSP), Raman spectroscopy) provide better discrimination than techniques aimed at the organic (binder) or inorganic composition (Fourier transform infrared spectroscopy (FTIR) or elemental analysis (SEM, scanning electron microscopy, and XRF, X-ray fluorescence)). White samples contain mainly titanium dioxide as a pigment and the main differentiation is based on the binder composition (C–H stretches) detected either by FTIR or Raman. The inorganic composition (elemental analysis) also provides some discrimination. Black samples contain mainly carbon black as a pigment and are problematic with most of the spectroscopic techniques. In this case, pyrolysis-GC/MS represents the best technique to detect differences. Overall, Py-GC/MS may show a high potential for discrimination on all samples, but the results are highly dependent on the specific instrumental conditions used. Finally, the discrimination of samples when data were interpreted visually compared with statistically using principal component analysis (PCA) yielded very similar results. PCA increases sensitivity and could perform better on specific samples, but one first has to ensure that all non-informative variation (baseline deviation) is eliminated by applying correct pre-treatments. Statistical treatments can be used on a large data set and, when combined with an expert's opinion, will provide more objective criteria for decision making.
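As a rough illustration of the statistical treatment mentioned above (removal of non-informative baseline variation followed by PCA), the sketch below applies a simple per-spectrum linear baseline correction before computing principal component scores. The spectra are synthetic placeholders, not data from the ENFSI survey, and the linear de-trending merely stands in for whatever pre-treatment a laboratory would actually validate.

```python
# Sketch: remove non-informative baseline variation, then apply PCA.
# All spectra below are synthetic placeholders, not survey data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_spectra, n_points = 30, 500
wavenumbers = np.linspace(400, 4000, n_points)

# Two simulated "batches" sharing one peak with slightly different intensity,
# plus a sloped baseline and measurement noise.
peak = np.exp(-((wavenumbers - 1600.0) ** 2) / 50.0)
batch = rng.integers(0, 2, n_spectra)
spectra = (1.0 + 0.05 * batch)[:, None] * peak
spectra += rng.uniform(0.0, 0.2, (n_spectra, 1)) * (wavenumbers / 4000.0)  # baseline slope
spectra += rng.normal(0.0, 0.005, (n_spectra, n_points))                   # noise

def detrend(y, x):
    """Subtract a first-order polynomial baseline from one spectrum."""
    return y - np.polyval(np.polyfit(x, y, deg=1), x)

corrected = np.array([detrend(s, wavenumbers) for s in spectra])
scores = PCA(n_components=2).fit_transform(corrected)
# After baseline correction, batch-related differences should dominate the leading components.
print(scores[:5])
```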
Abstract:
OBJECTIVE: Studies of major depression in twins and families have shown moderate to high heritability, but extensive molecular studies have failed to identify susceptibility genes convincingly. To detect genetic variants contributing to major depression, the authors performed a genome-wide association study using 1,636 cases of depression ascertained in the U.K. and 1,594 comparison subjects screened negative for psychiatric disorders. METHOD: Cases were collected from 1) a case-control study of recurrent depression (the Depression Case Control [DeCC] study; N=1346), 2) an affected sibling pair linkage study of recurrent depression (probands from the Depression Network [DeNT] study; N=332), and 3) a pharmacogenetic study (the Genome-Based Therapeutic Drugs for Depression [GENDEP] study; N=88). Depression cases and comparison subjects were genotyped at Centre National de Génotypage on the Illumina Human610-Quad BeadChip. After applying stringent quality control criteria for missing genotypes, departure from Hardy-Weinberg equilibrium, and low minor allele frequency, the authors tested for association to depression using logistic regression, correcting for population ancestry. RESULTS: Single nucleotide polymorphisms (SNPs) in BICC1 achieved suggestive evidence for association, which strengthened after imputation of ungenotyped markers, and in analysis of female depression cases. A meta-analysis of U.K. data with previously published results from studies in Munich and Lausanne showed some evidence for association near neuroligin 1 (NLGN1) on chromosome 3, but did not support findings at BICC1. CONCLUSIONS: This study identifies several signals for association worthy of further investigation but, as in previous genome-wide studies, suggests that individual gene contributions to depression are likely to have only minor effects, and very large pooled analyses will be required to identify them.
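For readers unfamiliar with this kind of pipeline, the sketch below illustrates, with simulated genotypes and arbitrary thresholds (not the authors' data or cut-offs), how the quality-control filters named above (missing genotypes, Hardy-Weinberg equilibrium, minor allele frequency) and a per-SNP logistic regression correcting for ancestry can be written.

```python
# Illustrative sketch of GWAS QC filtering and per-SNP logistic regression.
# Genotypes, phenotypes, thresholds and ancestry covariates are simulated placeholders.
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_samples, n_snps = 500, 200
mafs = rng.uniform(0.05, 0.5, n_snps)                               # per-SNP allele frequencies
genotypes = rng.binomial(2, mafs, (n_samples, n_snps)).astype(float)  # 0/1/2 dosages
genotypes[rng.random((n_samples, n_snps)) < 0.01] = np.nan          # missing calls
phenotype = rng.integers(0, 2, n_samples)                           # case/control status
ancestry_pcs = rng.normal(size=(n_samples, 2))                      # population ancestry axes

def passes_qc(g, max_missing=0.02, min_maf=0.01, hwe_alpha=1e-6):
    """Missingness, minor-allele-frequency and Hardy-Weinberg filters."""
    if np.isnan(g).mean() > max_missing:
        return False
    called = g[~np.isnan(g)]
    p = called.mean() / 2.0
    if min(p, 1.0 - p) < min_maf:
        return False
    obs = np.bincount(called.astype(int), minlength=3)
    exp = len(called) * np.array([(1 - p) ** 2, 2 * p * (1 - p), p ** 2])
    chi2 = ((obs - exp) ** 2 / np.maximum(exp, 1e-9)).sum()
    return stats.chi2.sf(chi2, df=1) > hwe_alpha    # keep SNPs consistent with HWE

pvalues = {}
for j in range(n_snps):
    g = genotypes[:, j]
    if not passes_qc(g):
        continue
    keep = ~np.isnan(g)
    X = sm.add_constant(np.column_stack([g[keep], ancestry_pcs[keep]]))
    fit = sm.Logit(phenotype[keep], X).fit(disp=0)
    pvalues[j] = fit.pvalues[1]      # association p-value for the genotype dosage term

print("smallest association p-value:", min(pvalues.values()))
```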
Abstract:
BACKGROUND: "Virtual" autopsy by postmortem computed tomography (PMCT) can replace medical autopsy to a certain extent but has limitations for cardiovascular diseases. These limitations might be overcome by adding multiphase PMCT angiography. OBJECTIVE: To compare virtual autopsy by multiphase PMCT angiography with medical autopsy. DESIGN: Prospective cohort study. (ClinicalTrials.gov: NCT01541995) SETTING: Single-center study at the University Medical Center Hamburg-Eppendorf, Hamburg, Germany, between 1 April 2012 and 31 March 2013. PATIENTS: Hospitalized patients who died unexpectedly or within 48 hours of an event necessitating cardiopulmonary resuscitation. MEASUREMENTS: Diagnoses from clinical records were compared with findings from both types of autopsy. New diagnoses identified by autopsy were classified as major or minor, depending on whether they would have altered clinical management. RESULTS: Of 143 eligible patients, 50 (35%) had virtual and medical autopsy. Virtual autopsy confirmed 93% of all 336 diagnoses identified from antemortem medical records, and medical autopsy confirmed 80%. In addition, virtual and medical autopsy identified 16 new major and 238 new minor diagnoses. Seventy-three of the virtual autopsy diagnoses, including 32 cases of coronary artery stenosis, were identified solely by multiphase PMCT angiography. Of the 114 clinical diagnoses classified as cardiovascular, 110 were confirmed by virtual autopsy and 107 by medical autopsy. In 11 cases, multiphase PMCT angiography showed "unspecific filling defects," which were not reported by medical autopsy. LIMITATION: These results come from a single center with concerted interest and expertise in postmortem imaging; further studies are thus needed for generalization. CONCLUSION: In cases of unexpected death, the addition of multiphase PMCT angiography increases the value of virtual autopsy, making it a feasible alternative for quality control and identification of diagnoses traditionally made by medical autopsy. PRIMARY FUNDING SOURCE: University Medical Center Hamburg-Eppendorf.
Abstract:
Given the adverse impact of image noise on the perception of important clinical details in digital mammography, routine quality control measurements should include an evaluation of noise. The European Guidelines, for example, employ a second-order polynomial fit of pixel variance as a function of detector air kerma (DAK) to decompose noise into quantum, electronic and fixed pattern (FP) components and assess the DAK range where quantum noise dominates. This work examines the robustness of the polynomial method against an explicit noise decomposition method. The two methods were applied to variance and noise power spectrum (NPS) data from six digital mammography units. Twenty homogeneously exposed images were acquired with PMMA blocks for target DAKs ranging from 6.25 to 1600 µGy. Both methods were explored for the effects of data weighting and squared fit coefficients during the curve fitting, the influence of the additional filter material (2 mm Al versus 40 mm PMMA) and noise de-trending. Finally, spatial stationarity of noise was assessed. Data weighting improved noise model fitting over large DAK ranges, especially at low detector exposures. The polynomial and explicit decompositions generally agreed for quantum and electronic noise but FP noise fraction was consistently underestimated by the polynomial method. Noise decomposition as a function of position in the image showed limited noise stationarity, especially for FP noise; thus the position of the region of interest (ROI) used for noise decomposition may influence fractional noise composition. The ROI area and position used in the Guidelines offer an acceptable estimation of noise components. While there are limitations to the polynomial model, when used with care and with appropriate data weighting, the method offers a simple and robust means of examining the detector noise components as a function of detector exposure.
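As a concrete illustration of the polynomial approach described above, the sketch below fits pixel variance as a second-order polynomial of DAK and reads the constant, linear and quadratic coefficients as electronic, quantum and fixed-pattern contributions, which is the usual interpretation of this model. The variance values and the 1/variance weighting are synthetic assumptions made for the example, not measurements or settings from this work.

```python
# Second-order polynomial decomposition of detector noise:
#   variance(DAK) = f*DAK**2 + q*DAK + e
# with e ~ electronic, q*DAK ~ quantum and f*DAK**2 ~ fixed-pattern noise.
# The variance values below are synthetic, generated from assumed components.
import numpy as np

dak = np.array([6.25, 12.5, 25.0, 50.0, 100.0, 200.0, 400.0, 800.0, 1600.0])  # µGy
true_e, true_q, true_f = 4.0, 2.5, 0.001
variance = true_e + true_q * dak + true_f * dak**2

# Weighted least squares; weighting by 1/variance emphasises the low-exposure
# points, in the spirit of the data weighting discussed above.
f, q, e = np.polyfit(dak, variance, deg=2, w=1.0 / variance)
print(f"fixed pattern: {f:.4g}, quantum: {q:.4g}, electronic: {e:.4g}")

# Fractional noise composition at a given exposure level:
d = 100.0  # µGy
total = e + q * d + f * d**2
print(f"quantum-noise fraction at {d:.0f} µGy: {q * d / total:.2f}")
```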
Abstract:
In most pathology laboratories worldwide, formalin-fixed paraffin-embedded (FFPE) samples are the only tissue specimens available for routine diagnostics. Although commercial kits for diagnostic molecular pathology testing are becoming available, most of the current diagnostic tests are laboratory-based assays. Thus, there is a need for standardized procedures in molecular pathology, starting from the extraction of nucleic acids. To evaluate the current methods for extracting nucleic acids from FFPE tissues, 13 European laboratories, participating in the European FP6 program IMPACTS (www.impactsnetwork.eu), isolated nucleic acids from four diagnostic FFPE tissues using their routine methods, followed by quality assessment. The DNA-extraction protocols ranged from in-house protocols to commercial kits. Except for one in-house protocol, the majority gave comparable results in terms of the quality of the extracted DNA, measured by the ability to amplify differently sized control gene fragments by PCR. For array applications or tests that require an accurately determined DNA input, we recommend using silica-based adsorption columns for DNA recovery. For RNA extractions, the best results were obtained using commercial kits based on chromatography columns, which yielded the highest quantity and best assayable RNA. Quality testing using RT-PCR gave successful amplification of 200-250 bp PCR products from most tested tissues. Modifications of the proteinase K digestion time led to better results, even when commercial kits were applied. The results of the study emphasize the need for quality control of nucleic acid extracts with standardized methods to prevent false negative results and to allow data comparison among different diagnostic laboratories.
Abstract:
The Gene Ontology (GO) (http://www.geneontology.org) is a community bioinformatics resource that represents gene product function through the use of structured, controlled vocabularies. The number of GO annotations of gene products has increased due to curation efforts among GO Consortium (GOC) groups, including focused literature-based annotation and ortholog-based functional inference. The GO ontologies continue to expand and improve as a result of targeted ontology development, including the introduction of computable logical definitions and development of new tools for the streamlined addition of terms to the ontology. The GOC continues to support its user community through the use of e-mail lists, social media and web-based resources.
Abstract:
The Haemophilia Registry of the Swiss Haemophilia Society was created in the year 2000. The latest records, from October 31st 2011, are presented here. Included are all patients with haemophilia A or B and other inherited coagulation disorders (including VWD patients with R-Co activity below 10%) known to and followed by the 11 paediatric and 12 adult haemophilia treatment or reference centers. Currently there are 950 patients registered, the majority of whom (585) have haemophilia A. Disease severity is graded according to ISTH criteria, and its distribution between mild, moderate and severe haemophilia is similar to data from other European and American registries. The majority (about two thirds) of Swiss patients with haemophilia A or B are treated on demand, with only about 20% of patients being on prophylaxis. The figure is different in paediatric patients and young adults (first and second decades of life), where 80 to 90% of patients with haemophilia A are under regular prophylaxis. Interestingly, use of factor concentrates, although readily available, is rather low in Switzerland, especially when taking the country's GDP into account: the total amount of factor VIII and IX used was 4.94 U per capita, comparable to other European countries with distinctly lower incomes (Poland, Slovakia, Hungary). This finding is mainly due to the aforementioned low rate of prophylactic treatment of haemophilia in our country. Our registry remains an important instrument of quality control of haemophilia therapy in Switzerland.
Abstract:
INTRODUCTION: In November 2009, the "3rd Summit on Osteoporosis-Central and Eastern Europe (CEE)" was held in Budapest, Hungary. The conference aimed to tackle issues regarding osteoporosis management in CEE identified during the second CEE summit in 2008 and to agree on approaches that allow the most efficient and cost-effective diagnosis and therapy of osteoporosis in CEE countries in the future. DISCUSSION: The following topics were covered: the past year's experience from FRAX® implementation into local diagnostic algorithms; causes of secondary osteoporosis as a FRAX® risk factor; bone turnover markers to estimate bone loss, fracture risk, or monitor therapies; the role of quantitative ultrasound in osteoporosis management; compliance and economic aspects of osteoporosis; and osteoporosis and genetics. Consensus and recommendations developed on these topics are summarised in the present progress report. CONCLUSION: Lectures on up-to-date data of topical interest, the distinct regional provenances of the participants, a special focus on practical aspects, intense mutual exchange of individual experiences, strong interest in cross-border cooperation, and the readiness to learn from each other contributed considerably to the establishment of these recommendations. The "4th Summit on Osteoporosis-CEE", held in Prague, Czech Republic, in December 2010, will reveal whether these recommendations prove of value when implemented in clinical routine or whether further improvements are still required.
Abstract:
The Swiss Haemophilia Registry of the Medical Committee of the Swiss Haemophilia Society was established in 2000. It primarily contains epidemiological and basic clinical data (incidence, type and severity of the disease, age groups, centres, mortality). Two thirds of the questions of the WFH Global Survey can be answered from it, especially those concerning the use of concentrates (global, per capita) and treatment modalities (on-demand versus prophylactic regimens). Moreover, the registry is an important tool for quality control of the haemophilia treatment centres. There is no information about infectious diseases such as hepatitis or HIV, because the data are not anonymised. We plan to incorporate the results of mutation analysis in the future.
Abstract:
A simple and sensitive liquid chromatography-electrospray ionization mass spectrometry method was developed for the simultaneous quantification in human plasma of all selective serotonin reuptake inhibitors (citalopram, fluoxetine, fluvoxamine, paroxetine and sertraline) and their main active metabolites (desmethyl-citalopram and norfluoxetine). A stable isotope-labeled internal standard was used for each analyte to compensate for the global method variability, including extraction and ionization variations. After sample (250 μl) pre-treatment with acetonitrile (500 μl) to precipitate proteins, a fast solid-phase extraction procedure was performed using a mixed-mode Oasis MCX 96-well plate. Chromatographic separation was achieved in less than 9.0 min on an XBridge C18 column (2.1 × 100 mm; 3.5 μm) using a gradient of ammonium acetate (pH 8.1; 50 mM) and acetonitrile as mobile phase at a flow rate of 0.3 ml/min. The method was fully validated according to Société Française des Sciences et Techniques Pharmaceutiques protocols and the latest Food and Drug Administration guidelines. Six-point calibration curves were used to cover a large concentration range: 1-500 ng/ml for citalopram, desmethyl-citalopram, paroxetine and sertraline; 1-1000 ng/ml for fluoxetine and fluvoxamine; and 2-1000 ng/ml for norfluoxetine. Good quantitative performance was achieved in terms of trueness (84.2-109.6%), repeatability (0.9-14.6%) and intermediate precision (1.8-18.0%) over the entire assay range, including the lower limit of quantification. Internal-standard-normalized matrix effects were lower than 13%. The accuracy profiles (total error) were mainly included within the acceptance limits of ±30% for biological samples. The method was successfully applied to the routine therapeutic drug monitoring of more than 1600 patient plasma samples over 9 months. The β-expectation tolerance intervals determined during the validation phase were consistent with the results of quality control samples analyzed during routine use. This method is therefore precise and suitable both for therapeutic drug monitoring and for pharmacokinetic studies in most clinical laboratories.
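For readers less familiar with these validation metrics, the sketch below shows one common way of deriving trueness, repeatability and intermediate precision from replicate quality-control measurements spread over several runs, using a one-way variance decomposition. The nominal concentration and the measurements are invented placeholders, and the calculation is a generic textbook approach rather than the validation protocol actually used here.

```python
# Illustrative calculation of trueness, repeatability and intermediate precision
# from replicate QC measurements made over several runs (hypothetical data).
import numpy as np

nominal = 100.0  # ng/ml, hypothetical QC level
# rows = runs (e.g. days), columns = replicates within a run
qc = np.array([
    [ 98.1, 101.5,  99.7],
    [103.2, 102.0, 104.1],
    [ 97.5,  99.0,  98.2],
    [101.1, 100.4, 102.3],
])

n_runs, n_reps = qc.shape
grand_mean = qc.mean()
trueness = 100.0 * grand_mean / nominal

# One-way ANOVA: within-run (repeatability) and between-run variance components.
ms_within = qc.var(axis=1, ddof=1).mean()
ms_between = n_reps * qc.mean(axis=1).var(ddof=1)
var_repeat = ms_within
var_between = max((ms_between - ms_within) / n_reps, 0.0)
var_intermediate = var_repeat + var_between   # intermediate precision variance

cv_repeat = 100.0 * np.sqrt(var_repeat) / grand_mean
cv_intermediate = 100.0 * np.sqrt(var_intermediate) / grand_mean
print(f"trueness {trueness:.1f}%, repeatability CV {cv_repeat:.1f}%, "
      f"intermediate precision CV {cv_intermediate:.1f}%")
```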
Abstract:
Background and objective: Patients in the ICU often receive many intravenous (IV) drugs at the same time. Even with three-lumen central venous catheters, the administration of more than one drug in the same IV line (IVL) is frequently necessary. The objective of this study was to observe how nurses managed the administration of these numerous medications and to evaluate the proportion of two-drug associations (TDA) that are compatible or not, based on known compatibility data. Design: Observational prospective study over 4 consecutive months. All patients simultaneously receiving more than one drug in the same IVL (Y-site injection or mixed in the same container) were included. For each patient, all IV drugs were recorded, as well as their concentration, infusion solution, location on the IVL system, and time, rate and duration of administration. For each association of two or more drugs, the compatibility of each drug with every other drug was checked. Compatibilities between these pairs of drugs were assessed using published data (mainly Trissel LA. Handbook on Injectable Drugs and Trissel's Tables of Physical Compatibility) and visual tests performed in our quality control laboratory. Setting: 34-bed university hospital adult ICU. Main outcome measures: Percentage of compatibilities and incompatibilities between drugs administered in the same IVL. Results: We observed 1,913 associations of drugs administered together in the same IVL, 783 involving only two drugs. The average number of drugs per IVL was 3.1 ± 0.8 (range: 2-9). 83.2% of the drugs were given by continuous infusion, 14.3% by intermittent infusion and 2.5% as a bolus. The observed associations yielded 8,421 pairs of drugs (71.7% drug-drug and 28.3% drug-solute). According to literature data, 80.2% of the associations were considered compatible and 4.4% incompatible; 15.4% were not interpretable because of differences between local practices and those described in the literature (drug concentration, solute, etc.) or because of a lack of data. After laboratory tests performed on the most frequently used drugs (furosemide, KH2PO4, morphine HCl, etc.), the proportion of compatible TDA rose to 85.7%, the incompatible proportion remained at 4.6%, and only 9.7% remained unknown or not interpretable. Conclusions: Nurses managed the administration of IV medications quite well, as fewer than 5% of observed TDA were considered incompatible. However, the 10% of TDA with unavailable compatibility data should also have been avoided, since the consequences of their concomitant administration cannot be predicted. For practical reasons, drugs were analysed only in pairs, which constitutes the main limitation of this work. Since the average number of drugs in an association was three, laboratory tests are currently being performed to evaluate some of the most frequently observed three-drug associations.
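The pairwise check described above can be pictured with a small sketch: every set of drugs infused in the same IV line is broken into pairs, and each pair is looked up in a compatibility table, with missing entries treated as unknown/not interpretable. The table entries and the drug list below are hypothetical placeholders, not values taken from Trissel's references or from the study's data set.

```python
# Sketch: break an IV-line drug association into pairs and classify each pair
# against a compatibility table. Table entries below are hypothetical.
from itertools import combinations

# (drug_a, drug_b) -> "compatible" | "incompatible"; unknown pairs are absent.
compatibility = {
    frozenset(("furosemide", "morphine HCl")): "incompatible",
    frozenset(("morphine HCl", "midazolam")): "compatible",
    frozenset(("furosemide", "KH2PO4")): "compatible",
}

def classify(pair):
    return compatibility.get(frozenset(pair), "unknown/not interpretable")

iv_line = ["furosemide", "morphine HCl", "midazolam"]  # drugs sharing one line

counts = {}
for pair in combinations(iv_line, 2):
    verdict = classify(pair)
    counts[verdict] = counts.get(verdict, 0) + 1
    print(f"{pair[0]} + {pair[1]}: {verdict}")

total = sum(counts.values())
for verdict, n in counts.items():
    print(f"{verdict}: {n}/{total} ({100.0 * n / total:.0f}%)")
```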
Abstract:
Peripheral assessment of bone density using photon absorptiometry techniques has been available for over 40 yr. The initial use of radio-isotopes as the photon source has been replaced by the use of X-ray technology. A wide variety of models of single- or dual-energy X-ray measurement tools have been made available for purchase, although not all are still commercially available. The Official Positions of the International Society for Clinical Densitometry (ISCD) have been developed following a systematic review of the literature by an ISCD task force and a subsequent Position Development Conference. These cover the technological diversity among peripheral dual-energy X-ray absorptiometry (pDXA) devices; define whether pDXA can be used for fracture risk assessment and/or to diagnose osteoporosis; examine whether pDXA can be used to initiate treatment and/or monitor treatment; provide recommendations for pDXA reporting; and review quality assurance and quality control necessary for effective use of pDXA.