935 results for Electronic data processing -- Quality control


Relevance: 100.00%

Publisher:

Abstract:

Obesity is recognised as a global epidemic and the most prevalent metabolic disease world-wide. Specialised obesity services, however, are not widely available in Europe, and obesity care can vary enormously across European regions. The European Association for the Study of Obesity (EASO, www.easo.org) has developed these criteria to form a pan-European network of accredited EASO-Collaborating Centres for Obesity Management (EASO-COMs) in accordance with accepted European and academic guidelines. This network will include university, public and private clinics and will ensure that the obese and overweight patient is managed by a holistic team of specialists and receives comprehensive state-of-the-art clinical care. Furthermore, the participating centres, under the umbrella of EASO, will work closely together for quality control, data collection and analysis, as well as for education and research for the advancement of obesity care and obesity science.

Relevance: 100.00%

Publisher:

Abstract:

OBJECTIVE: In order to improve the quality of our Emergency Medical Services (EMS), to raise bystander cardiopulmonary resuscitation rates and thereby meet what is becoming a universal standard in terms of quality of emergency services, we decided to implement systematic dispatcher-assisted or telephone-CPR (T-CPR) in our medical dispatch center, a non-Advanced Medical Priority Dispatch System. The aim of this article is to describe the implementation process, costs and results following the introduction of this new "quality" procedure. METHODS: This was a prospective study. Over an 8-week period, our EMS dispatchers were given new procedures to provide T-CPR. We then collected data on all non-traumatic cardiac arrests within our state (Vaud, Switzerland) for the following 12 months. For each event, the dispatchers had to record in writing the reason they either ruled out cardiac arrest (CA) or did not propose T-CPR in the event they did suspect CA. All emergency call recordings were reviewed by the medical director of the EMS. The analysis of the recordings and the dispatchers' written explanations were then compared. RESULTS: During the 12-month study period, a total of 497 patients (both adults and children) were identified as having a non-traumatic cardiac arrest. Of this total, 203 cases were excluded and 294 cases were eligible for T-CPR. Of these eligible cases, dispatchers proposed T-CPR on 202 occasions (69% of eligible cases). They also erroneously proposed T-CPR on 17 occasions when a CA was wrongly identified (false positives). This represents 7.8% of all T-CPR. No costs were incurred to implement our study protocol and procedures.
CONCLUSIONS: This study demonstrates that it is possible, using a brief sensitization campaign but without any specific training, to implement systematic dispatcher-assisted cardiopulmonary resuscitation in a non-Advanced Medical Priority Dispatch System such as our EMS, which had no prior experience with systematic T-CPR. The results in terms of T-CPR delivery rate and false positives are similar to those found in previous studies. We found our results satisfying given the short time frame of this study. Our results demonstrate that it is possible to improve the quality of emergency services at moderate or even no additional cost, which should be of interest to all EMS that do not presently benefit from T-CPR procedures. EMS that currently do not offer T-CPR should consider implementing this technique as soon as possible, and we expect our experience may provide answers to those planning to incorporate T-CPR into their daily practice.
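The rates reported above follow directly from the counts in the abstract; a quick sketch of the arithmetic (all numbers taken from the abstract itself):

```python
# Rate calculations from the counts reported in the abstract:
# 497 identified arrests, 203 excluded, 202 T-CPR proposed among
# eligible cases, 17 false-positive T-CPR.

identified = 497                 # non-traumatic cardiac arrests, 12 months
excluded = 203                   # cases not eligible for T-CPR
eligible = identified - excluded            # 294 eligible cases
tcpr_eligible = 202                         # T-CPR proposed for true CA
tcpr_false_positive = 17                    # T-CPR proposed, CA wrongly identified

delivery_rate = tcpr_eligible / eligible               # ~0.69 (69%)
all_tcpr = tcpr_eligible + tcpr_false_positive         # 219 T-CPR in total
fp_fraction = tcpr_false_positive / all_tcpr           # ~0.078 (7.8%)

print(f"eligible cases: {eligible}")
print(f"T-CPR delivery rate: {delivery_rate:.1%}")
print(f"false-positive fraction of all T-CPR: {fp_fraction:.1%}")
```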

Relevance: 100.00%

Publisher:

Abstract:

The mission of the Encyclopedia of DNA Elements (ENCODE) Project is to enable the scientific and medical communities to interpret the human genome sequence and apply it to understand human biology and improve health. The ENCODE Consortium is integrating multiple technologies and approaches in a collective effort to discover and define the functional elements encoded in the human genome, including genes, transcripts, and transcriptional regulatory regions, together with their attendant chromatin states and DNA methylation patterns. In the process, standards to ensure high-quality data have been implemented, and novel algorithms have been developed to facilitate analysis. Data and derived results are made available through a freely accessible database. Here we provide an overview of the project and the resources it is generating and illustrate the application of ENCODE data to interpret the human genome.

Relevance: 100.00%

Publisher:

Abstract:

OBJECTIVE: Studies of major depression in twins and families have shown moderate to high heritability, but extensive molecular studies have failed to identify susceptibility genes convincingly. To detect genetic variants contributing to major depression, the authors performed a genome-wide association study using 1,636 cases of depression ascertained in the U.K. and 1,594 comparison subjects screened negative for psychiatric disorders. METHOD: Cases were collected from 1) a case-control study of recurrent depression (the Depression Case Control [DeCC] study; N=1346), 2) an affected sibling pair linkage study of recurrent depression (probands from the Depression Network [DeNT] study; N=332), and 3) a pharmacogenetic study (the Genome-Based Therapeutic Drugs for Depression [GENDEP] study; N=88). Depression cases and comparison subjects were genotyped at Centre National de Génotypage on the Illumina Human610-Quad BeadChip. After applying stringent quality control criteria for missing genotypes, departure from Hardy-Weinberg equilibrium, and low minor allele frequency, the authors tested for association to depression using logistic regression, correcting for population ancestry. RESULTS: Single nucleotide polymorphisms (SNPs) in BICC1 achieved suggestive evidence for association, which strengthened after imputation of ungenotyped markers, and in analysis of female depression cases. A meta-analysis of U.K. data with previously published results from studies in Munich and Lausanne showed some evidence for association near neuroligin 1 (NLGN1) on chromosome 3, but did not support findings at BICC1. CONCLUSIONS: This study identifies several signals for association worthy of further investigation but, as in previous genome-wide studies, suggests that individual gene contributions to depression are likely to have only minor effects, and very large pooled analyses will be required to identify them.
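As a hedged illustration (not the authors' actual pipeline), the three quality-control filters named in the abstract — missing genotypes, departure from Hardy-Weinberg equilibrium, and low minor allele frequency — can be sketched as a per-SNP check; the thresholds used here are common GWAS choices assumed for the example, not values from the study:

```python
# Illustrative per-SNP QC check for the three filters named in the abstract.
# Thresholds (2% missingness, MAF 1%, HWE p < 1e-6) are assumed defaults,
# not taken from the study itself.

def snp_passes_qc(n_aa, n_ab, n_bb, n_missing,
                  max_missing=0.02, min_maf=0.01, hwe_chisq_crit=23.93):
    """Genotype counts for one SNP; returns True if it survives QC."""
    n = n_aa + n_ab + n_bb
    # 1) missing-genotype (call-rate) filter
    if n_missing / (n + n_missing) > max_missing:
        return False
    # 2) minor allele frequency filter
    p = (2 * n_aa + n_ab) / (2 * n)            # frequency of allele A
    if min(p, 1 - p) < min_maf:
        return False
    # 3) Hardy-Weinberg equilibrium: 1-df chi-square vs expected counts
    expected = (n * p * p, 2 * n * p * (1 - p), n * (1 - p) ** 2)
    chisq = sum((o - e) ** 2 / e
                for o, e in zip((n_aa, n_ab, n_bb), expected))
    return chisq <= hwe_chisq_crit             # ~ p >= 1e-6 for 1 df

print(snp_passes_qc(250, 500, 250, 5))     # in HWE, common allele -> True
print(snp_passes_qc(400, 200, 400, 0))     # heterozygote deficit -> False
```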

Relevance: 100.00%

Publisher:

Abstract:

Given the adverse impact of image noise on the perception of important clinical details in digital mammography, routine quality control measurements should include an evaluation of noise. The European Guidelines, for example, employ a second-order polynomial fit of pixel variance as a function of detector air kerma (DAK) to decompose noise into quantum, electronic and fixed pattern (FP) components and assess the DAK range where quantum noise dominates. This work examines the robustness of the polynomial method against an explicit noise decomposition method. The two methods were applied to variance and noise power spectrum (NPS) data from six digital mammography units. Twenty homogeneously exposed images were acquired with PMMA blocks for target DAKs ranging from 6.25 to 1600 µGy. Both methods were explored for the effects of data weighting and squared fit coefficients during the curve fitting, the influence of the additional filter material (2 mm Al versus 40 mm PMMA) and noise de-trending. Finally, spatial stationarity of noise was assessed. Data weighting improved noise model fitting over large DAK ranges, especially at low detector exposures. The polynomial and explicit decompositions generally agreed for quantum and electronic noise but FP noise fraction was consistently underestimated by the polynomial method. Noise decomposition as a function of position in the image showed limited noise stationarity, especially for FP noise; thus the position of the region of interest (ROI) used for noise decomposition may influence fractional noise composition. The ROI area and position used in the Guidelines offer an acceptable estimation of noise components. While there are limitations to the polynomial model, when used with care and with appropriate data weighting, the method offers a simple and robust means of examining the detector noise components as a function of detector exposure.
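As an illustrative sketch of the Guidelines-style decomposition described above (synthetic data; the coefficients are invented assumptions, not values from the study), pixel variance is fitted as a second-order polynomial of DAK, with the constant, linear and quadratic terms read as electronic, quantum and fixed-pattern noise:

```python
# Sketch of the 2nd-order polynomial noise decomposition:
#   var(K) = a + b*K + c*K**2
# where a ~ electronic, b*K ~ quantum, c*K**2 ~ fixed-pattern noise.
# DAK grid matches the abstract (6.25-1600 µGy); coefficients are invented.
import numpy as np

dak = np.array([6.25, 12.5, 25, 50, 100, 200, 400, 800, 1600])  # µGy
a_true, b_true, c_true = 4.0, 2.5, 0.001        # assumed noise components
variance = a_true + b_true * dak + c_true * dak**2

# Weighted least squares (relative weights 1/var**2, which, as the paper
# notes for data weighting generally, matters most at low exposures)
X = np.vstack([np.ones_like(dak), dak, dak**2]).T
w = 1.0 / variance**2
a, b, c = np.linalg.lstsq(X * w[:, None], variance * w, rcond=None)[0]

for k in (6.25, 100, 1600):
    total = a + b * k + c * k**2
    print(f"DAK {k:7.2f} µGy: electronic {a/total:.0%}, "
          f"quantum {b*k/total:.0%}, fixed-pattern {c*k**2/total:.0%}")
```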

Relevance: 100.00%

Publisher:

Abstract:

In most pathology laboratories worldwide, formalin-fixed, paraffin-embedded (FFPE) samples are the only tissue specimens available for routine diagnostics. Although commercial kits for diagnostic molecular pathology testing are becoming available, most current diagnostic tests are laboratory-based assays. Thus, there is a need for standardized procedures in molecular pathology, starting from the extraction of nucleic acids. To evaluate the current methods for extracting nucleic acids from FFPE tissues, 13 European laboratories participating in the European FP6 programme IMPACTS (www.impactsnetwork.eu) isolated nucleic acids from four diagnostic FFPE tissues using their routine methods, followed by quality assessment. The DNA-extraction protocols ranged from home-made protocols to commercial kits. Except for one home-made protocol, the majority gave comparable results in terms of the quality of the extracted DNA, measured by the ability to amplify differently sized control gene fragments by PCR. For array applications or tests that require an accurately determined DNA input, we recommend using silica-based adsorption columns for DNA recovery. For RNA extractions, the best results were obtained using commercial kits based on chromatography columns, which yielded the highest quantity and best assayable RNA. Quality testing using RT-PCR gave successful amplification of 200-250 bp PCR products from most tested tissues. Modifications of the proteinase K digestion time led to better results, even when commercial kits were applied. The results of the study emphasize the need for quality control of nucleic acid extracts with standardised methods to prevent false-negative results and to allow data comparison among different diagnostic laboratories.

Relevance: 100.00%

Publisher:

Abstract:

INTRODUCTION: In November 2009, the "3rd Summit on Osteoporosis-Central and Eastern Europe (CEE)" was held in Budapest, Hungary. The conference aimed to tackle issues regarding osteoporosis management in CEE identified during the second CEE summit in 2008 and to agree on approaches that allow the most efficient and cost-effective diagnosis and therapy of osteoporosis in CEE countries in the future. DISCUSSION: The following topics were covered: past-year experience from FRAX® implementation into local diagnostic algorithms; causes of secondary osteoporosis as a FRAX® risk factor; bone turnover markers to estimate bone loss, fracture risk, or monitor therapies; role of quantitative ultrasound in osteoporosis management; compliance and economic aspects of osteoporosis; and osteoporosis and genetics. Consensus and recommendations developed on these topics are summarised in the present progress report. CONCLUSION: Lectures on up-to-date data of topical interest, the distinct regional provenances of the participants, a special focus on practical aspects, intense mutual exchange of individual experiences, strong interest in cross-border cooperation, as well as the readiness to learn from each other considerably contributed to the establishment of these recommendations. The "4th Summit on Osteoporosis-CEE", held in Prague, Czech Republic, in December 2010, will reveal whether these recommendations prove of value when implemented in the clinical routine or whether further improvements are still required.

Relevance: 100.00%

Publisher:

Abstract:

Contamination of weather radar echoes by anomalous propagation (anaprop) mechanisms remains a serious issue in quality control of radar precipitation estimates. Although significant progress has been made in identifying clutter due to anaprop, there is no unique method that solves the question of data reliability without removing genuine data. The work described here relates to the development of a software application that uses a numerical weather prediction (NWP) model to obtain the temperature, humidity and pressure fields needed to calculate the three-dimensional structure of the atmospheric refractive index, from which a physically based prediction of the incidence of clutter can be made. This technique can be used in conjunction with existing methods for clutter removal by modifying parameters of detectors or filters according to the physical evidence for anomalous propagation conditions. The parabolic equation method (PEM) is a well-established technique for solving the equations for beam propagation in a non-uniformly stratified atmosphere, but although intrinsically very efficient, it is not sufficiently fast to be practicable for near-real-time modelling of clutter over the entire area observed by a typical weather radar. We demonstrate a fast hybrid PEM technique that is capable of providing acceptable results in conjunction with a high-resolution terrain elevation model, using a standard desktop personal computer. We discuss the performance of the method and approaches for the improvement of the model profiles in the lowest levels of the troposphere.
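To illustrate the first step described above — deriving refraction conditions from NWP temperature, humidity and pressure fields — here is a sketch using the standard Bean-and-Dutton radio refractivity formula; the two-level profile is an invented example of a ducting (anaprop-prone) situation, not data from the paper:

```python
# Radio refractivity from the standard Bean & Dutton formula,
#   N = 77.6/T * (P + 4810 * e / T),
# with pressure P and water vapour pressure e in hPa and temperature T in K.
# The profile values below are invented for illustration.

def refractivity(P_hpa, T_kelvin, e_hpa):
    """Radio refractivity N (in N-units)."""
    return 77.6 / T_kelvin * (P_hpa + 4810.0 * e_hpa / T_kelvin)

def modified_refractivity(N, height_m):
    """Modified refractivity M = N + 0.157 * z; anaprop/ducting is
    favoured in layers where M decreases with height."""
    return N + 0.157 * height_m

# Illustrative two-level profile: warm, dry air over a moist surface layer
levels = [(0, 1013.0, 288.0, 10.0), (100, 1001.0, 290.0, 3.0)]
for z, P, T, e in levels:
    N = refractivity(P, T, e)
    print(f"z={z:4d} m  N={N:6.1f}  M={modified_refractivity(N, z):6.1f}")
```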

Relevance: 100.00%

Publisher:

Abstract:

Quality control (QuaCo) in urology is mandatory to standardize or even increase the level of care. While QuaCo is undertaken at every step in the clinical pathway, it should focus on the patient's comorbidities and on the urologist and his or her complication rate. Driven by political and economic pressures, comparisons of QuaCo and outcomes between urologists and institutions are nowadays often performed. However, careful interpretation of these comparisons is mandatory to avoid potential discrimination. Indeed, the reader has to make sure that patient groups and surgical techniques are comparable, that definitions of complications are similar, that the classification of complications is standardized, and finally that the methodology used to collect the data is irreproachable.

Relevance: 100.00%

Publisher:

Abstract:

Neuronal oscillations are an important aspect of EEG recordings. These oscillations are thought to be involved in several cognitive mechanisms. For instance, oscillatory activity is considered a key component of the top-down control of perception. However, measuring this activity and its influence requires precise extraction of frequency components, and this processing is not straightforward. In particular, difficulties in extracting oscillations arise from their time-varying characteristics. Moreover, when phase information is needed, it is of the utmost importance to extract narrow-band signals. This paper presents a novel method using adaptive filters for tracking and extracting these time-varying oscillations. The scheme is designed to maximize the oscillatory behavior at the output of the adaptive filter. It is thus capable of tracking an oscillation and describing its temporal evolution, even during low-amplitude time segments. Moreover, the method can be extended to track several oscillations simultaneously and to use multiple signals. These two extensions are particularly relevant in the framework of EEG data processing, where oscillations are active at the same time in different frequency bands and signals are recorded with multiple sensors. The tracking scheme is first tested with synthetic signals in order to highlight its capabilities. It is then applied to data recorded during a visual shape discrimination experiment to assess its usefulness during EEG processing and in detecting functionally relevant changes. The method is an interesting additional processing step for providing alternative information compared with classical time-frequency analyses and for improving the detection and analysis of cross-frequency couplings.
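The paper's own tracking scheme is not reproduced here; as a loose, minimal illustration of the general idea — an adaptive filter that enhances a narrow-band oscillation buried in noise — the classic LMS adaptive line enhancer can serve (sampling rate, oscillation frequency and step size are all assumptions):

```python
# Minimal LMS adaptive line enhancer: NOT the authors' algorithm, only a
# textbook adaptive filter that enhances the predictable (oscillatory)
# component of a noisy signal. All parameters below are assumptions.
import numpy as np

rng = np.random.default_rng(0)
fs = 250.0                                  # assumed EEG sampling rate (Hz)
t = np.arange(0, 8, 1 / fs)
alpha = np.sin(2 * np.pi * 10 * t)          # clean 10 Hz "alpha" oscillation
x = alpha + rng.normal(0.0, 1.0, t.size)    # oscillation buried in noise

delay, order, mu = 5, 32, 0.002             # assumed filter parameters
w = np.zeros(order)
y = np.zeros_like(x)                        # tracked narrow-band estimate
for n in range(delay + order, x.size):
    u = x[n - delay - order:n - delay][::-1]  # delayed tap-input vector
    y[n] = w @ u                              # predict current sample
    e = x[n] - y[n]                           # unpredictable (noise) part
    w += 2 * mu * e * u                       # LMS weight update

# After convergence the filter output should follow the clean oscillation
corr = np.corrcoef(y[t.size // 2:], alpha[t.size // 2:])[0, 1]
print(f"correlation with clean oscillation: {corr:.2f}")
```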

Relevance: 100.00%

Publisher:

Abstract:

Basal ganglia and brain stem nuclei are involved in the pathophysiology of various neurological and neuropsychiatric disorders. Currently available structural T1-weighted (T1w) magnetic resonance images do not provide sufficient contrast for reliable automated segmentation of various subcortical grey matter structures. We use a novel, semi-quantitative magnetization transfer (MT) imaging protocol that overcomes limitations in T1w images, which are mainly due to their sensitivity to the high iron content in subcortical grey matter. We demonstrate improved automated segmentation of putamen, pallidum, pulvinar and substantia nigra using MT images. A comparison with segmentation of high-quality T1w images was performed in 49 healthy subjects. Our results show that MT maps are highly suitable for automated segmentation, and so for multi-subject morphometric studies with a focus on subcortical structures.

Relevance: 100.00%

Publisher:

Abstract:

The goal of this work is to develop a method to objectively compare the performance of a digital and a screen-film mammography system in terms of image quality. The method takes into account the dynamic range of the image detector, the detection of high and low contrast structures, the visualisation of the images and the observer response. A test object, designed to represent a compressed breast, was constructed from various tissue equivalent materials ranging from purely adipose to purely glandular composition. Different areas within the test object permitted the evaluation of low and high contrast detection, spatial resolution and image noise. All the images (digital and conventional) were captured using a CCD camera to include the visualisation process in the image quality assessment. A mathematical model observer (non-prewhitening matched filter), that calculates the detectability of high and low contrast structures using spatial resolution, noise and contrast, was used to compare the two technologies. Our results show that for a given patient dose, the detection of high and low contrast structures is significantly better for the digital system than for the conventional screen-film system studied. The method of using a test object with a large tissue composition range combined with a camera to compare conventional and digital imaging modalities can be applied to other radiological imaging techniques. In particular it could be used to optimise the process of radiographic reading of soft copy images.
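A hedged sketch of a non-prewhitening matched-filter figure of merit of the general form d'² = (Σ|S|²)² / Σ(|S|²·NPS), with S the expected signal spectrum and NPS the noise power spectrum; the object, contrast and noise shapes below are invented for illustration, not the paper's data:

```python
# Generic NPW detectability index; the disc object, its contrast and the
# two NPS shapes are assumptions, not the test object used in the study.
import numpy as np

def npw_dprime(signal_2d, nps_2d):
    """d' = sum|S|^2 / sqrt(sum |S|^2 * NPS), S = DFT of expected signal."""
    S2 = np.abs(np.fft.fft2(signal_2d)) ** 2
    return np.sum(S2) / np.sqrt(np.sum(S2 * nps_2d))

n = 128
y, x = np.mgrid[:n, :n] - n // 2
disc = (x**2 + y**2 <= 5**2).astype(float) * 0.05   # low-contrast disc

# Flat (white) NPS vs the same total noise power weighted toward low
# frequencies: the NPW observer is penalised when noise overlaps the signal.
white_nps = np.ones((n, n))
f = np.sqrt(np.fft.fftfreq(n)[:, None] ** 2 + np.fft.fftfreq(n)[None, :] ** 2)
low_nps = 1.0 / (1.0 + (f / 0.05) ** 2)
low_nps *= white_nps.sum() / low_nps.sum()          # equal total noise power

print(f"d' with white NPS: {npw_dprime(disc, white_nps):.2f}")
print(f"d' with low-frequency NPS: {npw_dprime(disc, low_nps):.2f}")
```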

Relevance: 100.00%

Publisher:

Abstract:

PURPOSE: Almost five years have elapsed since the introduction of latanoprost on several markets and, considering the large number of publications dealing with it, the authors felt that it was worth re-evaluating the drug. METHODS: The criterion used to select trials for inclusion in the review was as follows: all articles mentioning the drug in common electronic databases were identified; these were then screened and considered on the basis of methodological quality. RESULTS: Experimental data suggest that latanoprost acts by remodeling the extracellular matrix in the ciliary muscle, thus increasing the flow of aqueous humor through the ciliary muscle bundles of the uveoscleral pathway. POAG: Latanoprost persistently improves the pulsatile ocular blood flow in primary open angle glaucoma (POAG). Recent trials confirmed the greater IOP-lowering efficacy of latanoprost vs. timolol, dorzolamide, brimonidine and unoprostone. Trials lasting up to 24 months showed that latanoprost is effective in long-term treatment of POAG and ocular hypertension (OH), with no signs of loss of efficacy when compared to timolol or dorzolamide. Latanoprost provides better control of circadian IOP. Non-responders to beta-blockers should preferably be switched to latanoprost monotherapy before a combination therapy is started. The possibility of a fixed combination of latanoprost and timolol has been explored, with promising results. NTG: Latanoprost is effective in normal tension glaucoma (NTG), lowering IOP, improving pulsatile ocular blood flow and increasing ocular perfusion pressure. OTHER GLAUCOMAS: Latanoprost may provide effective IOP control in angle-closure glaucoma after iridectomy, in pigmentary glaucoma, glaucoma after cataract extraction and steroid-induced glaucoma. However, latanoprost was effective in only a minority of pediatric cases of glaucoma and is contraindicated in all forms of uveitic glaucoma. SAFETY: In the articles reviewed, new or duration-related adverse events were reported.

Relevance: 100.00%

Publisher:

Abstract:

The aim of this work is to develop an application for carrying out on-line multivariate statistical control of an SBR plant. This tool must allow a complete multivariate statistical analysis of the batch currently in process, of the last completed batch, and of the other batches processed at the plant. The application is to be developed in the LabVIEW environment; this choice is determined by the upgrade of the plant's monitoring module, which is being developed in the same environment.
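The application itself is a LabVIEW module; as a language-neutral sketch of the core statistic of multivariate statistical process control, here is a minimal Hotelling T² check in Python (the plant variables, covariance and control limit are invented illustrations, not SBR data):

```python
# Minimal Hotelling T^2 control chart: the basic multivariate SPC statistic.
# Reference data, covariance and observations are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)

# "Historical" in-control reference batches: m observations of p variables
# (imagined as, e.g., pH, dissolved oxygen, ORP for an SBR cycle).
m, p = 200, 3
ref = rng.multivariate_normal(
    mean=[7.0, 2.0, 350.0],
    cov=[[0.04, 0.01, 0.5],
         [0.01, 0.09, 0.3],
         [0.5, 0.3, 100.0]],
    size=m)
x_bar = ref.mean(axis=0)
s_inv = np.linalg.inv(np.cov(ref, rowvar=False))

def hotelling_t2(x):
    """Hotelling T^2 distance of a new observation from the reference set."""
    d = np.asarray(x) - x_bar
    return float(d @ s_inv @ d)

# Large-sample upper control limit: ~99.73% chi-square quantile for p = 3
ucl = 14.16

for label, obs in [("in-control", [7.05, 1.95, 352.0]),
                   ("faulty", [7.8, 3.2, 330.0])]:
    t2 = hotelling_t2(obs)
    print(f"{label}: T^2 = {t2:.1f} ({'alarm' if t2 > ucl else 'ok'})")
```

Each new observation is reduced to a single distance from the in-control region, which is what makes the chart usable on-line for batch monitoring.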