46 results for Validation and certification competences process
Abstract:
BACKGROUND Recently, two simple clinical scores were published to predict survival in trauma patients. Both scores may successfully guide major trauma triage, but neither has been independently validated in a hospital setting. METHODS This is a cohort study with 30-day mortality as the primary outcome to validate two new trauma scores – the Mechanism, Glasgow Coma Scale (GCS), Age, and Pressure (MGAP) score and the GCS, Age and Pressure (GAP) score – using data from the UK Trauma Audit and Research Network. First, an assessment of discrimination, using the area under the receiver operating characteristic (ROC) curve, and calibration, comparing mortality rates with those originally published, were performed. Second, we calculated sensitivity, specificity, predictive values, and likelihood ratios for prognostic score performance. Third, we propose new cutoffs for the risk categories. RESULTS A total of 79,807 adult (≥16 years) major trauma patients (2000-2010) were included; 5,474 (6.9%) died. Mean (SD) age was 51.5 (22.4) years, median GCS score was 15 (interquartile range, 15-15), and median Injury Severity Score (ISS) was 9 (interquartile range, 9-16). More than 50% of the patients had a low-risk GAP or MGAP score (1% mortality). With regard to discrimination, areas under the ROC curve were 87.2% for GAP score (95% confidence interval, 86.7-87.7) and 86.8% for MGAP score (95% confidence interval, 86.2-87.3). With regard to calibration, 2,390 (3.3%), 1,900 (28.5%), and 1,184 (72.2%) patients died in the low, medium, and high GAP risk categories, respectively. In the low- and medium-risk groups, these were almost double the previously published rates. For MGAP, 1,861 (2.8%), 1,455 (15.2%), and 2,158 (58.6%) patients died in the low-, medium-, and high-risk categories, consonant with results originally published. Reclassifying score point cutoffs improved likelihood ratios, sensitivity and specificity, as well as areas under the ROC curve. CONCLUSION We found both scores to be valid triage tools to stratify emergency department patients, according to their risk of death. MGAP calibrated better, but GAP slightly improved discrimination. The newly proposed cutoffs better differentiate risk classification and may therefore facilitate hospital resource allocation. LEVEL OF EVIDENCE Prognostic study, level II.
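The discrimination analysis described above can be illustrated with a short sketch. The point values below follow the commonly cited GAP scheme (the raw GCS value, an age term for patients under 60 years, and systolic blood pressure bands) and should be verified against the original publication; the toy cohort and the use of scikit-learn are illustrative assumptions, not the authors' code.

```python
# A minimal sketch, not the authors' code: score a toy cohort with a GAP-style
# score and estimate discrimination as the area under the ROC curve.
import numpy as np
from sklearn.metrics import roc_auc_score

def gap_score(gcs: int, age: float, sbp: float) -> int:
    """GAP-style score: higher values indicate lower predicted mortality risk."""
    score = gcs                      # GCS contributes its raw value (3-15)
    score += 3 if age < 60 else 0    # age component (assumed cutoff of 60 years)
    if sbp > 120:
        score += 6                   # systolic blood pressure component
    elif sbp >= 60:
        score += 4
    return score

# Toy records: (GCS, age, systolic blood pressure, died within 30 days)
patients = [(15, 34, 130, 0), (15, 72, 110, 0), (9, 45, 80, 1),
            (6, 80, 70, 1), (14, 55, 125, 0), (3, 67, 50, 1)]
scores = np.array([gap_score(g, a, s) for g, a, s, _ in patients])
died = np.array([d for *_, d in patients])

# Lower scores predict death, so the negated score serves as the risk marker.
print(f"AUC = {roc_auc_score(died, -scores):.2f}")
```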
Abstract:
We present a novel surrogate model-based global optimization framework allowing a large number of function evaluations. The method, called SpLEGO, is based on a multi-scale expected improvement (EI) framework relying on both sparse and local Gaussian process (GP) models. First, a bi-objective approach relying on a global sparse GP model is used to determine potential next sampling regions. Local GP models are then constructed within each selected region. The method subsequently employs the standard expected improvement criterion to deal with the exploration-exploitation trade-off within the selected local models, leading to a decision on where to perform the next function evaluation(s). The potential of our approach is demonstrated using the so-called Sparse Pseudo-input GP as a global model. The algorithm is tested on four benchmark problems, whose number of starting points ranges from 10^2 to 10^4. Our results show that SpLEGO is effective and capable of solving problems with a large number of starting points, and it even provides significant advantages when compared with state-of-the-art EI algorithms.
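The local step of the method relies on the standard expected improvement criterion, which can be written down compactly. The sketch below is a generic EI implementation for minimisation, assuming a Gaussian process already provides a predictive mean and standard deviation at candidate points; it is not the SpLEGO implementation.

```python
# A generic expected improvement (EI) criterion for minimisation; illustrative
# only, not the SpLEGO implementation. EI is large where the GP predictive
# mean is low (exploitation) or the predictive uncertainty is high (exploration).
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best, xi=0.0):
    """EI at candidate points, given GP predictive mean mu and std sigma."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    improvement = f_best - mu - xi
    with np.errstate(divide="ignore", invalid="ignore"):
        z = improvement / sigma
        ei = improvement * norm.cdf(z) + sigma * norm.pdf(z)
    return np.where(sigma > 0.0, ei, 0.0)

# Example: choose the candidate with the largest EI as the next evaluation point.
mu = np.array([0.8, 0.3, 0.5])        # predictive means at three candidates
sigma = np.array([0.05, 0.10, 0.40])  # predictive standard deviations
ei = expected_improvement(mu, sigma, f_best=0.4)
print(ei, "-> next point:", int(np.argmax(ei)))
```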
Abstract:
OBJECTIVES Chewing efficiency may be evaluated using a cohesive specimen, especially in elderly or dysphagic patients. The aim of this study was to evaluate three two-coloured chewing gums for a colour-mixing ability test and to validate new purpose-built software (ViewGum©). METHODS Dentate participants (dentate group) and edentulous patients with mandibular two-implant overdentures (IOD group) were recruited. First, the dentate group chewed three different types of two-coloured gum (gum1-gum3) for 5, 10, 20, 30 and 50 chewing cycles. Subsequently, the number of chewing cycles with the highest intra- and inter-rater agreement was determined visually by applying a scale (SA) and opto-electronically (ViewGum©, Bland-Altman analysis). The ViewGum© software semi-automatically determines the variance of hue (VOH); inadequate mixing presents with a larger VOH than complete mixing. Second, the dentate group and the IOD group were compared. RESULTS The dentate group comprised 20 participants (10 female, 30.3±6.7 years); the IOD group 15 participants (10 female, 74.6±8.3 years). Intra-rater and inter-rater agreement (SA) was very high at 20 chewing cycles (95.00-98.75%). Gums 1-3 showed different colour-mixing characteristics as a function of chewing cycles: gum1 showed a logarithmic association, whereas gum2 and gum3 demonstrated more linear behaviour. However, the number of chewing cycles could be predicted in all specimens from VOH (all p<0.0001, mixed linear regression models). Both analyses proved discriminative with regard to dental state. CONCLUSION ViewGum© proved to be a reliable and discriminative tool for opto-electronic assessment of chewing efficiency, provided an elastic specimen is chewed for 20 cycles, and can be recommended for the evaluation of chewing efficiency in clinical and research settings. CLINICAL SIGNIFICANCE Chewing is a complex function of the oro-facial structures and the central nervous system. The application of the proposed assessments of chewing function in geriatrics or special care dentistry could help visualise oro-functional or dental comorbidities in dysphagic patients or those suffering from protein-energy malnutrition.
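A variance-of-hue measure of the kind computed by ViewGum© can be sketched as follows. The code is purely illustrative and is not the ViewGum© algorithm: it assumes a photograph of the chewed specimen, converts it to HSV with scikit-image, and reports the circular variance of the hue channel, which is larger for a poorly mixed specimen. The image filename is hypothetical.

```python
# Illustrative variance-of-hue (VOH) style measure for a two-coloured specimen;
# this is not the ViewGum(c) algorithm. A poorly mixed specimen (two distinct
# colours) yields a larger hue variance than a completely mixed one.
import numpy as np
from skimage import io, color, img_as_float

def variance_of_hue(image_path: str) -> float:
    rgb = img_as_float(io.imread(image_path))[..., :3]  # drop alpha if present
    hue = color.rgb2hsv(rgb)[..., 0] * 2.0 * np.pi      # hue as an angle in radians
    # Circular variance (1 - mean resultant length), since hue wraps around.
    return 1.0 - np.hypot(np.cos(hue).mean(), np.sin(hue).mean())

# print(variance_of_hue("chewed_specimen.jpg"))  # hypothetical image file
```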
Abstract:
When it comes to helping to shape sustainable development, research is most useful when it bridges the science–implementation/management gap and when it brings development specialists and researchers into a dialogue (Hurni et al. 2004); can a peer-reviewed journal contribute to this aim? In the classical system for validation and dissemination of scientific knowledge, journals focus on knowledge exchange within the academic community and do not specifically address a ‘life-world audience’. Within a North-South context, another knowledge divide is added: the peer review process excludes a large proportion of scientists from the South from participating in the production of scientific knowledge (Karlsson et al. 2007). Mountain Research and Development (MRD) is a journal whose mission is based on an editorial strategy to build the bridge between research and development and ensure that authors from the global South have access to knowledge production, ultimately with a view to supporting sustainable development in mountains. In doing so, MRD faces a number of challenges that we would like to discuss with the td-net community, after having presented our experience and strategy as editors of this journal. MRD was launched in 1981 by mountain researchers who wanted mountains to be included in the 1992 Rio process. In the late 1990s, MRD realized that the journal needed to go beyond addressing only the scientific community. It therefore launched a new section addressing a broader audience in 2000, with the aim of disseminating insights into, and recommendations for, the implementation of sustainable development in mountains. In 2006, we conducted a survey among MRD’s authors, reviewers, and readers (Wymann et al. 2007): respondents confirmed that MRD had succeeded in bridging the gap between research and development. But we realized that MRD could become an even more efficient tool for sustainability if development knowledge were validated: in 2009, we began submitting ‘development’ papers (‘transformation knowledge’) to external peer review of a kind different from the scientific-only peer review (for ‘systems knowledge’). At the same time, the journal became open access in order to increase the permeability between science and society, and ensure greater access for readers and authors in the South. We are currently rethinking our review process for development papers, with a view to creating more space for communication between science and society, and enhancing the co-production of knowledge (Roux 2008). Hopefully, these efforts will also contribute to the urgent debate on the ‘publication culture’ needed in transdisciplinary research (Kueffer et al. 2007).
Abstract:
Perennial snow and ice (PSI) extent is an important parameter of mountain environments with regard to its involvement in the hydrological cycle and the surface energy budget. We investigated interannual variations of PSI in nine mountain regions of interest (ROI) between 2000 and 2008. For that purpose, a novel MODIS data set processed at the Canada Centre for Remote Sensing at 250 m spatial resolution was utilized. The extent of PSI exhibited significant interannual variations, with coefficients of variation ranging from 5% to 81% depending on the ROI. A strong negative relationship was found between PSI and positive degree-days (threshold 0°C) during the summer months in most ROIs, with linear correlation coefficients (r) being as low as r = −0.90. In the European Alps and Scandinavia, PSI extent was significantly correlated with annual net glacier mass balances, with r = 0.91 and r = 0.85, respectively, suggesting that MODIS-derived PSI extent may be used as an indicator of net glacier mass balances. Validation of PSI extent in two land surface classifications for the years 2000 and 2005, GLC-2000 and Globcover, revealed significant discrepancies of up to 129% for both classifications. With regard to the importance of such classifications for land surface parameterizations in climate and land surface process models, this is a potential source of error to be investigated in future studies. The results presented here provide an interesting insight into variations of PSI in several ROIs and are instrumental for our understanding of sensitive mountain regions in the context of global climate change assessment.
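The relationship between snow and ice extent and summer warmth reported above rests on two simple quantities: positive degree-days (the sum of daily mean temperatures above 0°C) and a linear correlation coefficient. The sketch below shows how they could be computed; the yearly values are invented placeholders, not data from the study.

```python
# Illustrative only: positive degree-days (PDD, threshold 0 degC) and the
# Pearson correlation with annual PSI extent. The values are invented
# placeholders, not data from the study.
import numpy as np
from scipy.stats import pearsonr

def positive_degree_days(daily_mean_temps_c):
    """Sum of daily mean air temperatures above 0 degC over the period."""
    t = np.asarray(daily_mean_temps_c, float)
    return t[t > 0.0].sum()

# Example for a short run of daily mean temperatures (degC):
sample_week = np.array([3.5, 5.1, -0.4, 2.2, 6.0, 4.3, -1.1, 0.8])
print("PDD of the sample week:", positive_degree_days(sample_week))

# Hypothetical summer PDD and PSI extent for 2000-2008 in one region of interest.
pdd = np.array([310, 355, 290, 480, 335, 360, 420, 390, 345], float)
psi_km2 = np.array([52, 48, 55, 31, 50, 47, 38, 42, 49], float)
r, p = pearsonr(pdd, psi_km2)
print(f"r = {r:.2f}, p = {p:.3f}")  # a strongly negative r is expected
```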
Abstract:
The notion of outsourcing – making arrangements with an external entity for the provision of goods or services to supplement or replace internal efforts – has been around for centuries. The outsourcing of information systems (IS) is, however, a much newer concept, but one which has been growing dramatically. This book attempts to synthesize what is known about IS outsourcing by dividing the subject into three interrelated parts: (1) Traditional Information Technology Outsourcing, (2) Information Technology Offshoring, and (3) Business Process Outsourcing. The book should be of interest to all academics and students in the field of Information Systems, as well as corporate executives and professionals who seek a more profound analysis and understanding of the underlying factors and mechanisms of outsourcing.
Abstract:
Objective: Since 2011, the new national final examination in human medicine has been implemented in Switzerland, with a structured clinical-practical part in the OSCE format. From the perspective of the national Working Group, this article describes the essential steps in the development, implementation and evaluation of the Federal Licensing Examination Clinical Skills (FLE CS), as well as the applied quality assurance measures. Finally, central insights gained from recent years are presented. Methods: Based on the principles of action research, the FLE CS is in a constant state of further development. On the basis of systematically documented experience from previous years, the Working Group discusses unresolved questions and substantiates the resulting solution approaches (planning), implements them in the examination (implementation) and subsequently evaluates them (reflection). The results presented here are the product of this iterative procedure. Results: The FLE CS is created by experts from all faculties and subject areas in a multistage process. The examination is administered in German and French on a decentralised basis and consists of twelve interdisciplinary stations per candidate. As important quality assurance measures, the national Review Board (content validation) and the meetings of the standardised patient trainers (standardisation) have proven worthwhile. The statistical analyses show good measurement reliability and support the construct validity of the examination. Among the central insights of recent years is that consistent implementation of the principles of action research contributes to the successful further development of the examination. Conclusion: The centrally coordinated, collaborative-iterative process, incorporating experts from all faculties, makes a fundamental contribution to the quality of the FLE CS. The processes and insights presented here can be useful for others planning a similar undertaking. Keywords: national final examination, licensing examination, summative assessment, OSCE, action research
Abstract:
A 55-year-old woman was referred because of diffuse pruritic erythematous lesions and an ischemic process of the third finger of her right hand. She was known to have anaemia secondary to hypermenorrhea. Six months before admission she had presented with a cutaneous infiltration of the left cubital fossa after paravenous leakage of an intravenous iron substitution. She then reported a progressive pruritic erythematous swelling of her left arm, lower extremities and trunk. Skin biopsy of a lesion on the right leg revealed a fibrillar, small-vessel vasculitis containing many eosinophils. Two months later she reported Raynaud symptoms in both hands, with a persistent violaceous coloration of the skin and a cold sensation in the third digit of her right hand. A round, 1.5 cm, well-delimited swelling on the medial side of the left elbow was noted. The third digit of her right hand was cold and violaceous. Eosinophilia (19% of total leucocytes) was present. Doppler-duplex arterial examination of the upper extremities showed an occlusion of the cubital artery down to the palmar arcade on the right arm. Selective angiography of the right subclavian and brachial arteries showed diffusely altered blood flow in the cubital artery and hand, with fine collateral circulation in the carpal region. Neither a secondary cause of hypereosinophilia nor a myeloproliferative process was found. Considering the skin biopsy results and having excluded other causes of eosinophilia, we assumed the diagnosis of an eosinophilic vasculitis. Treatment with tacrolimus and high-dose steroids was started, the latter tapered over 12 months and then stopped, but a dramatic flare-up of the vasculitis with Raynaud phenomenon occurred. A new immunosuppressive approach with steroids and methotrexate was then introduced. This case of aggressive eosinophilic vasculitis is difficult to classify into the usual forms of vasculitis and constitutes a therapeutic challenge given its resistance to current immunosuppressive regimens.
Abstract:
The present paper describes standardized procedures within clinical sleep medicine. As such, it is a continuation of the previously published European guidelines for the accreditation of sleep medicine centres and European guidelines for the certification of professionals in sleep medicine, aimed at creating standards of practice in European sleep medicine. It is also part of a broader action plan of the European Sleep Research Society, which includes the accreditation of sleep medicine centres and the certification of sleep medicine experts, as well as publication of the Catalogue of Knowledge and Skills for sleep medicine experts (physicians, non-medical health care providers, nurses and technologists), which will be a basis for the development of relevant educational curricula. In the current paper, the standard operational procedures for sleep medicine centres regarding the diagnostic and therapeutic management of patients evaluated at centres accredited according to the European guidelines are based primarily on prevailing evidence-based medicine principles. In addition, parts of the standard operational procedures are based on a formalized consensus procedure applied by a group of Sleep Medicine Experts from the European National Sleep Societies. The final recommendations for standard operational procedures are categorized as 'standard practice', 'procedure that could be useful', 'procedure that is not useful' or 'procedure with insufficient information available'. The standard operational procedures described here include both subjective and objective testing, as well as recommendations for follow-up visits and for ensuring patients' safety in sleep medicine. The overall goal of the present standard operational procedures is to further develop excellence in the practice and quality assurance of sleep medicine in Europe.
Abstract:
The experimental verification of matrix diffusion in crystalline rocks largely relies on indirect methods performed in the laboratory. Such methods are prone to perturbations of the rock samples during collection and preparation, and therefore the laboratory-derived transport properties and fluid composition might not represent in situ conditions. We investigated the effects induced by the drilling process and natural rock stress release by mass balance considerations and sensitivity analysis of analytical out-diffusion data obtained from originally saturated, large-sized drillcore material from two locations drilled using traced drilling fluid. For in situ stress-released drillcores of quartz-monzodiorite composition from the Äspö HRL, Sweden, tracer mass balance considerations and 1D and 2D diffusion modelling consistently indicated a contamination of <1% of the original pore water. This chemically disturbed zone extends to a maximum of 0.1 mm into the drillcore (61.8 mm x 180.1 mm), corresponding to about 0.66% of the total pore volume (0.77 vol.%). In contrast, the combined effects of stress release and the drilling process, which influenced granodioritic drillcore material from 560 m below surface at Forsmark, Sweden, resulted in a maximum contamination of the derived porewater Cl(-) concentration of about 8%. The mechanically disturbed zone with modified diffusion properties covers the outermost ~6 mm of the drillcore (50 mm x 189 mm), whereas the chemically disturbed zone extends to a maximum of 0.3 mm based on mass balance considerations, and to 0.15-0.2 mm into the drillcore based on fitting the observed tracer data. This corresponds to a maximum of 2.4% of the total pore volume (0.62 vol.%) being affected by drilling-fluid contamination. The proportion of rock volume affected initially by drilling fluid, or subsequently by experiment water during the laboratory diffusion and re-saturation experiments, depends on the size of the drillcore material and becomes larger the smaller the sample used for the experiment. The results further support matrix diffusion taking place in the undisturbed matrix of crystalline rocks, at least in the cm range.
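The mass-balance argument has a simple geometric core: a thin contaminated rim contributes only a small fraction of a cylindrical drillcore's volume. The sketch below computes that rim fraction for the core dimensions quoted above; the study's actual estimate is porosity-weighted and constrained by the tracer data, so this is only an order-of-magnitude illustration.

```python
# Geometric part of the mass-balance argument: the volume fraction of a
# cylindrical drillcore lying within a thin outer rim of thickness rim_mm.
# The study's estimate is porosity-weighted and constrained by tracer data;
# this is only an order-of-magnitude illustration.
import math

def rim_volume_fraction(diameter_mm: float, length_mm: float, rim_mm: float) -> float:
    r = diameter_mm / 2.0
    total = math.pi * r ** 2 * length_mm
    # Inner, uncontaminated cylinder: rim removed from the mantle and both ends.
    inner = math.pi * (r - rim_mm) ** 2 * (length_mm - 2.0 * rim_mm)
    return 1.0 - inner / total

# Äspö-type core (61.8 mm x 180.1 mm) with a 0.1 mm chemically disturbed rim:
print(f"{rim_volume_fraction(61.8, 180.1, 0.1):.2%} of the core volume")
```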
Abstract:
Recent developments in clinical radiology have resulted in additional developments in the field of forensic radiology. After the implementation of cross-sectional radiology and optical surface documentation in forensic medicine, difficulties in the validation and analysis of the acquired data were experienced. To address this problem, and to allow comparison of autopsy and radiological data, a centralized database with internet technology for forensic cases was created. The main goals of the database are (1) creation of a digital and standardized documentation tool for forensic-radiological and pathological findings; (2) establishing a basis for validation of forensic cross-sectional radiology as a non-invasive examination method in forensic medicine, that is, comparing and evaluating the radiological and autopsy data and analyzing the accuracy of such data; and (3) providing a conduit for continuing research and education in forensic medicine. Considering the infrequent availability of CT or MRI for forensic institutions and the heterogeneous nature of case material in forensic medicine, an evaluation of the benefits and limitations of cross-sectional imaging concerning certain forensic features by a single institution may be of limited value. A centralized database permitting international forensic and cross-disciplinary collaborations may provide important support for forensic-radiological casework and research.
Abstract:
BACKGROUND: Complete investigation of thrombophilic or hemorrhagic clinical presentations is a time-, apparatus-, and cost-intensive process. Sensitive screening tests for characterizing the overall function of the hemostatic system, or defined parts of it, would be very useful. For this purpose, we are developing an electrochemical biosensor system that allows measurement of thrombin generation in whole blood as well as in plasma. METHODS: The measuring system consists of a single-use electrochemical sensor in the shape of a strip and a measuring unit connected to a personal computer, recording the electrical signal. Blood is added to a specific reagent mixture immobilized in dry form on the strip, including a coagulation activator (e.g., tissue factor or silica) and an electrogenic substrate specific to thrombin. RESULTS: Increasing thrombin concentrations gave standard curves with progressively increasing maximal current and decreasing time to reach the peak. Because the measurement was unaffected by color or turbidity, any type of blood sample could be analyzed: platelet-poor plasma, platelet-rich plasma, and whole blood. The test strips with the predried reagents were stable when stored for several months before testing. Analysis of the combined results obtained with different activators allowed discrimination between defects of the extrinsic, intrinsic, and common coagulation pathways. Activated protein C (APC) predried on the strips allowed identification of APC-resistance in plasma and whole blood samples. CONCLUSIONS: The biosensor system provides a new method for assessing thrombin generation in plasma or whole blood samples as small as 10 microL. The assay is easy to use, thus allowing it to be performed in a point-of-care setting.
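The two curve parameters mentioned in the results (maximal current and time to reach the peak) are straightforward to extract from a recorded trace. The sketch below uses a synthetic current trace; it is not the analysis software of the biosensor system.

```python
# Illustrative extraction of the two curve parameters mentioned above (maximal
# current and time to reach the peak) from a synthetic current trace; this is
# not the biosensor's analysis software.
import numpy as np

def peak_metrics(time_s, current):
    """Return (peak current, time of the peak) of a thrombin-generation-like trace."""
    time_s, current = np.asarray(time_s, float), np.asarray(current, float)
    i_peak = int(np.argmax(current))
    return current[i_peak], time_s[i_peak]

t = np.linspace(0.0, 600.0, 601)                      # seconds
trace = 5.0 * (t / 180.0) * np.exp(1.0 - t / 180.0)   # synthetic peak near t = 180 s
peak, t_peak = peak_metrics(t, trace)
print(f"peak current = {peak:.2f} (arbitrary units) at t = {t_peak:.0f} s")
```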
Abstract:
BACKGROUND: Microarray genome analysis is realising its promise for improving detection of genetic abnormalities in individuals with mental retardation and congenital abnormality. Copy number variations (CNVs) are now readily detectable using a variety of platforms and a major challenge is the distinction of pathogenic from ubiquitous, benign polymorphic CNVs. The aim of this study was to investigate replacement of time consuming, locus specific testing for specific microdeletion and microduplication syndromes with microarray analysis, which theoretically should detect all known syndromes with CNV aetiologies as well as new ones. METHODS: Genome wide copy number analysis was performed on 117 patients using Affymetrix 250K microarrays. RESULTS: 434 CNVs (195 losses and 239 gains) were found, including 18 pathogenic CNVs and 9 identified as "potentially pathogenic". Almost all pathogenic CNVs were larger than 500 kb, significantly larger than the median size of all CNVs detected. Segmental regions of loss of heterozygosity larger than 5 Mb were found in 5 patients. CONCLUSIONS: Genome microarray analysis has improved diagnostic success in this group of patients. Several examples of recently discovered "new syndromes" were found suggesting they are more common than previously suspected and collectively are likely to be a major cause of mental retardation. The findings have several implications for clinical practice. The study revealed the potential to make genetic diagnoses that were not evident in the clinical presentation, with implications for pretest counselling and the consent process. The importance of contributing novel CNVs to high quality databases for genotype-phenotype analysis and review of guidelines for selection of individuals for microarray analysis is emphasised.
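The finding that almost all pathogenic CNVs exceeded 500 kb suggests a simple size-based triage step, sketched below with pandas. The column names and toy records are hypothetical and are not taken from the study's data.

```python
# Illustrative size-based triage of CNV calls; column names and records are
# hypothetical, not from the study. CNVs larger than 500 kb are flagged for
# priority review against curated genotype-phenotype databases.
import pandas as pd

cnvs = pd.DataFrame({
    "chrom": ["1", "7", "15", "22"],
    "start": [1_200_000, 72_500_000, 22_800_000, 18_900_000],
    "end":   [1_350_000, 74_100_000, 23_900_000, 21_500_000],
    "type":  ["loss", "gain", "loss", "loss"],
})
cnvs["size_kb"] = (cnvs["end"] - cnvs["start"]) / 1_000

print(cnvs[cnvs["size_kb"] > 500])
```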
Abstract:
OBJECTIVE: In ictal scalp electroencephalography (EEG), the presence of artefacts and the wide-ranging patterns of discharges are hurdles to good diagnostic accuracy. Quantitative EEG aids the lateralization and/or localization of epileptiform activity. METHODS: Twelve patients achieving Engel Class I/IIa outcome following temporal lobe surgery (at 1 year) were selected, with approximately 1-3 ictal EEGs analyzed per patient. The EEG signals were denoised with the discrete wavelet transform (DWT), followed by computation of the normalized absolute slopes and spatial interpolation of the scalp topography with detection of local maxima. For localization, the region with the highest normalized absolute slopes at the time when epileptiform activities were registered (>2.5 times the standard deviation) was designated as the region of onset. For lateralization, the cerebral hemisphere registering the first appearance of normalized absolute slopes >2.5 times the standard deviation was designated as the side of onset. As a comparison, all EEG episodes were reviewed by two neurologists blinded to clinical information to determine the localization and lateralization of seizure onset by visual analysis. RESULTS: 16/25 seizures (64%) were correctly localized by the visual method and 21/25 seizures (84%) by the quantitative EEG method. 12/25 seizures (48%) were correctly lateralized by the visual method and 23/25 seizures (92%) by the quantitative EEG method. The McNemar test yielded p=0.15 for localization and p=0.0026 for lateralization when comparing the two methods. CONCLUSIONS: The quantitative EEG method correctly lateralized significantly more seizure episodes, and there was a trend towards more correctly localized seizures. SIGNIFICANCE: Coupling DWT with the absolute slope method helps clinicians achieve better EEG diagnostic accuracy.
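The two computational steps named in the methods, wavelet denoising and a normalized absolute slope compared against a 2.5-standard-deviation threshold, can be sketched as follows. The implementation choices (PyWavelets, soft thresholding with a universal threshold, a baseline segment as the reference for the slope statistics) are assumptions for illustration and do not reproduce the authors' pipeline.

```python
# Illustrative only: wavelet denoising of one EEG channel (PyWavelets, soft
# universal threshold) and an absolute-slope measure expressed in standard
# deviations of a baseline segment, compared against a 2.5 SD threshold.
import numpy as np
import pywt

def dwt_denoise(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from finest detail
    thresh = sigma * np.sqrt(2.0 * np.log(len(signal)))   # universal threshold
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

def slope_in_sd_units(signal, baseline):
    """Absolute slope expressed in SD units of the baseline's absolute slope."""
    slope = np.abs(np.diff(signal))
    ref = np.abs(np.diff(baseline))
    return (slope - ref.mean()) / (ref.std() + 1e-12)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 2560)                          # 10 s at 256 Hz
eeg = np.sin(2.0 * np.pi * 4.0 * t) + 0.5 * rng.standard_normal(t.size)
clean = dwt_denoise(eeg)
nas = slope_in_sd_units(clean, baseline=clean[:256])      # first second as baseline
print("samples exceeding 2.5 SD:", int(np.sum(nas > 2.5)))
```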
Abstract:
By means of fixed-links modeling, the present study assessed processes involved in visual short-term memory functioning and investigated how these processes are related to intelligence. In a color change detection task, short-term memory demands increased across three experimental conditions as a function of the number of presented stimuli. We measured the amount of information retained in visual short-term memory by hit rate, and the speed of visual short-term memory scanning by reaction time. For both measures, fixed-links modeling revealed a constant process, reflecting processes irrespective of task manipulation, as well as two increasing processes, reflecting the increasing short-term memory demands. For visual short-term memory scanning, a negative association between intelligence and the constant process was found, but no relationship between intelligence and the increasing processes. Thus, basic processing speed, rather than speed influenced by visual short-term memory demands, differentiates between individuals of high and low intelligence. Intelligence was positively related to the experimental processes of short-term memory retention but not to the constant process. In sum, significant associations with intelligence were obtained only when the specific processes of short-term memory were decomposed, emphasizing the importance of a thorough assessment of cognitive processes when investigating their relation to intelligence.