46 results for software quality assurance


Relevance: 80.00%

Abstract:

OBJECTIVE We sought to evaluate potential reasons given by board-certified doctors for the persistence of adverse events despite efforts to improve patient safety in Switzerland. SUMMARY BACKGROUND DATA In recent years, substantial efforts have been made to improve patient safety by introducing surgical safety checklists to standardise surgeries and team procedures. Still, a high number of adverse events remain. METHODS Clinic directors in operative medicine in Switzerland were asked to answer two questions concerning the reasons for persistence of adverse events, and the advantages and disadvantages of introducing and implementing surgical safety checklists. Of 799 clinic directors, the arguments of 237 (29.7%) were content-analysed using Mayring's content analysis method, resulting in 12 different categories. RESULTS Potential reasons for the persistence of adverse events were mainly seen as being related to the "individual" (126/237, 53.2%), but directors of high-volume clinics identified factors related to the "group and interactions" significantly more often as a reason (60.2% vs 40.2%; p = 0.003). Surgical safety checklists were thought to have positive effects on the "organisational level" (47/237, 19.8%), the "team level" (37/237, 15.6%) and the "patient level" (40/237, 16.9%), with a "lack of willingness to implement checklists" as the main disadvantage (34/237, 14.3%). CONCLUSION This qualitative study revealed the individual as the main player in the persistence of adverse events. Working conditions should be optimised to minimise interface problems in the case of cross-covering of patients, to assure support for students, residents and interns, and to reduce strain. Checklists are helpful on an "organisational level" (e.g., financial benefits, quality assurance) and to clarify responsibilities.

Relevance: 80.00%

Abstract:

Vector control is the mainstay of malaria control programmes. Successful vector control relies profoundly on accurate information about the target mosquito populations in order to choose the most appropriate intervention for a given mosquito species and to monitor its impact. An impediment to identifying mosquito species is the existence of morphologically identical sibling species that play different roles in the transmission of pathogens and parasites. Currently, PCR diagnostics are used to distinguish between sibling species. PCR-based methods are, however, expensive and time-consuming, and their development requires a priori DNA sequence information. Here, we evaluated an inexpensive molecular proteomics approach for Anopheles species: matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). MALDI-TOF MS is a well-developed protein-profiling tool for the identification of microorganisms but has so far received little attention as a diagnostic tool in entomology. We measured MS spectra from specimens of 32 laboratory colonies and 2 field populations representing 12 Anopheles species, including the A. gambiae species complex. An important step in the study was the advancement and implementation of a bioinformatics approach that improves the resolution over previously applied cluster analysis. Borrowing tools for linear discriminant analysis from genomics, MALDI-TOF MS accurately identified taxonomically closely related mosquito species, including the separation between the M and S molecular forms of A. gambiae sensu stricto. The approach also classifies specimens from different laboratory colonies, hence also proving very promising for colony authentication as part of quality assurance in laboratory studies. Besides being exceptionally accurate and robust, MALDI-TOF MS has several advantages over other typing methods, including simple sample preparation and short processing time. As the method does not require DNA sequence information, data can also be reviewed at any later stage for diagnostic or functional patterns without the need for re-designing and re-processing biological material.

Relevance: 80.00%

Abstract:

High-resolution capillary zone electrophoresis in the routine arena with stringent quality assurance is employed for the determination of carbohydrate-deficient transferrin in human serum. The assay comprises mixing of human serum with a Fe(III)-containing solution prior to analysis of the iron-saturated mixture in a dynamically double-coated capillary using a commercial buffer at alkaline pH. In contrast to other assays, it provides sufficient resolution for proper recognition of genetic transferrin variants. Analysis of 7290 patient sera revealed 166 isoform patterns that could be assigned to genetic variants, namely, 109 BC, 53 CD, one BD and three CC variants. Several subtypes of transferrin D can be distinguished as they have large enough differences in pI values. Subtypes of transferrin C and B cannot be resolved. However, analysis of the detection-time ratios of tetrasialo isoforms of transferrin BC and transferrin CD variants revealed multimodal frequency histograms, indicating the presence of subtypes of transferrin C, B and D. The data gathered over 11 years demonstrate the robustness of the high-resolution capillary zone electrophoresis assay. This is the first account of a capillary zone electrophoresis-based carbohydrate-deficient transferrin assay with a broad overview of transferrin isoform patterns associated with genetic transferrin variants.

Relevance: 80.00%

Abstract:

Detecting bugs as early as possible plays an important role in ensuring software quality before shipping. We argue that mining previous bug fixes can produce good knowledge about why bugs happen and how they are fixed. In this paper, we mine the change history of 717 open source projects to extract bug-fix patterns. We also manually inspect many of the bugs we found to get insights into the contexts and reasons behind those bugs. For instance, we found out that missing null checks and missing initializations are very recurrent and we believe that they can be automatically detected and fixed.

Relevance: 80.00%

Abstract:

Objective: Since 2011, the new national final examination in human medicine has been implemented in Switzerland, with a structured clinical-practical part in the OSCE format. From the perspective of the national Working Group, this article describes the essential steps in the development, implementation and evaluation of the Federal Licensing Examination Clinical Skills (FLE CS), as well as the quality assurance measures applied. Finally, central insights gained over the past years are presented. Methods: Based on the principles of action research, the FLE CS is in a constant state of further development. Building on systematically documented experiences from previous years, unresolved questions are discussed in the Working Group, and the resulting solution approaches are substantiated (planning), implemented in the examination (implementation) and subsequently evaluated (reflection). The results presented here are the product of this iterative procedure. Results: The FLE CS is created by experts from all faculties and subject areas in a multistage process. The examination is administered in German and French on a decentralised basis and consists of twelve interdisciplinary stations per candidate. The national Review Board (content validation) and the meetings of the standardised-patient trainers (standardisation) have proven worthwhile as important quality assurance measures. The statistical analyses show good measurement reliability and support the construct validity of the examination. A central insight of the past years is that the consistent implementation of the principles of action research contributes to the successful further development of the examination. Conclusion: The centrally coordinated, collaborative-iterative process, incorporating experts from all faculties, makes a fundamental contribution to the quality of the FLE CS. The processes and insights presented here can be useful for others planning a similar undertaking.

Keywords: national final examination, licensing examination, summative assessment, OSCE, action research

Relevance: 80.00%

Abstract:

Measurements of 14C in the organic carbon (OC) and elemental carbon (EC) fractions, respectively, of fine aerosol particles bear the potential to apportion anthropogenic and biogenic emission sources. For this purpose, the system THEODORE (two-step heating system for the EC/OC determination of radiocarbon in the environment) was developed. In this device, OC and EC are transformed into carbon dioxide in a stream of oxygen at 340 and 650 °C, respectively, and reduced to filamentous carbon. This is the target material for subsequent accelerator mass spectrometry (AMS) 14C measurements, which were performed on sub-milligram carbon samples at the PSI/ETH compact 500 kV AMS system. Quality assurance measurements of SRM 1649a, Urban Dust, yielded a fraction of modern (fM) in total carbon (TC) of 0.522 ± 0.018 (n = 5, 95% confidence level), in agreement with reported values. The results for OC and EC are 0.70 ± 0.05 (n = 3) and 0.066 ± 0.020 (n = 4), respectively.
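The reported OC, EC and TC values can be cross-checked with a simple two-component isotope mass balance, fM(TC) = x·fM(OC) + (1 − x)·fM(EC), where x is the OC share of total carbon. This relation is a standard assumption added here for illustration, not stated in the abstract:

```python
def oc_share_of_tc(fm_tc, fm_oc, fm_ec):
    """Invert the two-component mass balance
    fm_tc = x * fm_oc + (1 - x) * fm_ec
    for x, the mass fraction of total carbon contributed by OC."""
    return (fm_tc - fm_ec) / (fm_oc - fm_ec)

# Values measured for SRM 1649a (Urban Dust):
x = oc_share_of_tc(0.522, 0.70, 0.066)  # roughly 0.72
```

In other words, the three measured fractions of modern are mutually consistent if OC makes up roughly 72% of the total carbon mass of the sample.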

Relevance: 80.00%

Abstract:

Workshop Overview
The use of special effects (moulage) is a way to augment the authenticity of a scenario in simulation. This workshop will introduce different techniques of moulage (oil-based cream colors, watercolors, transfer tattoos and 3D prosthetics). The participants will have the opportunity to explore these techniques by applying various moulages. They will compare the techniques and discuss their advantages and disadvantages. Moreover, strategies for standardization and quality assurance will be discussed.

Workshop Rationale
Moulage supports the sensory perception in a scenario (1). It can provide evaluation clues (2) and help learners (and SPs) engage in the simulation. However, it is of crucial importance that the simulated physical pathologies are represented accurately and reliably. Accuracy is achieved by using the appropriate technique, which requires knowledge and practice. By providing information about different moulage techniques, we hope to increase knowledge of moulage during the workshop, and by applying moulages in various techniques we will practice together. As standardization is critical for simulation scenarios in assessment (3, 4), strategies for standardization of moulage will be introduced and discussed.

Workshop Objectives
During the workshop participants will:
- gain knowledge about different techniques of moulage
- practice moulages in various techniques
- discuss the advantages and disadvantages of moulage techniques
- describe strategies for standardization and quality assurance of moulage

Planned Format
- 5 min: Introduction
- 15 min: Overview – background & theory (presentation)
- 15 min: Application of moulage for an ankle sprain in 4 different techniques (oil-based cream color, watercolor, temporary tattoo, 3D prosthetic) in small groups
- 5 min: Comparing the results by interactive viewing of the prepared moulages
- 15 min: Application of moulages for a burn in different techniques in small groups
- 5 min: Comparing the results by interactive viewing of the prepared moulages
- 5 min: Sharing experiences with the different techniques in small groups
- 20 min: Discussion of the techniques, including standardization and quality assurance strategies (plenary discussion)
- 5 min: Summary / take-home points

Relevance: 40.00%

Abstract:

Software architecture is the result of a design effort aimed at ensuring a certain set of quality attributes. As we show, quality requirements are commonly specified in practice but are rarely validated using automated techniques. In this paper we analyze and classify commonly specified quality requirements after interviewing professionals and running a survey. We report on the tools used to validate those requirements and comment on the obstacles practitioners encounter when performing such activities (e.g., insufficient tool support; poor understanding of users' needs). Finally, we discuss opportunities for increasing the adoption of automated tools based on the information we collected during our study (e.g., using a business-readable notation for expressing quality requirements; increasing awareness by monitoring non-functional aspects of a system).
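As a sketch of what automated validation of a specified quality requirement could look like (the notation, field names and thresholds below are invented for illustration and are not the paper's proposal), a requirement can be stated as data and checked against monitored measurements:

```python
# A hypothetical machine-checkable quality requirement: each entry names a
# metric, the percentile to inspect, and the maximum tolerated value.
requirements = [
    {"metric": "response_time_ms", "percentile": 95, "max": 300},
]

def percentile(values, p):
    """Nearest-rank percentile of a list of measurements."""
    ranked = sorted(values)
    k = max(0, int(round(p / 100 * len(ranked))) - 1)
    return ranked[k]

def validate(measurements, requirements):
    """Return (metric, observed) pairs for every violated requirement."""
    failures = []
    for req in requirements:
        observed = percentile(measurements[req["metric"]], req["percentile"])
        if observed > req["max"]:
            failures.append((req["metric"], observed))
    return failures
```

Running such a check continuously against production metrics is one way the monitoring of non-functional aspects mentioned above could raise awareness of quality-requirement violations.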

Relevance: 40.00%

Abstract:

Quality data are not only relevant for successful Data Warehousing or Business Intelligence applications; they are also a precondition for the efficient and effective use of Enterprise Resource Planning (ERP) systems. ERP professionals in all kinds of businesses are concerned with data quality issues, as a survey conducted by the Institute of Information Systems at the University of Bern has shown. Using results of this survey, this paper demonstrates why data quality problems can occur in modern ERP systems and suggests how ERP researchers and practitioners can handle issues around the quality of data in an ERP software environment.

Relevance: 30.00%

Abstract:

The effect of copper (Cu) filtration on image quality and dose in different digital X-ray systems was investigated. Two computed radiography systems and one digital radiography detector were used. Three different polymethylmethacrylate blocks simulated the pediatric body. The effect of Cu filters of 0.1, 0.2, and 0.3 mm thickness on the entrance surface dose (ESD) and the corresponding effective doses (EDs) was measured at tube voltages of 60, 66, and 73 kV. Image quality was evaluated in a contrast-detail phantom using automated analyzer software. Cu filters of 0.1, 0.2, and 0.3 mm thickness decreased the ESD by 25-32%, 32-39%, and 40-44%, respectively, the ranges depending on the respective tube voltages. There was no consistent decline in image quality due to increasing Cu filtration. The estimated ED of anterior-posterior (AP) chest projections was reduced by up to 23%. No relevant reduction in the ED was noted in AP radiographs of the abdomen and pelvis or in posterior-anterior radiographs of the chest. Cu filtration reduces the ESD, but generally does not reduce the effective dose. Cu filters can help protect radiosensitive superficial organs, such as the mammary glands in AP chest projections.

Relevance: 30.00%

Abstract:

PURPOSE: Computer-based feedback systems for assessing the quality of cardiopulmonary resuscitation (CPR) are widely used these days. Recordings usually involve compression- and ventilation-dependent variables. Chest compression depth, sufficient decompression and correct hand position are displayed but interpreted independently of one another. We aimed to generate a parameter that represents all relevant compression parameters combined, to provide a rapid assessment of the quality of chest compression: the effective compression ratio (ECR). METHODS: The following parameters were used to determine the ECR: compression depth, correct hand position, correct decompression and the proportion of time used for chest compressions compared to the total time spent on CPR. Based on the ERC guidelines, we calculated that guideline-compliant CPR (30:2) has a minimum ECR of 0.79. To calculate the ECR, we expanded the previously described software solution. In order to demonstrate the usefulness of the new ECR parameter, we first performed a PubMed search for studies that included correct compression and no-flow time, after which we calculated the new parameter, the ECR. RESULTS: The PubMed search revealed 9 trials. Calculated ECR values ranged between 0.03 (for a basic life support [BLS] study, two helpers, no feedback) and 0.67 (BLS with feedback from the 6th minute). CONCLUSION: The ECR enables rapid, meaningful assessment of CPR and simplifies the comparability of studies as well as the individual performance of trainees. The structure of the software solution allows it to be easily adapted to any manikin, CPR feedback devices and different resuscitation guidelines (e.g. ILCOR, ERC).
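The abstract does not give the exact formula, but a combination consistent with the description (the fraction of fully correct compressions scaled by the share of CPR time actually spent compressing) can be sketched as follows; the multiplicative form and the function name are assumptions for illustration, and the paper's actual definition may differ:

```python
def effective_compression_ratio(n_correct, n_total,
                                compression_time_s, total_time_s):
    """Assumed sketch of the ECR: fraction of compressions that are fully
    correct (depth, hand position, complete decompression), multiplied by
    the proportion of total CPR time spent on chest compressions."""
    if n_total == 0 or total_time_s == 0:
        return 0.0
    return (n_correct / n_total) * (compression_time_s / total_time_s)
```

With such a definition, both sloppy compressions and long no-flow intervals pull the single score down, which is the stated goal of combining the parameters.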

Relevance: 30.00%

Abstract:

OBJECTIVE: To describe the electronic medical databases used in antiretroviral therapy (ART) programmes in lower-income countries and assess the measures such programmes employ to maintain and improve data quality and reduce the loss of patients to follow-up. METHODS: In 15 countries of Africa, South America and Asia, a survey was conducted from December 2006 to February 2007 on the use of electronic medical record systems in ART programmes. Patients enrolled in the sites at the time of the survey but not seen during the previous 12 months were considered lost to follow-up. The quality of the data was assessed by computing the percentage of missing key variables (age, sex, clinical stage of HIV infection, CD4+ lymphocyte count and year of ART initiation). Associations between site characteristics (such as number of staff members dedicated to data management), measures to reduce loss to follow-up (such as the presence of staff dedicated to tracing patients) and data quality and loss to follow-up were analysed using multivariate logit models. FINDINGS: Twenty-one sites that together provided ART to 50 060 patients were included (median number of patients per site: 1000; interquartile range, IQR: 72-19 320). Eighteen sites (86%) used an electronic database for medical record-keeping; 15 (83%) such sites relied on software intended for personal or small business use. The median percentage of missing data for key variables per site was 10.9% (IQR: 2.0-18.9%) and declined with training in data management (odds ratio, OR: 0.58; 95% confidence interval, CI: 0.37-0.90) and weekly hours spent by a clerk on the database per 100 patients on ART (OR: 0.95; 95% CI: 0.90-0.99). About 10 weekly hours per 100 patients on ART were required to reduce missing data for key variables to below 10%. The median percentage of patients lost to follow-up 1 year after starting ART was 8.5% (IQR: 4.2-19.7%). 
Strategies to reduce loss to follow-up included outreach teams, community-based organizations and checking death registry data. Implementation of all three strategies substantially reduced losses to follow-up (OR: 0.17; 95% CI: 0.15-0.20). CONCLUSION: The quality of the data collected and the retention of patients in ART treatment programmes are unsatisfactory for many sites involved in the scale-up of ART in resource-limited settings, mainly because of insufficient staff trained to manage data and trace patients lost to follow-up.
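The data-quality metric used in the survey above, the share of missing key variables, is straightforward to compute; a minimal sketch, with record field names chosen here for illustration:

```python
# Hypothetical field names for the five key variables named in the study.
KEY_VARIABLES = ["age", "sex", "clinical_stage", "cd4_count", "art_start_year"]

def percent_missing(records, keys=KEY_VARIABLES):
    """Percentage of key-variable cells that are missing (None) across all
    patient records of a site."""
    total = len(records) * len(keys)
    if total == 0:
        return 0.0
    missing = sum(1 for r in records for k in keys if r.get(k) is None)
    return 100.0 * missing / total
```

Computing this per site, as the study did, makes sites with weak data management directly comparable (e.g., against the 10% target mentioned above).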

Relevance: 30.00%

Abstract:

INTRODUCTION: A multi-centre study was conducted during 2005 by means of a questionnaire posted on the Italian Society of Emergency Medicine (SIMEU) web page. Our intention was to carry out an organisational and functional analysis of Italian Emergency Departments (ED) in order to pick out some macro-indicators of the activities performed. Participation was good, in that 69 ED (3,285,440 admissions to emergency services) responded to the questionnaire. METHODS: The study was based on 18 questions: 3 regarding the personnel of the ED, 2 regarding organisational and functional aspects, 5 on the activity of the ED, 7 on triage and 1 on the assessment of the quality perceived by the users of the ED. RESULTS AND CONCLUSION: The replies revealed that 91.30% of the ED were equipped with data-processing software, which, in 96.83% of cases, tracked the entire itinerary of the patient. About 48,000 patients/year used the ED: 76.72% were discharged and 18.31% were hospitalised. Observation Units were active in 81.16% of the ED examined. Triage programmes were in place in 92.75% of ED: in 75.81% of these, triage was performed throughout the entire itinerary of the patient; in 16.13% it was performed only on a symptom basis, and in 8.06% only on call. Of the patients arriving at the ED, 24.19% were assigned a non-urgent triage code, 60.01% an urgent code, 14.30% an emergent code and 1.49% a life-threatening code. Waiting times were 52.39 min for non-urgent patients, 40.26 min for urgent, 12.08 min for emergent, and 1.19 min for life-threatening patients.

Relevance: 30.00%

Abstract:

When project managers determine schedules for resource-constrained projects, they commonly use commercial project management software packages. Which resource-allocation methods are implemented in these packages is proprietary information. The resource-allocation problem is in general computationally difficult to solve to optimality. Hence, the question arises if and how various project management software packages differ in quality with respect to their resource-allocation capabilities. None of the few existing papers on this subject uses a sizeable data set and recent versions of common software packages. We experimentally analyze the resource-allocation capabilities of Acos Plus.1, AdeptTracker Professional, CS Project Professional, Microsoft Office Project 2007, Primavera P6, Sciforma PS8, and Turbo Project Professional. Our analysis is based on 1560 instances of the precedence- and resource-constrained project scheduling problem RCPSP. The experiment shows that using the resource-allocation feature of these packages may lead to a project duration increase of almost 115% above the best known feasible schedule. The increase gets larger with increasing resource scarcity and with increasing number of activities. We investigate the impact of different complexity scenarios and priority rules on the project duration obtained by the software packages. We provide a decision table to support managers in selecting a software package and a priority rule.
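The packages' allocation methods are proprietary, but priority-rule scheduling itself is textbook material. A minimal serial schedule-generation scheme for a single renewable resource is sketched below; this is a generic illustration of the technique, not any package's actual algorithm:

```python
def serial_sgs(durations, demands, capacity, preds, priority):
    """Serial schedule-generation scheme for a single-resource RCPSP.

    Repeatedly picks the eligible activity (all predecessors already
    scheduled) with the smallest priority key and starts it at the earliest
    precedence- and resource-feasible time. Assumes demands[a] <= capacity.
    Returns the project makespan.
    """
    n = len(durations)
    finish = {}                           # activity -> finish time
    usage = [0] * (sum(durations) + 1)    # resource units in use per period
    while len(finish) < n:
        eligible = [a for a in range(n) if a not in finish
                    and all(p in finish for p in preds[a])]
        a = min(eligible, key=priority)
        t = max((finish[p] for p in preds[a]), default=0)
        # Shift right until the resource profile admits the activity.
        while any(usage[u] + demands[a] > capacity
                  for u in range(t, t + durations[a])):
            t += 1
        for u in range(t, t + durations[a]):
            usage[u] += demands[a]
        finish[a] = t + durations[a]
    return max(finish.values())
```

Different priority rules (e.g. shortest processing time, `priority=lambda a: durations[a]`) generally produce different makespans, which is exactly the kind of variability the experiment measures across packages and rules.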

Relevance: 30.00%

Abstract:

This paper is a summary of the main contributions of the PhD thesis published in [1]. The main research contributions of the thesis are driven by the research question of how to design simple, yet efficient and robust, run-time adaptive resource allocation schemes within the communication stack of Wireless Sensor Network (WSN) nodes. The thesis addresses several problem domains, with contributions on different layers of the WSN communication stack. The main contributions can be summarized as follows: First, a novel run-time adaptive MAC protocol is introduced, which stepwise allocates the power-hungry radio interface in an on-demand manner when the encountered traffic load requires it. Second, the thesis outlines a methodology for robust, reliable and accurate software-based energy estimation, calculated at network run-time on the sensor node itself. Third, the thesis evaluates several Forward Error Correction (FEC) strategies to adaptively allocate the correctional power of Error Correcting Codes (ECCs) to cope with temporally and spatially variable bit error rates. Fourth, in the context of TCP-based communications in WSNs, the thesis evaluates distributed caching and local retransmission strategies to overcome the performance-degrading effects of packet corruption and transmission failures when transmitting data over multiple hops. The performance of all developed protocols is evaluated on a self-developed real-world WSN testbed, where they achieve superior performance over selected existing approaches, especially where traffic load and channel conditions are subject to rapid variations over time.
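Software-based energy estimation of the kind named in the second contribution is commonly implemented by integrating the time a node spends in each radio state multiplied by that state's current draw. A minimal sketch (the state names, current values and class interface are illustrative assumptions, not the thesis's calibrated model):

```python
# Hypothetical per-state current draw (mA), loosely typical for a
# low-power sensor-node radio.
CURRENT_MA = {"sleep": 0.001, "idle": 0.4, "rx": 19.7, "tx": 17.4}

class EnergyEstimator:
    """On-node software energy estimation: accumulate charge as
    (time in state) x (state current), then multiply by supply voltage."""

    def __init__(self, voltage=3.0):
        self.voltage = voltage
        self.charge_mc = 0.0   # accumulated charge in millicoulombs
        self.state = "sleep"
        self.since = 0.0       # timestamp of the last state change (s)

    def transition(self, new_state, now_s):
        # Book the charge consumed in the state we are leaving.
        self.charge_mc += CURRENT_MA[self.state] * (now_s - self.since)
        self.state, self.since = new_state, now_s

    def energy_mj(self, now_s):
        # Include the charge of the still-open interval in the total.
        pending = CURRENT_MA[self.state] * (now_s - self.since)
        return (self.charge_mc + pending) * self.voltage
```

Hooking `transition` into the radio driver's state changes is enough to give each node a running estimate of its own energy consumption at run-time, without any external measurement hardware.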