925 results for SYSTEMATIC-ERROR CORRECTION
Abstract:
This paper analyses the correction of errors and mistakes made by students in the foreign language teaching classroom. Its goal is to identify typical correction behaviours in language teaching classrooms in Cape Verde and to raise teachers' awareness of better correction practices.
Abstract:
BACKGROUND: Teaching of evidence-based medicine (EBM) has become widespread in medical education. Teaching the teachers (TTT) courses address the increased teaching demand and the need to improve the effectiveness of EBM teaching. We conducted a systematic review to summarise and appraise existing assessment methods for teaching the teachers courses in EBM. METHODS: We searched the PubMed, BioMed, Embase, Cochrane and ERIC databases without language restrictions and included articles that assessed the participants of such courses. Study selection and data extraction were conducted independently by two reviewers. RESULTS: Of 1230 potentially relevant studies, five papers met the selection criteria. There were no specific assessment tools for evaluating the effectiveness of EBM TTT courses. Some of the material available might be useful in initiating the development of such an assessment tool. CONCLUSION: There is a need for the development of educationally sound assessment tools for teaching the teachers courses in EBM, without which it would be impossible to ascertain whether such courses have the desired effect.
Abstract:
OBJECTIVE: In order to improve the quality of our Emergency Medical Services (EMS), to raise bystander cardiopulmonary resuscitation rates and thereby meet what is becoming a universal standard in terms of quality of emergency services, we decided to implement systematic dispatcher-assisted or telephone-CPR (T-CPR) in our medical dispatch center, a non-Advanced Medical Priority Dispatch System. The aim of this article is to describe the implementation process, costs and results following the introduction of this new "quality" procedure. METHODS: This was a prospective study. Over an 8-week period, our EMS dispatchers were given new procedures to provide T-CPR. We then collected data on all non-traumatic cardiac arrests within our state (Vaud, Switzerland) for the following 12 months. For each event, the dispatchers had to record in writing the reason they either ruled out cardiac arrest (CA) or did not propose T-CPR in the event they did suspect CA. All emergency call recordings were reviewed by the medical director of the EMS. The analysis of the recordings and the dispatchers' written explanations were then compared. RESULTS: During the 12-month study period, a total of 497 patients (both adults and children) were identified as having a non-traumatic cardiac arrest. Of this total, 203 cases were excluded and 294 cases were eligible for T-CPR. Of these eligible cases, dispatchers proposed T-CPR on 202 occasions (69% of eligible cases). They also erroneously proposed T-CPR on 17 occasions in which a CA was wrongly identified (false positives); this represents 7.8% of all T-CPR. No costs were incurred to implement our study protocol and procedures. CONCLUSIONS: This study demonstrates that it is possible, using a brief sensitization campaign but without any specific training, to implement systematic dispatcher-assisted cardiopulmonary resuscitation in a non-Advanced Medical Priority Dispatch System such as ours, which had no prior experience with systematic T-CPR. The results in terms of T-CPR delivery rate and false positives are similar to those found in previous studies, and we found them satisfying given the short time frame of this study. Our results demonstrate that it is possible to improve the quality of emergency services at moderate or even no additional cost, which should be of interest to all EMS that do not presently benefit from T-CPR procedures. EMS that currently do not offer T-CPR should consider implementing this technique as soon as possible, and we expect our experience may provide answers to those planning to incorporate T-CPR into their daily practice.
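The delivery and false-positive rates quoted in the abstract above follow directly from the reported case counts. A minimal arithmetic check, in plain Python; the variable names are ours, not the paper's:

```python
# Recomputing the rates reported in the abstract from its raw counts.
eligible = 294          # cases eligible for T-CPR
tcpr_delivered = 202    # eligible cases in which dispatchers proposed T-CPR
false_positives = 17    # T-CPR proposed although no cardiac arrest was present

delivery_rate = tcpr_delivered / eligible
# "all T-CPR" = correctly proposed + erroneously proposed instructions
false_positive_share = false_positives / (tcpr_delivered + false_positives)

print(f"T-CPR delivery rate: {delivery_rate:.0%}")                  # ~69%
print(f"False-positive share of all T-CPR: {false_positive_share:.1%}")  # ~7.8%
```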
Abstract:
Background and Aims: Infliximab (IFX), adalimumab (ADA) and certolizumab pegol (CZP) have similar efficacy for induction and maintenance of clinical response and remission in Crohn's disease (CD). Given the comparable nature of these drugs, patients' preferences may influence the choice of the product. Goal: to identify factors contributing to CD patients' decision in selecting one anti-TNF agent over the others. Methods: A prospective survey was performed among anti-TNF-naïve CD patients. Prior to completion of a questionnaire, patients were provided with a description of the three anti-TNF agents focusing on indications, route of administration, side effects, and scientific evidence of efficacy and safety. Results: One hundred patients (47 female / 53 male, mean age 45±16 yrs) completed the questionnaire. Disease location was ileal, colonic and ileocolonic in 33%, 40% and 27% of patients, respectively. Thirty-six percent preferred ADA as medication of choice, while 28% and 25% preferred CZP and IFX, respectively; 11% were undecided. Patients' decision in selecting an anti-TNF drug was influenced by the following factors: side effects (76%), physician's recommendation (66%), route of administration (54%), efficacy data (52%), time required for therapy administration (27%), recommendations by other CD patients (21%) and interactions with other medications (12%). Conclusions: The majority of patients preferred anti-TNF medications that were administered by subcutaneous injection rather than by intravenous infusion. Side effect profile and physicians' recommendation are two major factors influencing the patients' selection of a specific anti-TNF drug. Patients' concerns about safety and lifestyle habits should be taken into account when prescribing anti-TNF drugs.
Abstract:
We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function, and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical VC dimension, empirical VC entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
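The last remark, that computing the maximal discrepancy penalty amounts to empirical risk minimization with some labels flipped, can be made concrete with a short sketch. The code below is our illustration only, not the paper's: it uses exhaustive 0-1 ERM over axis-aligned decision stumps as a stand-in hypothesis class, and one common convention for the discrepancy (constants differ between papers).

```python
import numpy as np

def stump_predictions(X):
    """Predictions of every axis-aligned decision stump on X, as a
    (n_stumps, n_samples) array of +/-1 labels. A deliberately small
    hypothesis class so that exact 0-1 ERM is feasible by enumeration."""
    n, d = X.shape
    preds = []
    for j in range(d):
        for t in np.unique(X[:, j]):
            p = np.where(X[:, j] <= t, 1, -1)
            preds.append(p)
            preds.append(-p)   # both orientations of the stump
    return np.array(preds)

def maximal_discrepancy_penalty(X, y):
    """Maximal discrepancy between the two halves of the sample.
    Maximizing L1(f) - L2(f) over the class is equivalent to minimizing
    the 0-1 error on the sample with the first-half labels flipped,
    i.e. plain empirical risk minimization on relabelled data."""
    half = len(y) // 2                    # assumes an even sample size
    y_flip = y.copy()
    y_flip[:half] = -y_flip[:half]        # flip the first-half labels
    errors = (stump_predictions(X) != y_flip).mean(axis=1)
    return 1.0 - 2.0 * errors.min()

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = np.where(X[:, 0] + 0.3 * rng.normal(size=200) > 0, 1, -1)
print("maximal discrepancy penalty:", maximal_discrepancy_penalty(X, y))
```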
Abstract:
Summary points:
- The bias introduced by random measurement error will be different depending on whether the error is in an exposure variable (risk factor) or an outcome variable (disease)
- Random measurement error in an exposure variable will bias the estimates of regression slope coefficients towards the null
- Random measurement error in an outcome variable will instead increase the standard error of the estimates and widen the corresponding confidence intervals, making results less likely to be statistically significant
- Increasing sample size will help minimise the impact of measurement error in an outcome variable, but will only make estimates more precisely wrong when the error is in an exposure variable
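These points can be reproduced with a short simulation. The sketch below is illustrative only: the true slope (0.5), the error variances, and the sample size are arbitrary choices of ours, and ordinary least squares stands in for the generic regression discussed above.

```python
import numpy as np

rng = np.random.default_rng(42)
n, true_slope, reps = 500, 0.5, 2000

def ols_slope(x, y):
    """Ordinary least squares slope of y on x."""
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

clean, err_in_x, err_in_y = [], [], []
for _ in range(reps):
    x = rng.normal(size=n)                                 # true exposure
    y = true_slope * x + rng.normal(scale=0.5, size=n)     # true outcome
    clean.append(ols_slope(x, y))
    err_in_x.append(ols_slope(x + rng.normal(size=n), y))  # error in exposure
    err_in_y.append(ols_slope(x, y + rng.normal(size=n)))  # error in outcome

for name, s in [("no error", clean), ("error in exposure", err_in_x),
                ("error in outcome", err_in_y)]:
    print(f"{name:18s} mean slope {np.mean(s):.3f}  SD {np.std(s):.3f}")
# Expected pattern: error in the exposure attenuates the mean slope toward
# the null (about 0.25 here, since the reliability ratio is 1/2), while
# error in the outcome keeps the mean slope near 0.5 but noticeably inflates
# its SD across replications, i.e. wider confidence intervals.
```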
Abstract:
Remote sensing spatial, spectral, and temporal resolutions of images, acquired over a reasonably sized image extent, result in imagery that can be processed to represent land cover over large areas with an amount of spatial detail that is very attractive for monitoring, management, and scientific activities. With Moore's Law alive and well, more and more parallelism is introduced into all computing platforms, at all levels of integration and programming, to achieve higher performance and energy efficiency. Since geometric calibration is one of the most time-consuming steps when using remote sensing images, the aim of this work is to accelerate it by taking advantage of new computing architectures and technologies, focusing especially on exploiting computation over shared-memory multi-threading hardware. A parallel implementation of the most time-consuming process in remote sensing geometric correction has been developed using OpenMP directives. This work compares the performance of the original serial binary against the parallelized implementation on several modern multi-threaded CPU architectures, and discusses how to find the optimum hardware for a cost-effective execution.
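The paper's implementation uses OpenMP directives in compiled code, which cannot be shown directly here; the sketch below is only a schematic Python analogue of the same shared-memory decomposition, splitting the output grid into row blocks that are resampled independently and in parallel. The affine inverse mapping and nearest-neighbour resampling are hypothetical placeholders, not the authors' geometric model.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def correct_block(args):
    """Resample one block of output rows (toy affine model, nearest neighbour)."""
    image, rows, affine = args
    a, b, tx, c, d, ty = affine                 # hypothetical inverse mapping
    h, w = image.shape
    out = np.zeros((len(rows), w), dtype=image.dtype)
    cols = np.arange(w)
    for i, r in enumerate(rows):
        src_x = np.clip(np.round(a * cols + b * r + tx).astype(int), 0, w - 1)
        src_y = np.clip(np.round(c * cols + d * r + ty).astype(int), 0, h - 1)
        out[i] = image[src_y, src_x]
    return out

def geometric_correction(image, affine, n_workers=4, block=64):
    """Split the output grid into row blocks and correct them in parallel."""
    h = image.shape[0]
    blocks = [np.arange(s, min(s + block, h)) for s in range(0, h, block)]
    with ProcessPoolExecutor(max_workers=n_workers) as ex:
        parts = ex.map(correct_block, [(image, rows, affine) for rows in blocks])
    return np.vstack(list(parts))

if __name__ == "__main__":
    img = np.random.default_rng(1).integers(0, 255, size=(512, 512), dtype=np.uint8)
    corrected = geometric_correction(img, affine=(1.0, 0.02, 3.0, -0.01, 1.0, 5.0))
    print(corrected.shape)
```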