49 results for Quality evaluation and certification
Abstract:
The shape of the energy spectrum produced by an x-ray tube is of great importance in mammography. Many anode-filtration combinations have been proposed to obtain the most effective spectrum shape for the image quality-dose relationship. On the other hand, third-generation synchrotrons such as the European Synchrotron Radiation Facility in Grenoble are able to produce a high flux of monoenergetic radiation and are therefore a powerful tool for studying the effect of beam energy on image quality and dose in mammography. An objective method was used to evaluate image quality and dose in mammography with synchrotron radiation and to compare them with standard conventional units. The evaluation was performed systematically over the energy range of interest for mammography, using a global image quality index and measurements of the mean glandular dose. Compared with conventional mammography units, synchrotron radiation greatly improves the image quality-dose relationship, owing to the monochromaticity of the beam and to its high intrinsic collimation, which allows the use of a slit instead of an anti-scatter grid for scatter rejection.
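One hedged way to make the "image quality-dose relationship" concrete is a figure of merit dividing a squared image quality metric by the mean glandular dose; the abstract does not specify the global index actually used, so the sketch below (Python, hypothetical numbers) is only illustrative.

```python
# Illustrative only: the paper's "global image quality index" is not detailed in the
# abstract, so a commonly used figure of merit, CNR^2 / mean glandular dose (MGD),
# stands in for it here.

def figure_of_merit(cnr: float, mgd_mgy: float) -> float:
    """Return CNR^2 per unit mean glandular dose (arbitrary units per mGy)."""
    return cnr ** 2 / mgd_mgy

# Hypothetical values comparing a monochromatic synchrotron beam with a
# conventional anode/filter spectrum for the same breast thickness.
fom_synchrotron = figure_of_merit(cnr=6.0, mgd_mgy=1.0)
fom_conventional = figure_of_merit(cnr=5.0, mgd_mgy=1.4)
print(f"FOM synchrotron:  {fom_synchrotron:.1f}")
print(f"FOM conventional: {fom_conventional:.1f}")
```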
Abstract:
The substantial weight loss achieved with bariatric surgery improves, and in many cases corrects, a large part of the comorbidities induced by obesity, improves quality of life, and reduces the onset of cardiovascular and metabolic diseases in operated patients. The impact of surgical treatment on the patient's health and quality of life also reduces the direct and indirect costs of morbid obesity. However, its effects on mortality have not yet been clearly demonstrated. Preoperative evaluation and long-term follow-up by an experienced team are essential to reduce potential complications, particularly nutritional ones, and the risk of weight regain linked to binge-eating disorders.
Abstract:
Computed tomography (CT) is a modality of choice for the study of the musculoskeletal system for various indications including the study of bone, calcifications, internal derangements of joints (with CT arthrography), as well as periprosthetic complications. However, CT remains intrinsically limited by the fact that it exposes patients to ionizing radiation. Scanning protocols need to be optimized to achieve diagnostic image quality at the lowest radiation dose possible. In this optimization process, the radiologist needs to be familiar with the parameters used to quantify radiation dose and image quality. CT imaging of the musculoskeletal system has certain specificities including the focus on high-contrast objects (i.e., in CT of bone or CT arthrography). These characteristics need to be taken into account when defining a strategy to optimize dose and when choosing the best combination of scanning parameters. In the first part of this review, we present the parameters used for the evaluation and quantification of radiation dose and image quality. In the second part, we discuss different strategies to optimize radiation dose and image quality at CT, with a focus on the musculoskeletal system and the use of novel iterative reconstruction techniques.
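As a concrete example of the dose-side parameters the abstract refers to, here is a minimal sketch using the standard CT dose descriptors (CTDIvol, dose-length product, and a conversion coefficient for effective dose); the numeric values are hypothetical.

```python
# Minimal sketch of common CT dose descriptors (all numeric values are hypothetical).
# DLP (mGy*cm) = CTDIvol (mGy) * scan length (cm)
# Effective dose (mSv) ~= k * DLP, with k a body-region conversion coefficient.

def dose_length_product(ctdi_vol_mgy: float, scan_length_cm: float) -> float:
    return ctdi_vol_mgy * scan_length_cm

def effective_dose_msv(dlp_mgy_cm: float, k_msv_per_mgy_cm: float) -> float:
    return k_msv_per_mgy_cm * dlp_mgy_cm

dlp = dose_length_product(ctdi_vol_mgy=10.0, scan_length_cm=15.0)
dose = effective_dose_msv(dlp, k_msv_per_mgy_cm=0.001)  # hypothetical k value
print(f"DLP = {dlp:.0f} mGy*cm, effective dose = {dose:.2f} mSv")
```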
Abstract:
BACKGROUND: The Swiss breast cancer screening pilot programme was conducted in 3 districts of the French-speaking canton of Vaud (ca. 300,000 resident women) between October 1993 and January 1999. Women aged 50 to 69 were invited by mail every 2 years for a free-of-charge screening mammography (double view, multiple reading). This first-ever organised cancer screening programme in Switzerland showed the feasibility and acceptability of this kind of public health intervention in the liberal Swiss healthcare system, which was the main objective of the pilot programme. This mammographic screening programme was extended to the whole canton in 1999, and contributed to the implementation of similar programmes in 2 neighbouring cantons. OBJECTIVE: To appraise the use, the quality and the effectiveness of the Swiss screening pilot programme. METHODS: About 15,000 women (aged 50-69) were enrolled. Logistic regression analyses were performed separately to identify determinants of initial and subsequent attendance. Standard indicators of quality, effectiveness and impact of the programme were assessed and compared with European recommendations. To this intent, linkage with data from the Vaud Cancer Registry was performed. RESULTS: About half the target population was screened at least once during the pilot trial. Participation was higher among Swiss than foreigners, and among widowed or married women than among single, divorced or separated ones. Attendance also increased with age and with decreasing distance between residence and the dedicated screening centre. Apart from Swiss citizenship, socio-demographic factors were not associated with reattendance. Intensity of prior recruitment, outcome of the previous screening test (positive vs. negative) and indicators of women's health behaviour (time of last mammography prior to initial screen, smoking status) were the main determinants of reattendance. Programme performance and quality indicators were, overall, in line with European Guidelines. They were overall more favourable among 60-69 than 50-59 year-olds and improved over time. CONCLUSION: The objectives of the pilot programme were met. Even if participation should increase in order to reach European standards, performance indicators overall met quality requirements. Ways to improve screening use, quality and effectiveness were devised and taken into account for the generalisation of the programme.
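A minimal sketch, with simulated data and hypothetical variable names, of the kind of logistic-regression analysis of attendance determinants described above; it is not the registry's actual model.

```python
# Hypothetical sketch: logistic regression of screening attendance on a few covariates,
# analogous to the analyses of initial attendance described above. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "attended":    rng.integers(0, 2, n),   # 1 = attended screening (simulated)
    "age":         rng.integers(50, 70, n),
    "swiss":       rng.integers(0, 2, n),   # 1 = Swiss citizenship (simulated)
    "distance_km": rng.uniform(1, 40, n),   # distance to the screening centre
})

X = sm.add_constant(df[["age", "swiss", "distance_km"]])
fit = sm.Logit(df["attended"], X).fit(disp=0)
print(np.exp(fit.params))  # odds ratios for each covariate
```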
Abstract:
BACKGROUND: HSV-1 and HSV-2 cause CNS infections with dissimilar clinico-pathological characteristics that have prognostic and therapeutic implications. OBJECTIVES: To validate a type-specific real-time PCR using MGB/LNA TaqMan probes and to review the virological and clinical data of 25 eligible patients with non-neonatal CNS infections. RESULTS: The real-time PCR was evaluated against conventional PCR (26 CSF samples and 20 quality controls), the LightCycler assay (51 mucocutaneous samples, 8 CSF samples and 32 quality controls) and culture/immunofluorescence (75 mucocutaneous samples) to assess typing with independent methods. TaqMan real-time PCR detected 240 HSV genomes per ml of CSF, a level appropriate for patient management, and provided unambiguous typing for the 104 positive samples (62 HSV-1 and 42 HSV-2) out of the 160 independent clinical samples tested. The HSV type diagnosed by TaqMan real-time PCR predicted the final diagnosis (meningitis versus encephalitis/meningoencephalitis, p<0.001) in 24/25 patients at the time of presentation, in contrast to clinical evaluation. CONCLUSIONS: Our real-time PCR, as a sensitive and specific means of type-specific HSV diagnosis, provides rapid prognostic information for patient management.
Abstract:
ABSTRACT: BACKGROUND: Cardiovascular magnetic resonance (CMR) has favorable characteristics for diagnostic evaluation and risk stratification of patients with known or suspected CAD. CMR utilization for CAD detection is growing fast; however, data on its cost-effectiveness are scarce. The goal of this study was to compare the costs of two strategies for the detection of significant coronary artery stenoses in patients with suspected coronary artery disease (CAD): 1) performing CMR first to assess myocardial ischemia and/or infarct scar, and referring only positive patients (defined as those with ischemia and/or infarct scar) to coronary angiography (CXA), versus 2) a hypothetical strategy of CXA performed in all patients as a single test to detect CAD. METHODS: A subgroup of the European CMR pilot registry was used, including 2,717 consecutive patients who underwent stress CMR. Of these patients, 21% were positive for CAD (ischemia and/or infarct scar), 73% were negative, and 6% were uncertain and underwent additional testing. Diagnostic costs were evaluated using the invoiced costs of each test performed. The cost analysis was performed from a health care payer perspective in the German, United Kingdom, Swiss, and United States health care settings. RESULTS: In the public sectors of the German, United Kingdom, and Swiss health care systems, cost savings from the CMR-driven strategy were 50%, 25% and 23%, respectively, versus outpatient CXA. If CXA was carried out as an inpatient procedure, cost savings were 46%, 50% and 48%, respectively. In the United States context, cost savings were 51% when compared with inpatient CXA, but costs were 8% higher for the CMR strategy versus outpatient CXA. CONCLUSION: This analysis suggests that, from an economic perspective, the use of CMR should be encouraged as a management option for patients with suspected CAD.
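A minimal sketch, assuming hypothetical unit costs, of how the two strategies' expected per-patient costs can be compared; the positive/uncertain fractions are taken from the abstract, and uncertain patients are assumed here, for simplicity, to go on to CXA.

```python
# Hypothetical unit costs; the fractions (21% positive, 6% uncertain) come from the
# abstract above. Uncertain cases are assumed here to proceed to CXA for simplicity.

def cmr_first_cost(cost_cmr: float, cost_cxa: float,
                   frac_positive: float = 0.21, frac_uncertain: float = 0.06) -> float:
    """Expected per-patient cost: CMR for all, CXA only for positive/uncertain patients."""
    return cost_cmr + (frac_positive + frac_uncertain) * cost_cxa

def cxa_for_all_cost(cost_cxa: float) -> float:
    return cost_cxa

cost_cmr, cost_cxa = 800.0, 2500.0  # hypothetical tariffs in an arbitrary currency
c1, c2 = cmr_first_cost(cost_cmr, cost_cxa), cxa_for_all_cost(cost_cxa)
print(f"CMR-first: {c1:.0f}, CXA-for-all: {c2:.0f}, saving: {100 * (1 - c1 / c2):.0f}%")
```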
Abstract:
A method for objectively determining imaging performance for a mammography quality assurance programme for digital systems was developed. The method is based on assessing the visibility of a 0.2-mm spherical microcalcification using a quasi-ideal observer model. It requires the assessment of the spatial resolution (modulation transfer function) and the noise power spectra of the systems. The contrast is measured using a 0.2-mm thick Al sheet and polymethylmethacrylate (PMMA) blocks. The minimal image quality was defined as that giving a target contrast-to-noise ratio (CNR) of 5.4. Several evaluations of this objective method for assessing image quality in mammography quality assurance programmes were performed on computed radiography (CR) and digital radiography (DR) mammography systems. The measurement gives the threshold CNR necessary to reach the minimum standard image quality required with regard to the visibility of a 0.2-mm microcalcification. This method may replace the CDMAM image evaluation and simplify the threshold contrast visibility test used in mammography quality assurance.
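A minimal sketch of the contrast-to-noise ratio measurement behind this threshold, using one common CNR definition (signal difference between the Al and PMMA regions divided by the background noise); pixel data and ROI names are hypothetical.

```python
import numpy as np

TARGET_CNR = 5.4  # minimum-image-quality threshold quoted in the abstract

def contrast_to_noise_ratio(roi_al: np.ndarray, roi_pmma: np.ndarray) -> float:
    """One common CNR definition: |mean(Al ROI) - mean(PMMA ROI)| / std(PMMA ROI)."""
    return abs(roi_al.mean() - roi_pmma.mean()) / roi_pmma.std()

# Hypothetical pixel values for the two regions of interest.
rng = np.random.default_rng(1)
roi_al = rng.normal(loc=1110.0, scale=20.0, size=(50, 50))
roi_pmma = rng.normal(loc=1000.0, scale=20.0, size=(50, 50))

cnr = contrast_to_noise_ratio(roi_al, roi_pmma)
print(f"CNR = {cnr:.1f} -> {'meets' if cnr >= TARGET_CNR else 'below'} the 5.4 target")
```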
Abstract:
BACKGROUND/AIMS: For many therapeutic decisions in Crohn's disease (CD), high-grade evidence is lacking. To assist clinical decision-making, explicit panel-based appropriateness criteria were developed by an international, multidisciplinary expert panel. METHODS: 10 gastroenterologists, 3 surgeons and 2 general practitioners from 12 European countries assessed the appropriateness of therapy for CD using the RAND Appropriateness Method. Their assessment was based on a recent literature review of the subject combined with their own expert clinical judgment. Panelists rated clinical indications and treatment options using a 9-point scale (1 = extremely inappropriate; 9 = extremely appropriate). These scenarios were then discussed in detail at the panel meeting and re-rated. Median ratings and disagreement were used to aggregate ratings into three assessment categories: appropriate (A), uncertain (U) and inappropriate (I). RESULTS: 569 specific indications were rated, dealing with 10 clinical presentations: mild/moderate luminal CD (n = 104), severe CD (n = 126), steroid-dependent CD (n = 25), steroid-refractory CD (n = 37), fistulizing CD (n = 49), fibrostenotic CD (n = 35), maintenance of medical remission of CD (n = 84), maintenance of surgical remission (n = 78), drug safety in pregnancy (n = 24) and use of infliximab (n = 7). Overall, 146 indications (26%) were judged appropriate, 129 (23%) uncertain and 294 (52%) inappropriate. Frank disagreement was low (14% overall), with the greatest disagreement (54% of scenarios) being observed for treatment of steroid-refractory disease. CONCLUSIONS: Detailed explicit criteria for the appropriate use of therapy for CD were developed for the first time by a European expert panel. Disease location, severity and previous treatments were the main factors taken into account. User-friendly access to the EPACT criteria is available via an Internet site, www.epact.ch, allowing prospective evaluation and improvement of the appropriateness of current CD therapy.
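A minimal sketch of how RAND-style ratings are commonly aggregated into the three assessment categories: the median of the 1-9 ratings sets appropriate/uncertain/inappropriate, and a disagreement rule (here an illustrative one-third-in-each-extreme criterion) overrides to uncertain. The panel's exact disagreement definition is not given in the abstract.

```python
from statistics import median

def classify_indication(ratings: list[int]) -> str:
    """Aggregate 1-9 panel ratings into an assessment category (illustrative RAND-style rule)."""
    n = len(ratings)
    low = sum(r <= 3 for r in ratings)    # votes toward "inappropriate"
    high = sum(r >= 7 for r in ratings)   # votes toward "appropriate"
    if low >= n / 3 and high >= n / 3:    # illustrative disagreement rule
        return "uncertain (disagreement)"
    med = median(ratings)
    if med >= 7:
        return "appropriate"
    if med <= 3:
        return "inappropriate"
    return "uncertain"

# Hypothetical ratings from a 15-member panel for two indications.
print(classify_indication([8, 8, 7, 9, 7, 8, 6, 7, 8, 7, 9, 8, 7, 8, 9]))  # appropriate
print(classify_indication([2, 2, 8, 9, 1, 3, 8, 7, 2, 9, 1, 8, 2, 7, 3]))  # disagreement
```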
Abstract:
To evaluate how young physicians in training perceive their patients' cardiovascular risk based on the medical charts and their clinical judgment. Cross-sectional observational study. University outpatient clinic, Lausanne, Switzerland. Two hundred hypertensive patients and 50 non-hypertensive patients with at least one cardiovascular risk factor. The absolute 10-year cardiovascular risk calculated by a computer program based on the Framingham score, adapted for physicians by the WHO/ISH, was compared with the risk perceived clinically by the physicians. Physicians underestimated the 10-year cardiovascular risk of their patients compared with that calculated with the Framingham score. Concordance between the methods was 39% for hypertensive patients and 30% for non-hypertensive patients. Underestimation of cardiovascular risk for hypertensive patients was related to the fact that they had a stabilized systolic blood pressure under 140 mm Hg (OR = 2.1 [1.1; 4.1]). These data show that young physicians in training often have an incorrect perception of the cardiovascular risk of their patients, with a tendency to underestimate it. However, the calculated risk could also be slightly overestimated when applying the Framingham Heart Study model to a Swiss population. To implement a systematic evaluation of risk factors in primary care, greater emphasis should be placed on the teaching of cardiovascular risk evaluation and on the implementation of quality improvement programs.
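A minimal sketch of the concordance measure implied here: the calculated and the perceived 10-year risks are binned into the same categories, and concordance is the proportion of patients placed in the same category by both. Category cut-offs and the data are hypothetical.

```python
# Hypothetical sketch: concordance between calculated and perceived 10-year CV risk.

def risk_category(risk_percent: float) -> str:
    """Bin a 10-year cardiovascular risk into illustrative low/medium/high categories."""
    if risk_percent < 10:
        return "low"
    if risk_percent < 20:
        return "medium"
    return "high"

def concordance(calculated: list[float], perceived: list[float]) -> float:
    same = sum(risk_category(c) == risk_category(p) for c, p in zip(calculated, perceived))
    return same / len(calculated)

calculated = [8.0, 15.0, 25.0, 12.0, 30.0]   # Framingham-based estimates (%), hypothetical
perceived = [5.0, 8.0, 22.0, 14.0, 15.0]     # physicians' perceived risks (%), hypothetical
print(f"Concordance: {concordance(calculated, perceived):.0%}")
```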
Abstract:
Summary: Forensic science - both as a source of and as a remedy for errors potentially leading to judicial error - was studied empirically in this research. The tools used were a comprehensive literature review, experimental tests on the influence of observational biases in fingermark comparison, and semi-structured interviews with heads of forensic science laboratories/units in Switzerland and abroad. The literature review covered, among other areas, the quality of forensic science work in general, the complex interaction between science and law, and specific propositions as to error sources not directly related to that interaction. A list of potential error sources, all the way from the crime scene to the writing of the report, was also established. For the empirical tests, the ACE-V (Analysis, Comparison, Evaluation, and Verification) process of fingermark comparison was selected as an area of special interest for the study of observational biases, owing to its heavy reliance on visual observation and to recent cases of misidentification. The results of the tests performed with forensic science students suggest that the decision-making stages are the most vulnerable to stimuli inducing observational biases. In the semi-structured interviews, eleven senior forensic scientists answered questions on several subjects, for example on potential and existing error sources in their work, on the limitations of what can be done with forensic science, and on the possibilities and tools available to minimise errors. Training and education to improve the quality of forensic science were discussed, together with possible solutions to minimise the risk of errors in forensic science. In addition, the length of time that samples of physical evidence are kept was determined. The results show considerable agreement on most subjects among the international participants. Their opinions on possible explanations for the occurrence of such problems, and on the relative weight of such errors in the three stages of crime scene, laboratory, and report writing, disagree, however, with opinions widely represented in the existing literature. Through the present research it was therefore possible to obtain a better view of the interaction between forensic science and judicial error and to propose practical recommendations to minimise their occurrence.
Abstract:
PURPOSE: Cardiovascular magnetic resonance (CMR) has become a robust and important diagnostic imaging modality in cardiovascular medicine. However, insufficient image quality may compromise its diagnostic accuracy. No standardized criteria are available to assess the quality of CMR studies. We aimed to describe and validate standardized criteria to evaluate the quality of CMR studies including: a) cine steady-state free precession, b) delayed gadolinium enhancement, and c) adenosine stress first-pass perfusion. These criteria will serve for the assessment of image quality in the setting of the Euro-CMR registry. METHOD AND MATERIALS: First, a total of 45 quality criteria were defined (35 qualitative criteria with a score from 0-3, and 10 quantitative criteria). The qualitative score ranged from 0 to 105; the lower the qualitative score, the better the quality. The quantitative criteria were based on the absolute signal intensity (delayed enhancement) and on the signal increase (perfusion) of the anterior/posterior left ventricular wall after gadolinium injection. These criteria were then applied in 30 patients scanned with a 1.5T system and in 15 patients scanned with a 3.0T system. The examinations were jointly interpreted by 3 CMR experts and 1 study nurse. In these 45 patients, the correlation between the results of the quality assessment obtained by the different readers was calculated. RESULTS: On the 1.5T machine, the mean quality score was 3.5. The mean difference between each pair of observers was 0.2 (5.7%) with a mean standard deviation of 1.4. On the 3.0T machine, the mean quality score was 4.4. The mean difference between each pair of observers was 0.3 (6.4%) with a mean standard deviation of 1.6. The quantitative quality assessments between observers were well correlated for the 1.5T machine: R was between 0.78 and 0.99. CONCLUSION: The described criteria for the assessment of CMR image quality are robust and have a low inter-observer variability, especially on 1.5T systems. CLINICAL RELEVANCE/APPLICATION: These criteria will allow the standardization of CMR examinations. They will help to improve the overall quality of examinations and the comparison between clinical studies.
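A minimal sketch, with hypothetical criterion names and ratings, of how the qualitative part of such a score could be summed: 35 criteria rated 0-3 each, giving a total between 0 (best) and 105 (worst).

```python
# Hypothetical sketch of the qualitative quality score: 35 criteria, each rated 0-3,
# summed to a total between 0 (best quality) and 105 (worst quality).

def qualitative_quality_score(criteria_scores: dict[str, int]) -> int:
    assert len(criteria_scores) == 35, "expects one rating per qualitative criterion"
    assert all(0 <= s <= 3 for s in criteria_scores.values()), "ratings must lie in 0-3"
    return sum(criteria_scores.values())

# Illustrative ratings: most criteria rated 0, two with minor artefacts (names are made up).
scores = {f"criterion_{i:02d}": 0 for i in range(1, 36)}
scores["criterion_05"] = 2   # e.g. breathing artefact on the cine images (hypothetical)
scores["criterion_21"] = 1   # e.g. mild wrap-around on the perfusion series (hypothetical)
print(qualitative_quality_score(scores))   # 3, of the same order as the mean scores above
```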
Abstract:
BACKGROUND: The quality of colon cleansing is a major determinant of the quality of colonoscopy. To our knowledge, the impact of bowel preparation on the quality of colonoscopy has not been assessed prospectively in a large multicenter study. Therefore, this study assessed the factors that determine colon-cleansing quality and the impact of cleansing quality on the technical performance and diagnostic yield of colonoscopy. METHODS: Twenty-one centers from 11 countries participated in this prospective observational study. Colon-cleansing quality was assessed on a 5-point scale and was categorized on 3 levels. The clinical indication for colonoscopy, diagnoses, and technical parameters related to colonoscopy were recorded. RESULTS: A total of 5832 patients were included in the study (48.7% men, mean age 57.6 [15.9] years). Cleansing quality was lower in elderly patients and in hospitalized patients. Procedures in poorly prepared patients were longer, more difficult, and more often incomplete. The detection of polyps of any size depended on cleansing quality: odds ratio (OR) 1.73 (95% confidence interval [CI], 1.28-2.36) for intermediate-quality compared with low-quality preparation, and OR 1.46 (95% CI, 1.11-1.93) for high-quality compared with low-quality preparation. For polyps >10 mm in size, the corresponding ORs were 1.0 for low-quality cleansing (reference), 1.83 (95% CI, 1.11-3.05) for intermediate-quality cleansing, and 1.72 (95% CI, 1.11-2.67) for high-quality cleansing. Cancers were not detected less frequently in the case of poor preparation. CONCLUSIONS: Cleansing quality critically determines the quality, difficulty, speed, and completeness of colonoscopy, and is lower in hospitalized patients and in patients with higher levels of comorbid conditions. The proportion of patients who undergo polypectomy increases with higher cleansing quality, whereas colon cancer detection does not seem to depend critically on the quality of bowel preparation.
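A minimal sketch, with hypothetical counts, of how an odds ratio and its 95% confidence interval can be obtained from a 2x2 table of polyp detection versus preparation quality (log/Woolf method); the study's ORs were estimated from the full dataset, so this only illustrates the metric itself.

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int) -> tuple[float, float, float]:
    """OR and 95% CI from a 2x2 table (Woolf / log method):
       a = polyp detected, better prep;      b = no polyp, better prep
       c = polyp detected, low-quality prep; d = no polyp, low-quality prep
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log_or)
    hi = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, lo, hi

# Hypothetical counts, not the study data.
or_, lo, hi = odds_ratio_ci(a=300, b=700, c=200, d=800)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```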
Abstract:
The determination of line crossing sequences between rollerball pens and laser printers presents difficulties that may not be overcome using traditional techniques. This research aimed to study the potential of digital microscopy and 3-D laser profilometry to determine line crossing sequences between a toner line and an aqueous ink line. Different paper types, rollerball pens, and writing pressures were tested. Correct opinions on the sequence were given for all case scenarios using both techniques. When the toner was printed before the ink, a light reflection was observed in all crossing specimens, whereas this was never observed for the other sequence. 3-D laser profilometry, although more time-consuming, had the main advantage of providing quantitative results. The findings confirm the potential of 3-D laser profilometry and demonstrate the efficiency of digital microscopy as a new technique for determining the sequence of line crossings involving rollerball pen ink and toner. With the mass marketing of laser printers and the popularity of rollerball pens, the determination of line crossing sequences between such instruments is encountered by forensic document examiners. This type of crossing presents difficulties for the optical microscopic techniques used for line crossings involving ballpoint pens or gel pens and toner (1-4). Indeed, the rollerball's aqueous ink penetrates through the toner and is absorbed by the fibers of the paper, leaving the examiner with the impression that the toner is above the ink even when it is not (5). Novotny and Westwood (3) investigated the possibility of determining aqueous ink and toner crossing sequences by microscopic observation of the intersection before and after toner removal. A major disadvantage of their approach resides in the destruction of the sample by scraping off the toner line to see what was underneath. The aim of this research was to investigate ways to overcome these difficulties through digital microscopy and three-dimensional (3-D) laser profilometry. The former has been used as a technique for determining sequences between gel pen and toner printing strokes, but provided less conclusive results than an optical stereomicroscope (4). 3-D laser profilometry, which allows one to observe and measure the topography of a surface, has been the subject of a number of recent studies in this area. Berx and De Kinder (6) and Schirripa Spagnolo (7,8) have tested the application of laser profilometry to determine the sequence of intersections of several lines. The results obtained in these studies overcome disadvantages of other methods applied in this area, such as the scanning electron microscope or the atomic force microscope. The main advantages of 3-D laser profilometry include the ease of implementation of the technique and its nondestructive nature, which does not require sample preparation (8-10). Moreover, the technique is reproducible and offers a high degree of freedom along the vertical axis (up to 1000 μm). However, when the paper surface presents a given roughness, and the pen impressions alter the paper to a depth similar to the roughness of the medium, the results are not always conclusive (8). In this case it becomes difficult to distinguish which characteristics can be attributed to the pen impressions and which to the quality of the paper surface. This important limitation is assessed by testing different types of paper of variable quality (different grammage and finishing) and different writing pressures. The authors will therefore assess the limits of the 3-D laser profilometry technique and determine whether the method can overcome such constraints. Second, the authors will investigate the use of digital microscopy because it presents a number of advantages: it is efficient, user-friendly, and provides an objective evaluation and interpretation.
Abstract:
The sample preparation method preceding the urinary erythropoietin (EPO) doping test is based on several concentration and ultrafiltration steps. In order to improve the quality of isoelectric focusing (IEF) gel results and, therefore, the sensitivity of the EPO test, new sample preparation methods relying on affinity purification were recently proposed. This article focuses on the evaluation and validation of disposable immunoaffinity columns targeting both endogenous and recombinant EPO molecules in two World Anti-Doping Agency (WADA) accredited anti-doping laboratories. The use of the columns improved the resolution of the IEF profiles considerably compared with the classical ultrafiltration method, and the columns' ability to preserve the isoform integrity of the endogenous and exogenous EPO molecules was confirmed. Immunoaffinity columns therefore constitute a potent and reliable tool for the preparation of urine samples, and their use will significantly improve the sensitivity and specificity of the current urinary EPO test.