24 results for Science Ability testing
at Université de Lausanne, Switzerland
Abstract:
In the first part of this research, a three-stage programme was defined to increase the information extracted from ink evidence and maximise its usefulness to the criminal and civil justice system. These stages are (a) to develop a standard methodology for analysing ink samples by high-performance thin layer chromatography (HPTLC) in a reproducible way, even when ink samples are analysed at different times, in different locations and by different examiners; (b) to compare ink samples automatically and objectively; and (c) to define and evaluate a theoretical framework for the use of ink evidence in a forensic context. This report focuses on the second of the three stages. Using the calibration and acquisition process described in the previous report, mathematical algorithms are proposed to compare ink samples automatically and objectively. The performance of these algorithms is systematically studied under various chemical and forensic conditions using standard performance tests commonly employed in biometrics studies. The results show that different algorithms are best suited for different tasks. Finally, this report demonstrates how modern analytical and computer technology can be used in the field of ink examination, and how tools developed and successfully applied in other fields of forensic science can help maximise its impact within the field of questioned documents.
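As an illustration of what such a comparison algorithm and its biometric-style evaluation might look like, the following Python sketch scores pairs of HPTLC intensity profiles with a Pearson correlation and estimates an equal error rate from hypothetical same-ink and different-ink score sets. The metric and all names are illustrative assumptions, not the algorithms actually evaluated in the report.

```python
import numpy as np

def pearson_similarity(profile_a, profile_b):
    """Similarity between two HPTLC intensity profiles (one illustrative choice)."""
    return float(np.corrcoef(profile_a, profile_b)[0, 1])

def equal_error_rate(genuine_scores, impostor_scores):
    """Approximate equal error rate (EER), the standard biometric summary where
    the false match rate equals the false non-match rate."""
    genuine = np.asarray(genuine_scores)
    impostor = np.asarray(impostor_scores)
    best_gap, eer = np.inf, None
    for t in np.linspace(-1.0, 1.0, 2001):        # sweep the decision threshold
        fnmr = np.mean(genuine < t)               # same-ink pairs wrongly rejected
        fmr = np.mean(impostor >= t)              # different-ink pairs wrongly accepted
        if abs(fnmr - fmr) < best_gap:
            best_gap, eer = abs(fnmr - fmr), (fnmr + fmr) / 2.0
    return eer
```

Feeding the function scores from many same-ink and different-ink comparisons yields a single operating-point summary, which is how algorithms tuned for different tasks can be ranked against each other.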
Abstract:
Many people regard the concept of hypothesis testing as fundamental to inferential statistics. Various schools of thought, in particular frequentist and Bayesian, have promoted radically different solutions for deciding between competing hypotheses. Comprehensive philosophical comparisons of their advantages and drawbacks are widely available and continue to fuel extensive debate in the literature. More recently, a controversial discussion was initiated by the editorial decision of a scientific journal [1] to refuse any paper submitted for publication that contains null hypothesis testing procedures. Since the large majority of papers published in forensic journals evaluate statistical evidence on the basis of so-called p-values, it is of interest to bring the discussion of this journal's decision to the forensic science community. This paper aims to provide forensic science researchers with a primer on the main concepts and their implications for making informed methodological choices.
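To make the contrast concrete, here is a minimal Python sketch that computes a frequentist p-value and a rough Bayes factor for the same simulated data, using the BIC approximation of Wagenmakers (2007). The simulated dataset and the approximation are illustrative choices, not part of the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(loc=0.3, scale=1.0, size=40)   # simulated data with true mean 0.3

# Frequentist: two-sided one-sample t-test of H0: mu = 0.
t_stat, p_value = stats.ttest_1samp(x, popmean=0.0)

# Bayesian (rough): BIC approximation to the Bayes factor (Wagenmakers, 2007).
n = x.size
rss0 = np.sum(x ** 2)                 # residuals under H0 (mu fixed at 0)
rss1 = np.sum((x - x.mean()) ** 2)    # residuals under H1 (mu estimated)
bic0 = n * np.log(rss0 / n) + 1 * np.log(n)   # one free parameter: sigma
bic1 = n * np.log(rss1 / n) + 2 * np.log(n)   # two free parameters: mu, sigma
bf01 = np.exp((bic1 - bic0) / 2.0)    # BF01 > 1 favours H0, < 1 favours H1

print(f"p-value = {p_value:.3f}, approximate BF01 = {bf01:.3f}")
```

The two outputs answer different questions: the p-value is the probability of data at least this extreme under H0, while the Bayes factor weighs the evidence for H0 against H1 directly, which is the crux of the debate sketched above.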
Abstract:
PURPOSE: This study investigated the maximal cardiometabolic response while running on a lower body positive pressure treadmill (antigravity treadmill (AG)), which reduces body weight (BW) and impact. The AG is used in the rehabilitation of injuries but could have potential for high-speed running if the workload is maximally elevated. METHODS: Fourteen trained (nine male) runners (age 27 ± 5 yr; 10-km personal best, 38.1 ± 1.1 min) completed a treadmill incremental test (CON) to measure aerobic capacity and heart rate (V̇O2max and HRmax). They completed four identical tests (48 h apart, randomized order) on the AG at BW of 100%, 95%, 90%, and 85% (AG100 to AG85). Stride length and rate were measured at peak velocities (Vpeak). RESULTS: V̇O2max (mL·kg⁻¹·min⁻¹) was similar across all conditions (men: CON = 66.6 (3.0), AG100 = 65.6 (3.8), AG95 = 65.0 (5.4), AG90 = 65.6 (4.5), and AG85 = 65.0 (4.8); women: CON = 63.0 (4.6), AG100 = 61.4 (4.3), AG95 = 60.7 (4.8), AG90 = 61.4 (3.3), and AG85 = 62.8 (3.9)). Similar results were found for HRmax, except for AG85 in men and AG100 and AG90 in women, which were lower than CON. Vpeak (km·h⁻¹) in men was 19.7 (0.9) in CON, which was lower than in every other condition: AG100 = 21.0 (1.9) (P < 0.05), AG95 = 21.4 (1.8) (P < 0.01), AG90 = 22.3 (2.1) (P < 0.01), and AG85 = 22.6 (1.6) (P < 0.001). In women, Vpeak (km·h⁻¹) was similar between CON (17.8 (1.1)) and AG100 (19.3 (1.0)) but higher at AG95 = 19.5 (0.4) (P < 0.05), AG90 = 19.5 (0.8) (P < 0.05), and AG85 = 21.2 (0.9) (P < 0.01). CONCLUSIONS: The AG can be used at maximal exercise intensities at BW of 85% to 95%, reaching faster running speeds than normally feasible. The AG could be used for overspeed running programs at the highest metabolic response levels.
Abstract:
The contribution of ink evidence to forensic science is described and supported by an abundant literature and by two standards from the American Society for Testing and Materials (ASTM). The vast majority of the available literature is concerned with the physical and chemical analysis of ink evidence. The relevant ASTM standards mention some principles regarding the comparison of pairs of ink samples and the evaluation of their evidential value. The review of this literature and, more specifically, of the ASTM standards in the light of recent developments in the interpretation of forensic evidence has revealed some potential improvements, which would maximise the benefits of the use of ink evidence in forensic science. This thesis proposes to interpret ink evidence using the widely accepted and recommended Bayes' theorem. This proposition has required the development of a new quality assurance process for the analysis and comparison of ink samples, as well as the definition of a theoretical framework for ink evidence. The proposed technology has been extensively tested using a large dataset of ink samples and state-of-the-art tools commonly used in biometrics. Overall, this research successfully addresses a concrete problem generally encountered in forensic science, where scientists tend to limit the usefulness of the information that is present in various types of evidence by trying to answer the wrong questions.
The declaration of an explicit framework, which defines and formalises their goals and expected contributions to the criminal and civil justice system, enables the determination of their needs in terms of technology and data. The development of this technology and the collection of the data are then economically justified, scientifically structured, and can proceed efficiently.
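A minimal sketch of the odds form of Bayes' theorem that underlies this kind of interpretive framework; the numbers in the usage comment are invented for illustration.

```python
def posterior_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Odds form of Bayes' theorem: posterior odds = likelihood ratio * prior odds.
    The forensic examiner reports the likelihood ratio; the prior odds (and hence
    the posterior odds) belong to the court."""
    return likelihood_ratio * prior_odds

# Example: an ink comparison yielding LR = 100 moves prior odds of 1:1000
# to posterior odds of 1:10.
print(posterior_odds(prior_odds=1 / 1000, likelihood_ratio=100))  # 0.1
```

This division of labour is what lets the examiner answer the right question: how much the evidence shifts the odds, rather than whether a hypothesis is true.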
Abstract:
BACKGROUND: The purpose of this study was to assess decision making in patients with multiple sclerosis (MS) at the earliest clinically detectable time point of the disease. METHODS: Patients with definite MS (n = 109) or with clinically isolated syndrome (CIS, n = 56), a disease duration of 3 months to 5 years, and no or only minor neurological impairment (Expanded Disability Status Scale [EDSS] score 0-2.5) were compared to 50 healthy controls using the Iowa Gambling Task (IGT). RESULTS: The performance of definite MS patients, CIS patients, and controls was comparable for the two main outcomes of the IGT (learning index: p = 0.7; total score: p = 0.6). The IGT learning index was influenced by educational level and the co-occurrence of minor depression. CIS and MS patients who developed a relapse during an observation period of 15 months from IGT testing demonstrated a lower IGT learning index than patients who had no exacerbation (p = 0.02). When controlling for age, gender and education, the difference between relapsing and non-relapsing patients was at the limit of significance (p = 0.06). CONCLUSION: Decision making in a task mimicking real-life decisions is generally preserved in early MS patients as compared to controls. A possible effect of MS relapse activity on decision-making ability in the early phase of MS is also suspected.
Abstract:
The introduction of engineered nanostructured materials into a rapidly increasing number of industrial and consumer products will result in enhanced exposure to engineered nanoparticles. Workplace exposure has been identified as the most likely source of uncontrolled inhalation of engineered aerosolized nanoparticles, but release of engineered nanoparticles may occur at any stage of the lifecycle of (consumer) products. The dynamic development of nanomaterials with possibly unknown toxicological effects poses a challenge for the assessment of nanoparticle-induced toxicity and safety. In this consensus document from a workshop on in-vitro cell systems for nanoparticle toxicity testing (Workshop on 'In-Vitro Exposure Studies for Toxicity Testing of Engineered Nanoparticles', sponsored by the Association for Aerosol Research (GAeF), 5-6 September 2009, Karlsruhe, Germany), an overview is given of the main issues concerning exposure to airborne nanoparticles, lung physiology, biological mechanisms of (adverse) action, in-vitro cell exposure systems, realistic tissue doses, risk assessment and social aspects of nanotechnology. The workshop participants recognized the large potential of in-vitro cell exposure systems for reliable, high-throughput screening of nanoparticle toxicity. For the investigation of lung toxicity, a strong preference was expressed for air-liquid interface (ALI) cell exposure systems (rather than submerged cell exposure systems) as they more closely resemble in-vivo conditions in the lungs and they allow for unaltered and dosimetrically accurate delivery of aerosolized nanoparticles to the cells. An important aspect, which is frequently overlooked, is the comparison of typically used in-vitro dose levels with realistic in-vivo nanoparticle doses in the lung. If we consider average ambient urban exposure and occupational exposure at 5 mg/m³ (the maximum level allowed by the Occupational Safety and Health Administration (OSHA)) as the boundaries of human exposure, the corresponding upper-limit range of nanoparticle flux delivered to the lung tissue is 3×10⁻⁵ to 5×10⁻³ μg/h per cm² of lung tissue and 2-300 particles/h per (epithelial) cell. This range can be easily matched and even exceeded by almost all currently available cell exposure systems. The consensus statement includes a set of recommendations for conducting in-vitro cell exposure studies with pulmonary cell systems and identifies urgent needs for future development. As these issues are crucial for the introduction of safe nanomaterials into the marketplace and the living environment, they deserve more attention and more interaction between biologists and aerosol scientists. The members of the workshop believe that further advances in in-vitro cell exposure studies would be greatly facilitated by a more active role of the aerosol scientists. The technical know-how for developing and running ALI in-vitro exposure systems is available in the aerosol community, and at the same time biologists/toxicologists are required for proper assessment of the biological impact of nanoparticles.
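For a sense of where a flux figure of this kind comes from, the back-of-the-envelope Python sketch below converts the OSHA occupational limit into a tissue flux. The ventilation rate, deposition fraction and alveolar surface area are loudly assumed round numbers, not values taken from the consensus document.

```python
# Rough order-of-magnitude check of the upper end of the quoted dose range.
AIR_CONC_UG_M3 = 5000.0      # occupational limit: 5 mg/m3, expressed in ug/m3
VENTILATION_M3_H = 1.2       # assumed minute ventilation for light work, m3/h
DEPOSITION_FRACTION = 0.3    # assumed alveolar deposition fraction for nanoparticles
LUNG_AREA_CM2 = 5.0e5        # assumed alveolar surface area (~50 m2)

deposited_ug_per_h = AIR_CONC_UG_M3 * VENTILATION_M3_H * DEPOSITION_FRACTION
flux_ug_h_cm2 = deposited_ug_per_h / LUNG_AREA_CM2
print(f"{flux_ug_h_cm2:.1e} ug/h/cm2")   # ~3.6e-03, inside the quoted upper range
```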
Abstract:
A test kit based on living, lyophilized bacterial bioreporters that emit bioluminescence in response to arsenite and arsenate was applied during a field campaign in six villages across Bangladesh. Bioreporter field measurements of arsenic in groundwater from tube wells were in satisfactory agreement with the results of spectroscopic analyses of the same samples conducted in the lab. The practicability of the bioreporter test in terms of logistics and material requirements, suitability for high sample throughput, and waste disposal was much better than that of two commercial chemical test kits that were included as references. The campaigns furthermore demonstrated large local heterogeneity of arsenic in groundwater, underscoring the use of well switching as an effective remedy to avoid high arsenic exposure.
Abstract:
Introduction: According to guidelines, patients with coronary artery disease (CAD) should undergo revascularization if myocardial ischemia is present. While coronary angiography (CXA) allows the morphological assessment of CAD, fractional flow reserve (FFR) has proved to be a complementary invasive test to assess the functional significance of CAD, i.e. to detect ischemia. Perfusion cardiac magnetic resonance (CMR) has turned out to be a robust non-invasive technique to assess myocardial ischemia. The objective is to compare the cost-effectiveness ratio - defined as the cost per patient correctly diagnosed - of two algorithms used to diagnose hemodynamically significant CAD in relation to the pretest likelihood of CAD: (1) CMR to assess ischemia before referring positive patients to CXA (CMR + CXA); (2) CXA in all patients combined with an FFR test in patients with angiographically positive stenoses (CXA + FFR). Methods: The costs, evaluated from the health care system perspective in the Swiss, German, United Kingdom (UK) and United States (US) contexts, included public prices of the different tests considered as outpatient procedures, the costs of complications and the costs induced by diagnostic errors (false negatives). The effectiveness criterion was the ability to accurately identify a patient with significant CAD. Test performances used in the model were based on the clinical literature. Using a mathematical model, we compared the cost-effectiveness ratio of both algorithms for hypothetical patient cohorts with different pretest likelihoods of CAD. Results: The cost-effectiveness ratio decreased hyperbolically with increasing pretest likelihood of CAD for both strategies. CMR + CXA and CXA + FFR were equally cost-effective at a pretest likelihood of CAD of 62% in Switzerland, 67% in Germany, 83% in the UK and 84% in the US, with costs of CHF 5'794, EUR 1'472, £2'685 and $2'126 per patient correctly diagnosed. Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. Implications for the health care system/professionals/patients/society: These results facilitate decision making for the clinical use of new generations of imaging procedures to detect ischemia. They show to what extent the cost-effectiveness of diagnosing CAD depends on the prevalence of the disease.
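The mathematical model itself is not given in the abstract, but a sequential two-test strategy of this kind can be sketched as follows. All sensitivities, specificities and unit costs in this Python example are hypothetical placeholders, chosen only to reproduce qualitatively why the ratio falls as pretest likelihood rises.

```python
def cost_per_correct(prev, sens, spec, cost_triage, cost_confirm):
    """Cost per patient with significant CAD correctly identified, for a strategy
    where everyone gets a triage test and only triage-positives get the second
    test (assumed near-perfect in those referred, to keep the sketch simple)."""
    p_referred = prev * sens + (1 - prev) * (1 - spec)   # triage positives
    expected_cost = cost_triage + p_referred * cost_confirm
    true_positives = prev * sens                         # effectiveness criterion
    return expected_cost / true_positives

# Hypothetical numbers: CMR triage before CXA vs. CXA for all with FFR confirmation.
for prev in (0.2, 0.5, 0.8):
    cmr_cxa = cost_per_correct(prev, sens=0.89, spec=0.87, cost_triage=900, cost_confirm=1500)
    cxa_ffr = cost_per_correct(prev, sens=0.95, spec=0.90, cost_triage=1500, cost_confirm=700)
    print(f"pretest {prev:.0%}: CMR+CXA {cmr_cxa:.0f} vs CXA+FFR {cxa_ffr:.0f}")
```

Because the denominator grows with prevalence while the per-patient cost grows only modestly, the cost per correct diagnosis falls roughly hyperbolically, matching the shape reported in the results.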
Abstract:
The vast territories that were radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure, leading to informative contamination maps. Exact measurements ("hard" data) are combined with secondary information on local uncertainties (treated as "soft" data) to generate a science-based uncertainty assessment of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space, whereas no assumption about the underlying distribution is made and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data. Finally, a separate uncertainty analysis was conducted to evaluate the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available at certain populated sites. The analysis illustrates the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and in terms of estimation error assessment, both of which are useful features for the Chernobyl fallout study.
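The hard/soft distinction can be made concrete at a single unsampled location with a toy Bayesian update. The Python sketch below combines a Gaussian prior with one exact measurement and one interval-censored ("soft") observation on a value grid; it is a one-point caricature with invented numbers, not the spatial BME machinery.

```python
import numpy as np
from scipy import stats

grid = np.linspace(0.0, 20.0, 2001)                # candidate contamination values
step = grid[1] - grid[0]
prior = stats.norm.pdf(grid, loc=8.0, scale=4.0)   # prior, e.g. from a regional trend

# Hard datum: exact measurement of 5.0 with assumed measurement noise sd = 1.0.
hard_lik = stats.norm.pdf(5.0, loc=grid, scale=1.0)

# Soft datum: a value known only to lie in the interval [3, 7].
soft_lik = stats.norm.cdf(7.0, loc=grid, scale=1.0) - stats.norm.cdf(3.0, loc=grid, scale=1.0)

posterior = prior * hard_lik * soft_lik
posterior /= posterior.sum() * step                # normalise on the grid

posterior_mean = (grid * posterior).sum() * step
print(f"posterior mean = {posterior_mean:.2f}")
```

The soft likelihood narrows the posterior relative to using the hard datum alone, which is the mechanism behind the mapping-accuracy improvement reported above.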
Abstract:
Miniature diffusion size classifiers (miniDiSC) are novel handheld devices to measure ultrafine particles (UFP). UFP have been linked to the development of cardiovascular and pulmonary diseases; thus, detection and quantification of these particles are important for evaluating their potential health hazards. As part of the UFP exposure assessments of highway maintenance workers in western Switzerland, we compared a miniDiSC with a portable condensation particle counter (P-TRAK). In addition, we performed stationary measurements with a miniDiSC and a scanning mobility particle sizer (SMPS) at a site immediately adjacent to a highway. Measurements with the miniDiSC and the P-TRAK correlated well (r = 0.84), but average particle numbers from the miniDiSC were 30%-60% higher. This difference was significantly increased for mean particle diameters below 40 nm. The correlation between the miniDiSC and the SMPS during stationary measurements was very high (r = 0.98), although particle numbers from the miniDiSC were 30% lower. Differences between the three devices were attributed to the different cutoff diameters for detection. Correction for this size-dependent effect led to very similar results across all counters. We did not observe any significant influence of other particle characteristics. Our results suggest that the miniDiSC provides accurate particle number concentrations and geometric mean diameters at traffic-influenced sites, making it a useful tool for personal exposure assessment in such settings.
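The cutoff-diameter explanation can be illustrated with a lognormal size distribution: the fraction of particles each instrument "sees" depends on its lower cutoff. All numbers in this Python sketch (geometric mean diameter, geometric standard deviation, cutoffs) are assumed values for illustration, not the study's measurements.

```python
import numpy as np
from scipy import stats

gmd_nm, gsd = 40.0, 1.8          # assumed geometric mean diameter and geometric SD

def fraction_counted(cutoff_nm):
    """Fraction of a lognormal size distribution above a counter's lower cutoff."""
    z = (np.log(cutoff_nm) - np.log(gmd_nm)) / np.log(gsd)
    return 1.0 - stats.norm.cdf(z)

for cutoff in (10.0, 20.0):       # e.g. a low-cutoff vs a higher-cutoff counter
    print(cutoff, round(fraction_counted(cutoff), 3))
# The ratio of the two fractions gives a first-order correction factor between counters.
```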
Abstract:
The research reported in this series of articles aimed (1) to automate the search of questioned ink specimens in ink reference collections and (2) to evaluate the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way; the latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinions and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
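One common way to obtain such a probabilistic evidential value is a score-based likelihood ratio. The Python sketch below fits Gaussian models to hypothetical calibration scores from same-ink and different-ink comparisons and evaluates a questioned score; all scores are invented placeholders, and the Gaussian choice is an assumption rather than the model used in the paper.

```python
import numpy as np
from scipy import stats

# Hypothetical calibration scores from pairs of known origin.
same_ink_scores = np.array([0.91, 0.88, 0.95, 0.90, 0.93, 0.87])
diff_ink_scores = np.array([0.42, 0.55, 0.38, 0.61, 0.47, 0.52])

f_same = stats.norm(same_ink_scores.mean(), same_ink_scores.std(ddof=1))
f_diff = stats.norm(diff_ink_scores.mean(), diff_ink_scores.std(ddof=1))

def likelihood_ratio(score: float) -> float:
    """LR = p(score | same source) / p(score | different source)."""
    return f_same.pdf(score) / f_diff.pdf(score)

print(likelihood_ratio(0.89))   # a large LR supports the same-source proposition
```

Reporting the LR rather than a categorical match/non-match opinion is what makes the assessment transparent: the model, the calibration data and the resulting number can all be scrutinised.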
Abstract:
Current measures of ability emotional intelligence (EI)--including the well-known Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT)--suffer from several limitations, including low discriminant validity and questionable construct and incremental validity. We show that the MSCEIT is largely predicted by personality dimensions, general intelligence, and demographics, with multiple R's for the MSCEIT branches of up to .66; for the general EI factor this relation was even stronger (multiple R = .76). Concerning the factor structure of the MSCEIT, we found support for four first-order factors, which had differential relations with personality, but no support for a higher-order global EI factor. We discuss implications for employing the MSCEIT, including (a) using the single branch scores rather than the total score, (b) always controlling for personality and general intelligence to ensure unbiased parameter estimates for the EI factors, and (c) correcting for measurement error. Failure to account for these methodological aspects may severely compromise predictive validity testing. We also discuss avenues for the improvement of ability-based tests.
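Recommendation (b) amounts to an incremental validity test. The following Python sketch, on simulated data with invented effect sizes, compares the variance explained with and without an EI branch score once personality and general intelligence are already in the model; every variable name and coefficient here is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
personality = rng.normal(size=n)
intelligence = rng.normal(size=n)
# EI branch deliberately overlaps with the control variables, as the paper reports.
ei_branch = 0.6 * personality + 0.4 * intelligence + 0.7 * rng.normal(size=n)
outcome = 0.5 * personality + 0.3 * intelligence + 0.1 * ei_branch + rng.normal(size=n)

def r_squared(X, y):
    """R^2 from an ordinary least squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_controls = r_squared(np.column_stack([personality, intelligence]), outcome)
r2_full = r_squared(np.column_stack([personality, intelligence, ei_branch]), outcome)
print(f"incremental R^2 of the EI branch = {r2_full - r2_controls:.3f}")
```

A small increment despite a sizeable raw correlation is exactly the pattern that makes controlling for personality and intelligence indispensable.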
Abstract:
Recently, the pharmaceutical industry developed a new class of therapeutics called Selective Androgen Receptor Modulators (SARMs) to replace the synthetic anabolic drugs used in medical treatments. Since the beginning of anti-doping testing in sports in the 1970s, steroids have been the most frequently detected drugs, mainly used for their anabolic properties. The major advantage of SARMs is their reduced androgenic activity, which is the main source of side effects following the administration of anabolic agents. In 2010, the Swiss laboratory for doping analyses reported the first case of SARM abuse detected during in-competition testing. The analytical steps leading to this finding are described in this paper. Screening and confirmation results were obtained by liquid chromatography tandem mass spectrometry (LC-MS/MS) analyses. Additional information regarding the metabolism of the SARM S-4 was obtained by ultra-high-pressure liquid chromatography coupled to a quadrupole time-of-flight mass spectrometer (UHPLC-QTOF-MS).