920 results for Science Ability testing


Relevance: 30.00%

Abstract:

PURPOSE: This study investigated the maximal cardiometabolic response while running on a lower-body positive pressure treadmill (antigravity treadmill, AG), which reduces body weight (BW) and impact. The AG is used in the rehabilitation of injuries but could have potential for high-speed running if workload can be maximally elevated. METHODS: Fourteen trained runners (nine male; age 27 ± 5 yr; 10-km personal best, 38.1 ± 1.1 min) completed a treadmill incremental test (CON) to measure aerobic capacity and maximal heart rate (V̇O2max and HRmax). They completed four identical tests (48 h apart, randomized order) on the AG at BW of 100%, 95%, 90%, and 85% (AG100 to AG85). Stride length and rate were measured at peak velocities (Vpeak). RESULTS: V̇O2max (mL·kg⁻¹·min⁻¹) was similar across all conditions (men: CON = 66.6 (3.0), AG100 = 65.6 (3.8), AG95 = 65.0 (5.4), AG90 = 65.6 (4.5), and AG85 = 65.0 (4.8); women: CON = 63.0 (4.6), AG100 = 61.4 (4.3), AG95 = 60.7 (4.8), AG90 = 61.4 (3.3), and AG85 = 62.8 (3.9)). Similar results were found for HRmax, except for AG85 in men and AG100 and AG90 in women, which were lower than CON. Vpeak (km·h⁻¹) in men was 19.7 (0.9) in CON, which was lower than every other condition: AG100 = 21.0 (1.9) (P < 0.05), AG95 = 21.4 (1.8) (P < 0.01), AG90 = 22.3 (2.1) (P < 0.01), and AG85 = 22.6 (1.6) (P < 0.001). In women, Vpeak (km·h⁻¹) was similar between CON (17.8 (1.1)) and AG100 (19.3 (1.0)) but higher at AG95 = 19.5 (0.4) (P < 0.05), AG90 = 19.5 (0.8) (P < 0.05), and AG85 = 21.2 (0.9) (P < 0.01). CONCLUSIONS: The AG can be used at maximal exercise intensities at BW of 85% to 95%, reaching faster running speeds than normally feasible. The AG could be used for overspeed running programs at the highest metabolic response levels.

Relevance: 30.00%

Abstract:

The contribution of ink evidence to forensic science is described and supported by an abundant literature, including two standards from the American Society for Testing and Materials (ASTM). The vast majority of the available literature is concerned with the physical and chemical analysis of ink evidence. The relevant ASTM standards mention some principles regarding the comparison of pairs of ink samples and the evaluation of their evidential value. A review of this literature, and more specifically of the ASTM standards, in the light of recent developments in the interpretation of forensic evidence shows clear potential for improving the use of ink evidence and its impact on criminal investigation. This thesis proposes to interpret ink evidence within the widely accepted and recommended framework of Bayes' theorem. This proposition required the development of a new quality assurance process for the analysis and comparison of ink samples, as well as the definition of a theoretical framework for ink evidence. The proposed methodology has been extensively tested using a large dataset of ink samples created for this purpose and state-of-the-art tools commonly used in biometrics. Overall, this research answers a concrete problem generally encountered in forensic science, where scientists tend to limit the usefulness of the information present in various types of evidence by trying to answer the wrong questions. The declaration of an explicit framework, which defines and formalises their goals and expected contributions to the criminal and civil justice system, enables the determination of their needs in terms of technology and data. The development of this technology and the collection of the relevant data can then be justified economically, structured scientifically, and carried out efficiently.
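The Bayesian framework recommended above is usually applied in its odds form, where a likelihood ratio summarises the evidential value of an ink comparison. A minimal sketch, with purely illustrative numbers (none taken from the thesis):

```python
# Bayes' theorem in odds form: posterior odds = likelihood ratio x prior odds.
# LR = P(evidence | same source) / P(evidence | different source).

def posterior_odds(prior_odds: float, lr: float) -> float:
    """Update the prior odds of a same-source proposition with the
    likelihood ratio of the ink comparison."""
    return prior_odds * lr

# Hypothetical example: prior odds of 1:100 that two ink entries share a
# source, and a comparison result 20 times more probable under the
# same-source hypothesis than under the different-source hypothesis.
prior = 1 / 100
lr = 20.0
print(posterior_odds(prior, lr))  # posterior odds of about 0.2, i.e. 1:5
```

The point of the framework is that the scientist reports only the likelihood ratio; the prior odds belong to the court.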

Relevance: 30.00%

Abstract:

Host-choice experiments were carried out with rodent and bat ectoparasites on Ilha Grande, state of Rio de Janeiro, Brazil. We constructed experimental chambers that enclosed three different rodent or bat host species, and then introduced a selected set of ectoparasitic arthropods. When given the opportunity to choose among host species, the ectoparasites showed a strong tendency to select their primary hosts, and reject novel host species. These kinds of simple experiments can be valuable tools for assessing the ability of ectoparasites to locate and discern differences between host species, and make choices about which hosts to infest, and which hosts to avoid.

Relevance: 30.00%

Abstract:

BACKGROUND: The purpose of this study was to assess decision making in patients with multiple sclerosis (MS) at the earliest clinically detectable time point of the disease. METHODS: Patients with definite MS (n = 109) or with clinically isolated syndrome (CIS, n = 56), a disease duration of 3 months to 5 years, and no or only minor neurological impairment (Expanded Disability Status Scale [EDSS] score 0-2.5) were compared to 50 healthy controls using the Iowa Gambling Task (IGT). RESULTS: The performance of definite MS patients, CIS patients, and controls was comparable for the two main outcomes of the IGT (learning index: p = 0.7; total score: p = 0.6). The IGT learning index was influenced by educational level and the co-occurrence of minor depression. CIS and MS patients who developed a relapse during a 15-month observation period dating from IGT testing demonstrated a lower IGT learning index than patients who had no exacerbation (p = 0.02). When controlling for age, gender, and education, the difference between relapsing and non-relapsing patients was at the limit of significance (p = 0.06). CONCLUSION: Decision making in a task mimicking real-life decisions is generally preserved in early MS patients as compared to controls. The findings also suggest that MS relapse activity may impair decision-making ability even in the early phase of MS.
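The IGT outcomes above are computed from the sequence of deck choices. The abstract does not specify the exact formula, so the block size and index definition below are assumptions reflecting one common operationalisation (decks C and D advantageous, net score per 20-trial block, learning index as late minus early net scores):

```python
def igt_net_scores(choices, block_size=20):
    """Net score per block of trials: (C+D picks) - (A+B picks).
    In the standard 100-trial IGT, decks C and D are advantageous."""
    blocks = [choices[i:i + block_size] for i in range(0, len(choices), block_size)]
    return [sum(c in "CD" for c in b) - sum(c in "AB" for c in b) for b in blocks]

def learning_index(choices):
    """Assumed definition: net score of the last two blocks minus the net
    score of the first two blocks (positive = learning to prefer C/D)."""
    s = igt_net_scores(choices)
    return sum(s[-2:]) - sum(s[:2])

# 100 simulated trials drifting from the bad decks toward the good ones:
choices = "AB" * 20 + "AC" * 10 + "CD" * 20
print(igt_net_scores(choices))  # [-20, -20, 0, 20, 20]
print(learning_index(choices))  # 80
```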

Relevance: 30.00%

Abstract:

BACKGROUND: Epidermal growth factor receptor (EGFR) and its downstream factors KRAS and BRAF are mutated in several types of cancer, affecting the clinical response to EGFR inhibitors. Mutations in the EGFR kinase domain predict sensitivity to the tyrosine kinase inhibitors gefitinib and erlotinib in lung adenocarcinoma, while activating point mutations in KRAS and BRAF confer resistance to the anti-EGFR monoclonal antibody cetuximab in colorectal cancer. The development of new-generation methods for systematic mutation screening of these genes will allow more appropriate therapeutic choices. METHODS: We describe a high resolution melting (HRM) assay for mutation detection in EGFR exons 19-21, KRAS codon 12/13 and BRAF V600 using formalin-fixed paraffin-embedded samples. Somatic variation of KRAS exon 2 was also analysed by massively parallel pyrosequencing of amplicons with the GS Junior 454 platform. RESULTS: We tested 120 routine diagnostic specimens from patients with colorectal or lung cancer. Mutations in KRAS, BRAF and EGFR were observed in 41.9%, 13.0% and 11.1% of the overall samples, respectively, and were mutually exclusive. For KRAS, six types of substitutions were detected (17 G12D, 9 G13D, 7 G12C, 2 G12A, 2 G12V, 2 G12S), while V600E accounted for all the BRAF activating mutations. Regarding EGFR, two cases showed exon 19 deletions (delE746-A750 and delE746-T751insA) and another two showed substitutions in exon 21 (one showed L858R together with the resistance mutation T790M in exon 20, and the other had the P848L mutation). Consistent with earlier reports, our results show that KRAS and BRAF mutation frequencies in colorectal cancer were 44.3% and 13.0%, respectively, while EGFR mutations were detected in 11.1% of the lung cancer specimens. Ultra-deep amplicon pyrosequencing successfully validated the HRM results and allowed detection and quantitation of KRAS somatic mutations. 
CONCLUSIONS: HRM is a rapid and sensitive method for moderate-throughput, cost-effective screening of oncogene mutations in clinical samples. Compared with Sanger sequencing for validation, next-generation sequencing provides more accurate quantitative results on somatic variation and can be run at a higher throughput scale.

Relevance: 30.00%

Abstract:

The introduction of engineered nanostructured materials into a rapidly increasing number of industrial and consumer products will result in enhanced exposure to engineered nanoparticles. Workplace exposure has been identified as the most likely source of uncontrolled inhalation of engineered aerosolized nanoparticles, but release of engineered nanoparticles may occur at any stage of the lifecycle of (consumer) products. The dynamic development of nanomaterials with possibly unknown toxicological effects poses a challenge for the assessment of nanoparticle-induced toxicity and safety. In this consensus document from a workshop on in-vitro cell systems for nanoparticle toxicity testing (Workshop on 'In-Vitro Exposure Studies for Toxicity Testing of Engineered Nanoparticles', sponsored by the Association for Aerosol Research (GAeF), 5-6 September 2009, Karlsruhe, Germany), an overview is given of the main issues concerning exposure to airborne nanoparticles, lung physiology, biological mechanisms of (adverse) action, in-vitro cell exposure systems, realistic tissue doses, risk assessment, and social aspects of nanotechnology. The workshop participants recognized the large potential of in-vitro cell exposure systems for reliable, high-throughput screening of nanoparticle toxicity. For the investigation of lung toxicity, a strong preference was expressed for air-liquid interface (ALI) cell exposure systems (rather than submerged cell exposure systems), as they more closely resemble in-vivo conditions in the lungs and they allow for unaltered and dosimetrically accurate delivery of aerosolized nanoparticles to the cells. An important aspect, which is frequently overlooked, is the comparison of typically used in-vitro dose levels with realistic in-vivo nanoparticle doses in the lung.

If we consider average ambient urban exposure and occupational exposure at 5 mg/m³ (the maximum level allowed by the Occupational Safety and Health Administration (OSHA)) as the boundaries of human exposure, the corresponding upper-limit range of nanoparticle flux delivered to the lung tissue is 3×10⁻⁵ to 5×10⁻³ µg/h per cm² of lung tissue and 2-300 particles/h per (epithelial) cell. This range can easily be matched and even exceeded by almost all currently available cell exposure systems. The consensus statement includes a set of recommendations for conducting in-vitro cell exposure studies with pulmonary cell systems and identifies urgent needs for future development. As these issues are crucial for the introduction of safe nanomaterials into the marketplace and the living environment, they deserve more attention and more interaction between biologists and aerosol scientists. The members of the workshop believe that further advances in in-vitro cell exposure studies would be greatly facilitated by a more active role for aerosol scientists. The technical know-how for developing and running ALI in-vitro exposure systems is available in the aerosol community, while biologists/toxicologists are required for proper assessment of the biological impact of nanoparticles.
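The arithmetic behind such flux estimates can be sketched as follows. Every parameter value here (ventilation rate, alveolar deposition fraction, alveolar surface area) is an illustrative assumption, not a figure from the consensus document; only the 5 mg/m³ OSHA ceiling comes from the text above:

```python
# Back-of-the-envelope estimate of nanoparticle mass flux to lung tissue.
# All parameter values below are illustrative assumptions.
C = 5e3        # aerosol mass concentration, ug/m^3 (5 mg/m^3 OSHA ceiling)
V = 0.54       # air volume breathed per hour, m^3/h (~9 L/min ventilation)
f_dep = 0.3    # assumed fraction of inhaled particles depositing in alveoli
A = 1.4e6      # assumed alveolar surface area, cm^2 (~140 m^2)

mass_flux = C * V * f_dep / A  # ug per hour per cm^2 of lung tissue
print(f"{mass_flux:.1e} ug/h/cm^2")
```

With these assumptions the flux comes out around 6×10⁻⁴ µg/h/cm², i.e. inside the 3×10⁻⁵ to 5×10⁻³ upper-limit range quoted above.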

Relevance: 30.00%

Abstract:

Influenza surveillance networks must detect early the viruses that will cause the forthcoming annual epidemics and isolate the strains for further characterization. We obtained the highest sensitivity (95.4%) with a diagnostic tool that combined a shell-vial assay and reverse transcription-PCR on cell culture supernatants at 48 h, and indeed, recovered the strain

Relevance: 30.00%

Abstract:

Nucleic acid amplification techniques are now commonly used to diagnose viral diseases and manage patients with such illnesses. These techniques have had a rapid but unconventional route of development over the last 30 years, with the discovery and introduction of several assays into clinical diagnosis. The increase in the number of commercially available methods has facilitated the use of this technology in the majority of laboratories worldwide. This technology has reduced the use of other techniques, such as viral culture-based methods and serological assays, in the clinical virology laboratory. Moreover, nucleic acid amplification techniques are now the reference methods and also the most useful assays for the diagnosis of several diseases. The introduction of these techniques and their automation provides new opportunities for the clinical laboratory to affect patient care. The main objective in performing nucleic acid tests in this field is to provide timely results useful for high-quality patient care at a reasonable cost, because rapid results are associated with improvements in patient care. The use of amplification techniques such as polymerase chain reaction, real-time polymerase chain reaction or nucleic acid sequence-based amplification for virus detection, genotyping and quantification offers advantages such as high sensitivity and reproducibility, as well as a broad dynamic range. This review is an up-to-date overview of the main nucleic acid techniques, their clinical applications, and the special challenges and opportunities that these techniques currently provide for the clinical virology laboratory.

Relevance: 30.00%

Abstract:

A test kit based on living, lyophilized bacterial bioreporters that emit bioluminescence in response to arsenite and arsenate was applied during a field campaign in six villages across Bangladesh. Bioreporter field measurements of arsenic in groundwater from tube wells were in satisfactory agreement with the results of spectroscopic analyses of the same samples conducted in the lab. The practicability of the bioreporter test in terms of logistics and material requirements, suitability for high sample throughput, and waste disposal was much better than that of two commercial chemical test kits that were included as references. The campaigns furthermore demonstrated large local heterogeneity of arsenic in groundwater, underscoring the usefulness of well switching as an effective remedy to avoid high arsenic exposure.

Relevance: 30.00%

Abstract:

Introduction: According to guidelines, patients with coronary artery disease (CAD) should undergo revascularization if myocardial ischemia is present. While coronary angiography (CXA) allows the morphological assessment of CAD, fractional flow reserve (FFR) has proved to be a complementary invasive test to assess the functional significance of CAD, i.e. to detect ischemia. Perfusion cardiac magnetic resonance (CMR) has turned out to be a robust non-invasive technique to assess myocardial ischemia. The objective is to compare the cost-effectiveness ratio - defined as the cost per patient correctly diagnosed - of two algorithms used to diagnose hemodynamically significant CAD in relation to the pretest likelihood of CAD: 1) CMR to assess ischemia before referring positive patients to CXA (CMR + CXA); 2) CXA in all patients, combined with an FFR test in patients with angiographically positive stenoses (CXA + FFR). Methods: The costs, evaluated from the health care system perspective in the Swiss, German, United Kingdom (UK) and United States (US) contexts, included public prices of the different tests considered as outpatient procedures, the costs of complications, and the costs induced by diagnostic errors (false negatives). The effectiveness criterion was the ability to accurately identify a patient with significant CAD. Test performances used in the model were based on the clinical literature. Using a mathematical model, we compared the cost-effectiveness ratio of both algorithms for hypothetical patient cohorts with different pretest likelihoods of CAD. Results: The cost-effectiveness ratio decreased hyperbolically with increasing pretest likelihood of CAD for both strategies. CMR + CXA and CXA + FFR were equally cost-effective at a pretest likelihood of CAD of 62% in Switzerland, 67% in Germany, 83% in the UK and 84% in the US, with costs of CHF 5'794, EUR 1'472, £2'685 and $2'126 per patient correctly diagnosed.

Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. Implications for the health care system, professionals, patients and society: These results facilitate decision making for the clinical use of new generations of imaging procedures to detect ischemia. They show to what extent the cost-effectiveness of diagnosing CAD depends on the prevalence of the disease.
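The cost-effectiveness ratio defined above is the expected cost divided by the probability of a correct diagnosis, which rises with pretest likelihood when sensitivity exceeds specificity. A sketch with hypothetical sensitivities, specificities and per-strategy costs (none of these numbers are the study's inputs):

```python
def cost_per_correct(prevalence, sens, spec, cost, fn_penalty=0.0):
    """Cost-effectiveness ratio: expected cost per patient correctly
    diagnosed, for a diagnostic strategy with the given sensitivity and
    specificity. fn_penalty optionally adds a cost for missed disease."""
    p_correct = prevalence * sens + (1 - prevalence) * spec
    expected_cost = cost + prevalence * (1 - sens) * fn_penalty
    return expected_cost / p_correct

# Illustrative comparison at rising pretest likelihoods of CAD
# (hypothetical test characteristics and costs):
for p in (0.2, 0.5, 0.8):
    cmr_cxa = cost_per_correct(p, sens=0.86, spec=0.83, cost=1500)
    cxa_ffr = cost_per_correct(p, sens=0.94, spec=0.90, cost=2600)
    print(p, round(cmr_cxa), round(cxa_ffr))
```

With such inputs the ratio falls as the pretest likelihood rises, reproducing the qualitative shape (though not the numbers) of the study's result.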

Relevance: 30.00%

Abstract:

The vast territories that were radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure, leading to informative contamination maps. Exact measurements ('hard' data) are combined with secondary information on local uncertainties (treated as 'soft' data) to generate science-based uncertainty assessment of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space, whereas no assumption about the underlying distribution is made and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data.

Finally, a separate uncertainty analysis was conducted to evaluate the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available in certain populated sites. The analysis illustrates the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well in terms of both estimation accuracy and estimation error assessment, which are both useful features for the Chernobyl fallout study.

Relevance: 30.00%

Abstract:

It is common in econometric applications that several hypothesis tests are carried out at the same time. The problem then becomes how to decide which hypotheses to reject, accounting for the multitude of tests. In this paper, we suggest a stepwise multiple testing procedure which asymptotically controls the familywise error rate at a desired level. Compared to related single-step methods, our procedure is more powerful in the sense that it often will reject more false hypotheses. In addition, we advocate the use of studentization when it is feasible. Unlike some stepwise methods, our method implicitly captures the joint dependence structure of the test statistics, which results in increased ability to detect alternative hypotheses. We prove that our method asymptotically controls the familywise error rate under minimal assumptions. We present our methodology in the context of comparing several strategies to a common benchmark and deciding which strategies actually beat the benchmark. However, our ideas can easily be extended and/or modified to other contexts, such as making inference for the individual regression coefficients in a multiple regression framework. Some simulation studies show the improvements of our methods over previous proposals. We also provide an application to a set of real data.
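The step-down idea can be illustrated with Holm's classical procedure, which also controls the familywise error rate; unlike the paper's resampling-based method, it ignores the joint dependence structure of the test statistics, so it serves only as a baseline sketch. The p-values below are made up:

```python
def holm_stepdown(pvalues, alpha=0.05):
    """Holm's step-down procedure: controls the familywise error rate at
    level alpha under arbitrary dependence. Returns a reject flag per
    hypothesis, in the original order."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])  # ascending p-values
    reject = [False] * m
    for step, i in enumerate(order):
        # Step k compares the k-th smallest p-value against alpha/(m - k).
        if pvalues[i] <= alpha / (m - step):
            reject[i] = True
        else:
            break  # once one hypothesis survives, all later ones survive too
    return reject

pvals = [0.001, 0.04, 0.012, 0.20]
print(holm_stepdown(pvals))  # [True, False, True, False]
```

Note that 0.012 is rejected here even though a single-step Bonferroni bound (alpha/m = 0.0125 at every step) would barely admit it; that gain in power from sequentially relaxing the threshold is exactly the step-down advantage the abstract refers to.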

Relevance: 30.00%

Abstract:

This study aims to identify the constraints that complicate the assessment of speaking in Capeverdean EFL classrooms. A literature review was conducted of studies already done in the field of testing speaking in Capeverdean EFL classrooms. The study was carried out using a qualitative method with some Capeverdean secondary school English teachers. The participants answered a questionnaire that asked for teachers' opinions and experiences of speaking assessment. The study found that Capeverdean English teachers do not adequately assess their students' speaking ability, and it identified the constraints of the Capeverdean context that complicate the assessment of speaking in EFL classrooms. The teachers reported the main constraints, in order of significance, as large classes, difficulty in marking oral tests, difficulty in designing oral tests, and difficulty in separating the speaking skill from the listening skill. The study concluded that Capeverdean English teachers need assistance with new tools to assess speaking in their classrooms. Thus, the author makes some suggestions, first to the Ministry of Education and then to English teachers in the field, to assist with the implementation of regular oral testing in Capeverdean English classrooms.

Relevance: 30.00%

Abstract:

Miniature diffusion size classifiers (miniDiSC) are novel handheld devices to measure ultrafine particles (UFP). UFP have been linked to the development of cardiovascular and pulmonary diseases; thus, detection and quantification of these particles are important for evaluating their potential health hazards. As part of the UFP exposure assessments of highway maintenance workers in western Switzerland, we compared a miniDiSC with a portable condensation particle counter (P-TRAK). In addition, we performed stationary measurements with a miniDiSC and a scanning mobility particle sizer (SMPS) at a site immediately adjacent to a highway. Measurements with the miniDiSC and the P-TRAK correlated well (r = 0.84), but average particle numbers from the miniDiSC were 30%-60% higher. This difference was significantly increased for mean particle diameters below 40 nm. The correlation between the miniDiSC and the SMPS during stationary measurements was very high (r = 0.98), although particle numbers from the miniDiSC were 30% lower. Differences between the three devices were attributed to the different cutoff diameters for detection. Correction for this size-dependent effect led to very similar results across all counters. We did not observe any significant influence of other particle characteristics. Our results suggest that the miniDiSC provides accurate particle number concentrations and geometric mean diameters at traffic-influenced sites, making it a useful tool for personal exposure assessment in such settings.

Relevance: 30.00%

Abstract:

The aim of the present study was to investigate the relative importance of flooding- and confinement-related environmental features in explaining macroinvertebrate trait structure and diversity in a pool of wetlands located in a Mediterranean river floodplain. To test hypothesized trait-environment relationships, we employed a recently implemented statistical procedure, the fourth-corner method. We found that flooding-related variables, mainly pH and turbidity, were related to traits that confer an ability of the organism to resist flooding (e.g., small body shape, protection of eggs) or to recuperate faster after flooding (e.g., short life span, asexual reproduction). In contrast, confinement-related variables, mainly temperature and organic matter, enhanced traits that allow organisms to interact and compete with other organisms (e.g., large size, sexual reproduction) and to efficiently use habitat and resources (e.g., diverse locomotion and feeding strategies). These results are in agreement with predictions made under the River Habitat Templet for lotic ecosystems, and demonstrate the ability of the fourth-corner method to test hypotheses that posit trait-environment relationships. Trait diversity was slightly higher in flooded than in confined sites, whereas trait richness was not significantly different. This suggests that although trait structure may change in response to the main environmental factors, as evidenced by the fourth-corner method, the number of life-history strategies needed to persist in the face of such constraints remains more or less constant; only their relative dominance differs.