154 results for user testing

Relevance: 20.00%

Abstract:
There is a need for more efficient methods giving insight into the complex mechanisms of neurotoxicity. Testing strategies including in vitro methods have been proposed to meet this requirement. With the present study we aimed to develop a novel in vitro approach which mimics in vivo complexity, detects neurotoxicity comprehensively, and provides mechanistic insight. For this purpose we combined rat primary re-aggregating brain cell cultures with a mass spectrometry (MS)-based metabolomics approach. As a proof of principle, we treated developing re-aggregating brain cell cultures for 48 h with the neurotoxicant methyl mercury chloride (0.1-100 μM) and the brain stimulant caffeine (1-100 μM) and acquired cellular metabolic profiles. To detect toxicant-induced metabolic alterations, the profiles were analysed using commercial software which revealed patterns in the multi-parametric dataset by principal component analysis (PCA) and recognised the most significantly altered metabolites. PCA revealed concentration-dependent cluster formation for methyl mercury chloride (0.1-1 μM) and treatment-dependent cluster formation for caffeine (1-100 μM) at sub-cytotoxic concentrations. Five relevant metabolites responsible for the concentration-dependent alterations following methyl mercury chloride treatment could be identified using MS/MS fragmentation analysis: gamma-aminobutyric acid, choline, glutamine, creatine and spermine. Their respective mass ion intensities demonstrated metabolic alterations in line with the literature and suggest that these metabolites could be biomarkers for mechanisms of neurotoxicity or neuroprotection. In addition, we evaluated whether the approach could identify neurotoxic potential by testing eight compounds which have target organ toxicity in the liver, kidney or brain at sub-cytotoxic concentrations.
PCA revealed cluster formation largely dependent on target organ toxicity, indicating potential for the development of a neurotoxicity prediction model. Given these results, a validation study would be useful to determine the reliability, relevance and applicability of this approach to neurotoxicity screening. Thus, for the first time we show the benefits and utility of in vitro metabolomics to comprehensively detect neurotoxicity and to discover new biomarkers.
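The clustering step described above can be illustrated with a toy sketch. The function below is not the commercial software used in the study: it is a minimal pure-Python PCA restricted to two hypothetical metabolite features, returning each sample's score on the first principal component, the kind of projection in which concentration-dependent clusters become visible.

```python
import math

def pca_first_component(data):
    """Scores of 2-feature samples on their first principal component.

    `data` is a list of (feature1, feature2) pairs; a toy stand-in for
    a metabolite profile matrix. Pure Python, illustration only.
    """
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    # Sample covariance matrix entries of the centered data.
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    # Larger eigenvalue of [[sxx, sxy], [sxy, syy]] via the quadratic formula.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)
    # Corresponding unit eigenvector (assumes sxy != 0, i.e. correlated features).
    vx, vy = lam - syy, sxy
    norm = math.hypot(vx, vy)
    vx, vy = vx / norm, vy / norm
    # Score = projection of each centered sample onto the component.
    return [(x - mx) * vx + (y - my) * vy for x, y in data]
```

On made-up profiles forming two groups (e.g. control vs treated), the scores of the two groups separate cleanly along the first component, which is what the cluster formations in the PCA plots reflect.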

Relevance: 20.00%

Abstract:

In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators select a threshold p value below which they reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine a critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that are often misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true, rather than the probability of obtaining the observed difference, or one more extreme, given that the null is true. Another concern is the risk that a substantial proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
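The two frameworks contrasted above can be made concrete in a few lines. The sketch below is an illustrative one-sample z-test with known sigma (not an example from the article): it computes a Fisher-style two-sided p value from the test statistic, then applies a Neyman-Pearson style decision at a chosen Type I error level alpha.

```python
import math

def p_value_two_sided(z: float) -> float:
    """Two-sided p value for a standard-normal test statistic z.

    P(|Z| >= |z|) under the null, using the normal CDF via math.erf.
    """
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def z_test(sample_mean: float, mu0: float, sigma: float, n: int,
           alpha: float = 0.05):
    """One-sample z-test: returns (p value, reject-H0 decision at level alpha)."""
    z = (sample_mean - mu0) / (sigma / math.sqrt(n))
    p = p_value_two_sided(z)
    # Fisher: report p as strength of evidence; Neyman-Pearson: binary decision.
    return p, p < alpha
```

For example, a sample mean of 103 against mu0 = 100 (sigma = 15, n = 100) gives z = 2.0 and p ≈ 0.046, so the null is rejected at alpha = 0.05; note that the p value itself is the evidence measure, while the reject/accept decision belongs to the hypothesis-testing framework.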

Relevance: 20.00%

Abstract:

The determination of line crossing sequences between rollerball pens and laser printers presents difficulties that may not be overcome using traditional techniques. This research aimed to study the potential of digital microscopy and 3-D laser profilometry to determine line crossing sequences between a toner and an aqueous ink line. Different paper types, rollerball pens, and writing pressures were tested. Correct opinions of the sequence were given for all case scenarios using both techniques. When the toner was printed before the ink, a light reflection was observed in all crossing specimens, while this was never observed in the other sequence types. 3-D laser profilometry, though more time-consuming, presented the main advantage of providing quantitative results. The findings confirm the potential of 3-D laser profilometry and demonstrate the efficiency of digital microscopy as a new technique for determining the sequence of line crossings involving rollerball pen ink and toner. With the mass marketing of laser printers and the popularity of rollerball pens, the determination of line crossing sequences between such instruments is encountered by forensic document examiners. This type of crossing presents difficulties for the optical microscopy techniques developed for crossings involving ballpoint or gel pens and toner (1-4). Indeed, the rollerball's aqueous ink penetrates through the toner and is absorbed by the fibers of the paper, leaving the examiner with the impression that the toner is above the ink even when it is not (5). Novotny and Westwood (3) investigated the possibility of determining aqueous ink and toner crossing sequences by microscopic observation of the intersection before and after toner removal. A major disadvantage of their study resides in the destruction of the sample by scraping off the toner line to see what was underneath.
The aim of this research was to investigate ways to overcome these difficulties through digital microscopy and three-dimensional (3-D) laser profilometry. The former has been used as a technique for determining sequences between gel pen and toner printing strokes, but provided less conclusive results than an optical stereomicroscope (4). 3-D laser profilometry, which allows one to observe and measure the topography of a surface, has been the subject of a number of recent studies in this area. Berx and De Kinder (6) and Schirripa Spagnolo (7,8) have tested the application of laser profilometry to determine the sequence of intersections of several lines. The results obtained in these studies overcome disadvantages of other methods applied in this area, such as scanning electron microscopy or atomic force microscopy. The main advantages of 3-D laser profilometry include the ease of implementation of the technique and its nondestructive nature, which requires no sample preparation (8-10). Moreover, the technique is reproducible and presents a high degree of freedom along the vertical axis (up to 1000 μm). However, when the paper surface presents a given roughness, and the pen impressions alter the paper with a depth similar to the roughness of the medium, the results are not always conclusive (8). It becomes difficult in this case to distinguish which characteristics can be imputed to the pen impressions and which to the quality of the paper surface. This important limitation is assessed by testing different types of paper of variable quality (of different grammage and finishing) and different writing pressures. The authors therefore assess the limits of the 3-D laser profilometry technique and determine whether the method can overcome such constraints. Second, the authors investigate the use of digital microscopy because it presents a number of advantages: it is efficient, user-friendly, and provides an objective evaluation and interpretation.

Relevance: 20.00%

Abstract:

Histological subtyping and grading by malignancy are the cornerstones of the World Health Organization (WHO) classification of tumors of the central nervous system. They are intended to provide clinicians with guidance as to the course of disease to be expected and the choices of treatment to be made. Nonetheless, patients with histologically identical tumors may have very different outcomes, notably among patients with astrocytic and oligodendroglial gliomas of WHO grades II and III. In gliomas of adulthood, 3 molecular markers have undergone extensive study in recent years: 1p/19q chromosomal codeletion, O(6)-methylguanine methyltransferase (MGMT) promoter methylation, and mutations of isocitrate dehydrogenase (IDH) 1 and 2. However, the assessment of these molecular markers has so far not been implemented in clinical routine because of the lack of therapeutic implications. In fact, these markers were considered to be prognostic irrespective of whether patients were receiving radiotherapy (RT), chemotherapy, or both (1p/19q, IDH1/2), or of limited value because testing is too complex and no chemotherapy alternative to temozolomide was available (MGMT). In 2012, this situation changed: long-term follow-up of the Radiation Therapy Oncology Group 9402 and European Organisation for Research and Treatment of Cancer 26951 trials demonstrated an overall survival benefit from adding procarbazine/CCNU/vincristine chemotherapy to RT, confined to patients with anaplastic oligodendroglial tumors with (vs without) 1p/19q codeletion. Furthermore, in elderly glioblastoma patients, the NOA-08 and Nordic trials of RT alone versus temozolomide alone demonstrated a profound impact of MGMT promoter methylation on outcome by therapy and thus established MGMT as a predictive biomarker in this patient population.
These recent results call for the routine implementation of 1p/19q and MGMT testing at least in subpopulations of malignant glioma patients and represent an encouraging step toward the development of personalized therapeutic approaches in neuro-oncology.

Relevance: 20.00%

Abstract:

OBJECTIVES: To obtain information about the prevalence of, reasons for, and adequacy of HIV testing in the general population in Switzerland in 1992. DESIGN: Telephone survey (n = 2800). RESULTS: Some 47% of the sample had undergone at least one HIV test, performed through blood donation (24%), voluntary testing (17%) or both (6%). Of the sample, 46% considered themselves well or very well informed about the HIV test. Respondents reported unsystematic pre-test screening by doctors for the main HIV risks. People who had been in situations of potential exposure to risk were more likely to have had the test than others. Overall, 85% of those HIV-tested had a relevant, generally risk-related reason for having it performed. CONCLUSIONS: HIV testing is widespread in Switzerland. Testing is mostly performed for relevant reasons. Pre-test counselling is poor, and an opportunity for prevention is thus lost.

Relevance: 20.00%

Abstract:

OBJECTIVE: To explore the user-friendliness and ergonomics of seven new generation intensive care ventilators. DESIGN: Prospective task-performing study. SETTING: Intensive care research laboratory, university hospital. METHODS: Ten physicians experienced in mechanical ventilation, but without prior knowledge of the ventilators, were asked to perform eight specific tasks [turning the ventilator on; recognizing mode and parameters; recognizing and setting alarms; mode change; finding and activating the pre-oxygenation function; pressure support setting; stand-by; finding and activating non-invasive ventilation (NIV) mode]. The time needed for each task was compared to a reference time (obtained by a trained physiotherapist familiar with the devices). A time >180 s was considered a task failure. RESULTS: For each of the tests on the ventilators, all physicians' times were significantly higher than the reference time (P < 0.001). A mean of 13 +/- 8 task failures (16%) was observed per ventilator. The most frequently failed tasks were mode and parameter recognition, starting pressure support, and finding the NIV mode. The least often failed tasks were turning on the pre-oxygenation function and alarm recognition and management. Overall, there was substantial heterogeneity between machines, some exhibiting better user-friendliness than others for certain tasks, but no ventilator was clearly better than the others on all points tested. CONCLUSIONS: The present study adds to the available literature outlining the ergonomic shortcomings of mechanical ventilators. These results suggest that closer ties between end-users and manufacturers should be promoted, at an early development phase of these machines, based on the scientific evaluation of the cognitive processes engaged by users in the clinical setting.

Relevance: 20.00%

Abstract:

When researchers introduce a new test, they have to demonstrate that it is valid, using unbiased designs and suitable statistical procedures. In this article we use Monte Carlo analyses to highlight how incorrect statistical procedures (i.e., stepwise regression, extreme scores analyses) or ignoring regression assumptions (e.g., heteroscedasticity) contribute to wrong validity estimates. Beyond these demonstrations, and as an example, we re-examined the results reported by Warwick, Nettelbeck, and Ward (2010) concerning the validity of the Ability Emotional Intelligence Measure (AEIM). Warwick et al. used the wrong statistical procedures to conclude that the AEIM was incrementally valid beyond intelligence and personality traits in predicting various outcomes. In our re-analysis, we found that the reliability-corrected multiple correlation of their measures with personality and intelligence was up to .69. Using robust statistical procedures and appropriate controls, we also found that the AEIM did not predict incremental variance in GPA, stress, loneliness, or well-being, demonstrating the importance of testing validity rather than merely looking for it.
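The kind of Monte Carlo demonstration mentioned above can be sketched as follows (an illustrative simulation, not the authors' code): when a predictor is selected post hoc as the best of several pure-noise candidates, its in-sample correlation with the outcome is systematically inflated, even though the true validity is exactly zero.

```python
import random
import statistics

def corr(x, y):
    """Pearson correlation of two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def best_noise_validity(n=50, k=10, runs=200, seed=0):
    """Mean |r| of the best of k pure-noise predictors across simulated studies.

    Every predictor is independent noise, so the true validity is zero;
    any nonzero result is pure selection artifact.
    """
    rng = random.Random(seed)
    best_rs = []
    for _ in range(runs):
        y = [rng.gauss(0, 1) for _ in range(n)]  # outcome: pure noise
        best = max(abs(corr([rng.gauss(0, 1) for _ in range(n)], y))
                   for _ in range(k))
        best_rs.append(best)
    return statistics.fmean(best_rs)
```

With n = 50 cases and k = 10 candidate predictors, the selected "best" predictor typically shows an in-sample |r| in the 0.2-0.3 range, an entirely artifactual validity of the sort that robust procedures and appropriate controls are designed to expose.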

Relevance: 20.00%

Abstract:

Novel therapeutic agents targeting the epidermal growth factor receptor (EGFR) have improved outcomes for patients with colorectal carcinoma. However, these therapies are effective only in a subset of patients. Activating mutations in the KRAS gene are found in 30-40% of colorectal tumors and are associated with poor response to anti-EGFR therapies. Thus, KRAS mutation status can predict which patients may or may not benefit from anti-EGFR therapy. Although many diagnostic tools have been developed for KRAS mutation analysis, validated methods and standardized testing procedures are lacking. This poses a challenge for the optimal use of anti-EGFR therapies in the management of colorectal carcinoma. Here we review the molecular basis of EGFR-targeted therapies and the resistance to treatment conferred by KRAS mutations. We also present guideline recommendations and a proposal for a European quality assurance program to help ensure accuracy and proficiency in KRAS mutation testing across the European Union.

Relevance: 20.00%

Abstract:

Ga3+ is a semimetallic element that competes for the iron-binding sites of transporters and enzymes. We investigated the activity of gallium maltolate (GaM), an organic gallium salt with high solubility, against laboratory and clinical strains of methicillin-susceptible Staphylococcus aureus (MSSA), methicillin-resistant S. aureus (MRSA), methicillin-susceptible Staphylococcus epidermidis (MSSE), and methicillin-resistant S. epidermidis (MRSE) in logarithmic or stationary phase and in biofilms. The MICs of GaM were higher for S. aureus (375 to 2,000 μg/ml) than for S. epidermidis (94 to 200 μg/ml). Minimal biofilm inhibitory concentrations were 3,000 to ≥6,000 μg/ml (S. aureus) and 94 to 3,000 μg/ml (S. epidermidis). In time-kill studies, GaM exhibited slow, dose-dependent killing, with maximal action at 24 h of 1.9 log10 CFU/ml (MSSA) and 3.3 log10 CFU/ml (MRSA) at 3× MIC against S. aureus, and of 2.9 log10 CFU/ml (MSSE) and 4.0 log10 CFU/ml (MRSE) at 10× MIC against S. epidermidis. In calorimetric studies, growth-related heat production was inhibited by GaM at subinhibitory concentrations; the minimal heat inhibition concentrations were 188 to 4,500 μg/ml (MSSA), 94 to 1,500 μg/ml (MRSA), and 94 to 375 μg/ml (MSSE and MRSE), which correlated well with the MICs. Thus, calorimetry was a fast, accurate, and simple method useful for investigating antimicrobial activity at subinhibitory concentrations. In conclusion, GaM exhibited activity against staphylococci in different growth phases, including stationary phase and biofilms, but high concentrations were required. These data support the potential topical use of GaM, including for the treatment of wound infections, MRSA decolonization, and the coating of implants.

Relevance: 20.00%

Abstract:

Despite the increase of animal and plant introductions worldwide and the strong growth of the reptile trade, few invasive snake populations have been studied. Dice snakes (Natrix tessellata) were introduced to the shores of Lake Geneva (Switzerland) in the early 1920s and are now well established. The region of introduction was previously inhabited by Viperine snakes (N. maura). Since monitoring of the two species began in 1996, the Viperine snake population has shown a drastic decline. We examine here the possibility of trophic competition by analysing diet composition, prey size, and trophic niche overlap. Spatial distribution is also assessed in order to address the question of spatial competitive exclusion. We found very similar diets, and thus a high trophic niche overlap, indicating no partitioning of the trophic resource. No arguments in favour of spatial competitive exclusion were found. Our study suggests that trophic competition may occur between the two natricines and may explain the drastic decline of the Viperine snake in this area. Other pathways potentially playing a role in the exclusion of the Viperine snake are discussed.
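Trophic niche overlap of the kind measured here is commonly quantified with Pianka's symmetric index; the abstract does not specify which index the authors used, so the sketch below is an illustrative choice. The index is 1 for identical diet proportions and 0 for fully disjoint diets.

```python
def pianka_overlap(p, q):
    """Pianka's niche overlap index for two diet-proportion vectors.

    p and q hold the proportion of each prey category in the diets of
    two species (same category order); returns a value in [0, 1].
    """
    num = sum(pi * qi for pi, qi in zip(p, q))
    den = (sum(pi * pi for pi in p) * sum(qi * qi for qi in q)) ** 0.5
    return num / den
```

A "very similar diet" like that reported for the two natricines would yield a value close to 1, which is what "high trophic niche overlap, indicating no partitioning of the trophic resource" expresses quantitatively.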

Relevance: 20.00%

Abstract:

BACKGROUND: Knowledge of normal heart weight ranges is important information for pathologists. Comparing the measured heart weight to reference values is one of the key elements used to determine whether a heart is pathological, as heart weight increases in many cardiac pathologies. The current reference tables are old and in need of an update. AIMS: The purposes of this study are to establish new reference tables for normal heart weights in the local population and to determine the best predictive factor for normal heart weight. We also aim to provide a technical tool for calculating the predicted normal heart weight. METHODS: The reference values are based on a retrospective analysis of adult Caucasian autopsy cases without any obvious pathology collected at the University Centre of Legal Medicine in Lausanne from 2007 to 2011. We selected 288 cases. The mean age was 39.2 years. There were 118 men and 170 women. Regression analyses were performed to assess the relationship of heart weight to body weight, body height, body mass index (BMI) and body surface area (BSA). RESULTS: Heart weight increased along with an increase in all the parameters studied. The mean heart weight was greater in men than in women at a similar body weight. BSA was determined to be the best predictor of normal heart weight. New reference tables for predicted heart weights are presented as a web application that enables comparison of heart weights observed at autopsy with the reference values. CONCLUSIONS: Reference tables for the weight of the heart and other organs should be systematically updated and adapted to the local population. Web access and smartphone applications for the predicted heart weight represent important investigational tools.
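Since BSA was the best predictor, a calculator like the web application described would start from a BSA formula. The sketch below uses the standard Du Bois formula for BSA; the intercept and slope in `predicted_heart_weight` are purely illustrative placeholders, not the regression coefficients from this study (which the abstract does not report).

```python
def bsa_du_bois(weight_kg: float, height_cm: float) -> float:
    """Body surface area (m^2) by the Du Bois formula."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def predicted_heart_weight(weight_kg: float, height_cm: float,
                           intercept: float = -50.0,
                           slope: float = 200.0) -> float:
    """Illustrative linear prediction of heart weight (g) from BSA.

    The default intercept and slope are placeholder values for
    demonstration only; real use requires the study's fitted,
    sex-specific coefficients.
    """
    return intercept + slope * bsa_du_bois(weight_kg, height_cm)
```

For a 70 kg, 170 cm adult, Du Bois BSA comes out near 1.8 m^2; an autopsy heart weight would then be compared against the reference value the fitted model predicts for that BSA.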