245 results for Integration testing
Abstract:
BACKGROUND: The in vivo transfer of naked plasmid DNA into organs such as muscles is commonly used to assess the expression of prophylactic or therapeutic genes in animal disease models. RESULTS: In this study, we devised vectors allowing tight regulation of transgene expression in mice from such non-viral vectors, using a doxycycline-controlled network of activator and repressor proteins. Using these vectors, we demonstrate a proper physiological response as a consequence of the induced expression of two therapeutically relevant proteins, namely erythropoietin and utrophin. Kinetic studies showed that the induction of transgene expression was only transient, unless epigenetic regulatory elements termed Matrix Attachment Regions, or MARs, were inserted upstream of the regulated promoters. Using episomal plasmid rescue and quantitative PCR assays, we observed that similar amounts of plasmid remained in muscles after electrotransfer with or without MAR elements, but that a significant portion had integrated into the muscle fiber chromosomes. Interestingly, the MAR elements were found to promote plasmid genomic integration but to oppose silencing effects in vivo, thereby mediating long-term expression. CONCLUSIONS: This study thus elucidates some of the determinants of transient versus sustained expression when using non-viral regulated vectors in vivo.
Abstract:
Humans experience the self as localized within their body. This aspect of bodily self-consciousness can be experimentally manipulated by exposing individuals to conflicting multisensory input, or can be abnormal following focal brain injury. Recent technological developments have helped to unravel some of the mechanisms underlying multisensory integration and self-location, but the neural underpinnings are still under investigation, and the manual application of stimuli has resulted in large variability that is difficult to control. This paper presents the development and evaluation of an MR-compatible stroking device capable of presenting moving tactile stimuli to both legs and the back of participants lying on a scanner bed while functional neuroimaging data are acquired. The platform consists of four independent stroking devices with a travel of 16-20 cm and a maximum stroking velocity of 15 cm/s, driven by non-magnetic ultrasonic motors. Complemented with virtual reality, this setup provides a unique research platform for investigating multisensory integration and its effects on self-location under well-controlled experimental conditions. The MR compatibility of the system was evaluated in both a 3 Tesla and a 7 Tesla scanner and showed negligible interference with brain imaging. In a preliminary study using a prototype device with only one tactile stimulator, fMRI data acquired from 12 healthy participants showed visuo-tactile synchrony-related and body-specific modulations of brain activity in the bilateral temporoparietal cortex.
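As an illustration of the reported actuation limits, the sketch below generates a constant-velocity back-and-forth position profile for a single stroking unit. It is a hypothetical example, not the device's control software: the 16-20 cm travel and 15 cm/s maximum velocity come from the abstract, while the function name, triangular profile, and sampling rate are assumptions made for illustration.

```python
# Hypothetical sketch (not the device's actual control software): a constant-speed
# back-and-forth stroking trajectory respecting the reported limits of 16-20 cm
# travel and a maximum stroking velocity of 15 cm/s.
import numpy as np

def stroking_profile(travel_cm=16.0, velocity_cm_s=10.0, n_strokes=4, rate_hz=100):
    """Triangular (constant-speed) back-and-forth position profile sampled at rate_hz."""
    assert velocity_cm_s <= 15.0, "exceeds the reported maximum stroking velocity"
    stroke_duration = travel_cm / velocity_cm_s                  # one-way time in s
    t = np.arange(0, 2 * n_strokes * stroke_duration, 1.0 / rate_hz)
    phase = (t / stroke_duration) % 2.0                          # 0..2 per full cycle
    position = np.where(phase < 1.0, phase, 2.0 - phase) * travel_cm
    return t, position

t, pos = stroking_profile()
print(f"{len(t)} samples, peak position {pos.max():.1f} cm")
```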
Abstract:
In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators select a threshold p value below which they reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine a critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are distinct theories that are often misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to treat the p value as the probability that the null hypothesis is true, rather than the probability of obtaining the observed difference, or one more extreme, given that the null is true. Another concern is the risk that a substantial proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
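A minimal simulation can make these definitions concrete: under a true null hypothesis the p value is uniformly distributed, so rejecting whenever p < 0.05 produces false positives at exactly the chosen Type I error rate. The sketch below is an illustration only; the sample sizes and the two-sample t-test are assumptions, not taken from the abstract.

```python
# Minimal illustration: when the null hypothesis is true by construction,
# about alpha = 5% of experiments are still declared "significant" -- the Type I error rate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha = 0.05
n_experiments = 10_000
p_values = []

for _ in range(n_experiments):
    # Two groups drawn from the same distribution: no true effect exists.
    a = rng.normal(loc=0.0, scale=1.0, size=30)
    b = rng.normal(loc=0.0, scale=1.0, size=30)
    _, p = stats.ttest_ind(a, b)
    p_values.append(p)

p_values = np.array(p_values)
print(f"Fraction of significant results under the null: {np.mean(p_values < alpha):.3f}")
# Expected output: roughly 0.05, i.e. the nominal Type I error rate.
```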
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the regional scale represents a major, and as yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The basic objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity, which is available throughout the model space. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. This method is then applied to the stochastic integration of low-resolution, regional-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities. Finally, the overall viability of this downscaling approach is tested and verified by performing and comparing flow and transport simulations through the original and the downscaled hydraulic conductivity fields. Our results indicate that the proposed procedure does indeed allow for remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
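A simplified sketch of the conditional-sampling idea follows: a non-parametric kernel density is fitted to co-located hydraulic and electrical conductivity data, and hydraulic conductivity is then drawn at non-sampled locations conditionally on the locally observed electrical conductivity. The synthetic data, variable names, and grid-based conditioning are assumptions for illustration; the actual algorithm additionally enforces spatial continuity through Bayesian sequential simulation, which is omitted here.

```python
# Simplified sketch (synthetic data, no spatial sequential component): estimate the
# joint density of log hydraulic conductivity (log K) and electrical conductivity
# (sigma) from co-located borehole data, then draw log K conditionally on sigma.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Synthetic "borehole" training data with a noisy petrophysical relationship.
sigma_train = rng.uniform(5.0, 50.0, size=200)                          # mS/m
logK_train = -6.0 + 0.05 * sigma_train + rng.normal(0, 0.3, size=200)   # log10 K (m/s)

# Non-parametric joint density p(sigma, log K).
kde = gaussian_kde(np.vstack([sigma_train, logK_train]))

def sample_logK_given_sigma(sigma_obs, n_samples=1, grid=np.linspace(-8.0, -2.0, 400)):
    """Draw log K from p(log K | sigma) by evaluating the joint KDE on a grid."""
    joint = kde(np.vstack([np.full_like(grid, sigma_obs), grid]))
    conditional = joint / joint.sum()                    # normalised over the grid
    return rng.choice(grid, size=n_samples, p=conditional)

# At a non-sampled location only sigma (e.g. from ERT) is available.
print(sample_logK_given_sigma(sigma_obs=30.0, n_samples=5))
```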
Abstract:
Histological subtyping and grading by malignancy are the cornerstones of the World Health Organization (WHO) classification of tumors of the central nervous system. They are intended to provide clinicians with guidance on the expected course of disease and the treatment choices to be made. Nonetheless, patients with histologically identical tumors may have very different outcomes, notably among those with astrocytic and oligodendroglial gliomas of WHO grades II and III. In gliomas of adulthood, 3 molecular markers have been studied extensively in recent years: 1p/19q chromosomal codeletion, O6-methylguanine methyltransferase (MGMT) promoter methylation, and mutations of isocitrate dehydrogenase (IDH) 1 and 2. However, the assessment of these molecular markers has so far not been implemented in routine clinical practice because of the lack of therapeutic implications. In fact, these markers were considered to be prognostic irrespective of whether patients were receiving radiotherapy (RT), chemotherapy, or both (1p/19q, IDH1/2), or of limited value because testing is too complex and no chemotherapy alternative to temozolomide was available (MGMT). In 2012, this situation changed: long-term follow-up of the Radiation Therapy Oncology Group 9402 and European Organisation for Research and Treatment of Cancer 26951 trials demonstrated an overall survival benefit from adding procarbazine/CCNU/vincristine chemotherapy to RT that was confined to patients with anaplastic oligodendroglial tumors with (vs without) 1p/19q codeletion. Furthermore, in elderly glioblastoma patients, the NOA-08 and Nordic trials of RT alone versus temozolomide alone demonstrated a profound impact of MGMT promoter methylation on outcome by therapy, thus establishing MGMT as a predictive biomarker in this patient population. These recent results call for the routine implementation of 1p/19q and MGMT testing, at least in subpopulations of malignant glioma patients, and represent an encouraging step toward the development of personalized therapeutic approaches in neuro-oncology.
Abstract:
OBJECTIVES: To obtain information about the prevalence of, reasons for, and adequacy of HIV testing in the general population of Switzerland in 1992. DESIGN: Telephone survey (n = 2800). RESULTS: Some 47% of the sample had undergone at least one HIV test, performed through blood donation (24%), voluntary testing (17%), or both (6%). Of the sample, 46% considered themselves well or very well informed about the HIV test. Respondents reported unsystematic pre-test screening by doctors for the main HIV risks. People who had been in situations of potential risk exposure were more likely to have had the test than others. Overall, 85% of those tested for HIV had a relevant, generally risk-related reason for having the test performed. CONCLUSIONS: HIV testing is widespread in Switzerland. Testing is mostly performed for relevant reasons. Pre-test counselling is poor, and an opportunity for prevention is thus lost.
Abstract:
When researchers introduce a new test, they have to demonstrate that it is valid, using unbiased designs and suitable statistical procedures. In this article we use Monte Carlo analyses to highlight how incorrect statistical procedures (i.e., stepwise regression, extreme-scores analyses) or ignoring regression assumptions (e.g., heteroscedasticity) contribute to wrong validity estimates. Beyond these demonstrations, and as an example, we re-examined the results reported by Warwick, Nettelbeck, and Ward (2010) concerning the validity of the Ability Emotional Intelligence Measure (AEIM). Warwick et al. used the wrong statistical procedures to conclude that the AEIM was incrementally valid beyond intelligence and personality traits in predicting various outcomes. In our re-analysis, we found that the reliability-corrected multiple correlation of their measures with personality and intelligence was up to .69. Using robust statistical procedures and appropriate controls, we also found that the AEIM did not predict incremental variance in GPA, stress, loneliness, or well-being, demonstrating the importance of testing validity rather than merely looking for it.
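The core problem with stepwise selection can be shown in a few lines of Monte Carlo code: if predictors are chosen post hoc by their sample correlation with the outcome, even pure noise yields a sizeable apparent R². The sketch below is not the authors' analysis; the sample size, number of predictors, and selection rule are arbitrary assumptions chosen for illustration.

```python
# Minimal Monte Carlo sketch: selecting the "best" predictors from pure noise
# (the logic behind stepwise regression) still produces a non-trivial apparent R^2.
import numpy as np

rng = np.random.default_rng(2)
n, n_predictors, n_selected, n_sims = 100, 20, 3, 1_000
r2_values = []

for _ in range(n_sims):
    X = rng.normal(size=(n, n_predictors))   # predictors unrelated to the outcome
    y = rng.normal(size=n)                   # outcome is pure noise
    # "Stepwise-like" selection: keep the predictors most correlated with y.
    corrs = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_predictors)])
    chosen = np.argsort(corrs)[-n_selected:]
    Xs = np.column_stack([np.ones(n), X[:, chosen]])
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ beta
    r2_values.append(1.0 - resid.var() / y.var())

print(f"Mean apparent R^2 from noise alone: {np.mean(r2_values):.3f}")
```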
Abstract:
Novel therapeutic agents targeting the epidermal growth factor receptor (EGFR) have improved outcomes for patients with colorectal carcinoma. However, these therapies are effective only in a subset of patients. Activating mutations in the KRAS gene are found in 30-40% of colorectal tumors and are associated with poor response to anti-EGFR therapies. Thus, KRAS mutation status can predict which patients may or may not benefit from anti-EGFR therapy. Although many diagnostic tools have been developed for KRAS mutation analysis, validated methods and standardized testing procedures are lacking. This poses a challenge for the optimal use of anti-EGFR therapies in the management of colorectal carcinoma. Here we review the molecular basis of EGFR-targeted therapies and the resistance to treatment conferred by KRAS mutations. We also present guideline recommendations and a proposal for a European quality assurance program to help ensure accuracy and proficiency in KRAS mutation testing across the European Union.
Abstract:
Glioblastomas are the most malignant gliomas, with median survival times of only 15 months despite modern therapies. All standard treatments are palliative. Pathogenetic factors are diverse; hence, stratified treatment plans are warranted that take into account the molecular heterogeneity among these tumors. However, most patients are treated with "one-size-fits-all" standard therapies, many of them with minor response and major toxicities. The integration of clinical and molecular information, now becoming available through new tools such as gene arrays, proteomics, and molecular imaging, will take us into an era where more targeted and effective treatments may be implemented. A first step towards the design of such therapies is the identification of the relevant molecular mechanisms driving the aggressive biological behavior of glioblastoma. The accumulation of diverse aberrations in regulatory processes enables tumor cells to bypass the effects of most classical therapies available. Molecular alterations underlying such mechanisms comprise aberrations at the genetic level, such as point mutations of distinct genes, or amplifications and deletions, while others result from epigenetic modifications such as aberrant methylation of CpG islands in the regulatory sequences of genes. Epigenetic silencing of the MGMT gene, which encodes a DNA repair enzyme, was recently found to be of predictive value in a randomized clinical trial for newly diagnosed glioblastoma testing the addition of the alkylating agent temozolomide to standard radiotherapy. Determination of the methylation status of the MGMT promoter may become the first molecular diagnostic tool to identify the patients most likely to respond, allowing individually tailored therapy in glioblastoma. To date, the test for MGMT methylation status is the only tool available that may direct the choice of alkylating agents in glioblastoma patients, but many others will hopefully become part of an arsenal to stratify patients to respective targeted therapies within the next few years.
Abstract:
Ga3+ is a semimetal ion that competes for the iron-binding sites of transporters and enzymes. We investigated the activity of gallium maltolate (GaM), an organic gallium salt with high solubility, against laboratory and clinical strains of methicillin-susceptible Staphylococcus aureus (MSSA), methicillin-resistant S. aureus (MRSA), methicillin-susceptible Staphylococcus epidermidis (MSSE), and methicillin-resistant S. epidermidis (MRSE) in logarithmic or stationary phase and in biofilms. The MICs of GaM were higher for S. aureus (375 to 2,000 µg/ml) than for S. epidermidis (94 to 200 µg/ml). Minimal biofilm inhibitory concentrations were 3,000 to ≥6,000 µg/ml (S. aureus) and 94 to 3,000 µg/ml (S. epidermidis). In time-kill studies, GaM exhibited slow, dose-dependent killing, with maximal action at 24 h of 1.9 log10 CFU/ml (MSSA) and 3.3 log10 CFU/ml (MRSA) at 3× MIC against S. aureus, and of 2.9 log10 CFU/ml (MSSE) and 4.0 log10 CFU/ml (MRSE) at 10× MIC against S. epidermidis. In calorimetric studies, growth-related heat production was inhibited by GaM at subinhibitory concentrations; the minimal heat inhibition concentrations were 188 to 4,500 µg/ml (MSSA), 94 to 1,500 µg/ml (MRSA), and 94 to 375 µg/ml (MSSE and MRSE), which correlated well with the MICs. Thus, calorimetry was a fast, accurate, and simple method for investigating antimicrobial activity at subinhibitory concentrations. In conclusion, GaM exhibited activity against staphylococci in different growth phases, including stationary phase and biofilms, but high concentrations were required. These data support the potential topical use of GaM, including for the treatment of wound infections, MRSA decolonization, and the coating of implants.
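For readers less used to log-scale killing, the short worked example below converts the reported log10 CFU/ml reductions into fold reductions; the starting inoculum is an assumed value used purely for illustration and is not taken from the study.

```python
# Worked arithmetic (illustration only): a killing of 1.9 log10 CFU/ml means the
# viable count drops by a factor of 10**1.9, e.g. from 1e6 to about 1.3e4 CFU/ml.
log_reductions = {"MSSA (3x MIC)": 1.9, "MRSA (3x MIC)": 3.3,
                  "MSSE (10x MIC)": 2.9, "MRSE (10x MIC)": 4.0}
start_cfu = 1e6  # assumed starting inoculum, for illustration only

for strain, dlog in log_reductions.items():
    fold = 10 ** dlog
    remaining = start_cfu / fold
    print(f"{strain}: {fold:,.0f}-fold reduction -> about {remaining:,.0f} CFU/ml remaining")
```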