143 results for Statistical tools
Abstract:
The application of DNA-based markers to the task of discriminating among alternative salmon runs has evolved in step with ongoing genomic developments and has increasingly enabled resolution of which genetic markers are associated with important life-history differences. Accurate and efficient identification of the most likely origin of salmon encountered during ocean fisheries, or at salvage from freshwater diversion and monitoring facilities, has far-reaching consequences for improving management, restoration and conservation measures. Near-real-time provision of high-resolution identity information enables prompt responses to changes in encounter rates. We therefore continue to develop new tools to provide the greatest statistical power for run identification. As a proof of concept for improvements in genetic identification, we conducted simulation and blind tests on 623 known-origin Chinook salmon (Oncorhynchus tshawytscha) to compare and contrast the accuracy of different population sampling baselines and microsatellite loci panels. The test included 35 microsatellite loci (1266 alleles), some known to be associated with specific coding regions of functional significance, such as the circadian-rhythm cryptochrome genes, and others not known to be associated with any functional importance. Fall-run fish were identified with unprecedented accuracy. Overall, the top-performing panel and baseline (HMSC21) were predicted to have a success rate of 98%, whereas the blind-test success rate was 84%. Findings regarding bias, or the absence of bias, are discussed to target primary areas for further research and resolution.
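As a rough illustration of how a genotype can be assigned to the most likely run from a population baseline, the Python sketch below scores a single hypothetical microsatellite genotype against two baselines under Hardy-Weinberg assumptions. The locus name "Ots_cry1", the allele frequencies and the genotype are invented for illustration; they are not the HMSC21 panel or baseline used in the study.

```python
# Minimal sketch of likelihood-based run assignment from microsatellite
# genotypes, assuming Hardy-Weinberg proportions within each baseline
# population. All names and numbers below are hypothetical.

def genotype_likelihood(genotype, freqs):
    """P(genotype | population) for one locus under Hardy-Weinberg."""
    a, b = genotype
    pa, pb = freqs.get(a, 1e-6), freqs.get(b, 1e-6)
    return pa * pb * (2 if a != b else 1)

def assign(individual, baselines):
    """Return the run whose baseline gives the highest product likelihood."""
    scores = {}
    for run, loci_freqs in baselines.items():
        like = 1.0
        for locus, genotype in individual.items():
            like *= genotype_likelihood(genotype, loci_freqs[locus])
        scores[run] = like
    return max(scores, key=scores.get), scores

# Toy example: two runs, one locus with three alleles.
baselines = {
    "fall":   {"Ots_cry1": {120: 0.70, 124: 0.20, 128: 0.10}},
    "spring": {"Ots_cry1": {120: 0.10, 124: 0.30, 128: 0.60}},
}
fish = {"Ots_cry1": (120, 124)}
print(assign(fish, baselines))   # -> ('fall', {...})
```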
Abstract:
Boundaries for delta, representing a "quantitatively significant" or "substantively impressive" distinction, have not been established in the way that the boundary for alpha, usually set at 0.05, has been for the stochastic or probabilistic component of "statistical significance". To determine what boundaries are being used for these "quantitative" decisions, we reviewed pertinent articles in three general medical journals. For each contrast of two means, contrast of two rates, or correlation coefficient, we noted the investigators' decisions about stochastic significance, stated in P values or confidence intervals, and about quantitative significance, indicated by interpretive comments. The boundaries between impressive and unimpressive distinctions were best formed by a ratio of greater than or equal to 1.2 for the larger to the smaller mean in 546 comparisons of two means; by a standardized increment of greater than or equal to 0.28 and an odds ratio of greater than or equal to 2.2 in 392 comparisons of two rates; and by an r value of greater than or equal to 0.32 in 154 correlation coefficients. Additional boundaries were also identified for "substantially" and "highly" significant quantitative distinctions. Although the proposed boundaries should be kept flexible, indexes and boundaries for decisions about "quantitative significance" are particularly useful when a value of delta must be chosen to calculate sample size before the research is done, and when the "statistical significance" of completed research is appraised for its quantitative as well as its stochastic components.
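The boundary values quoted above lend themselves to a small worked example. The Python sketch below computes the indexes mentioned (ratio of means, standardized increment of two rates, odds ratio) for invented numbers and compares them with the reported boundaries. The standardized-increment formula shown (rate difference over the standard deviation of the pooled rate) is one common formulation and is assumed here rather than taken from the paper.

```python
import math

# Contrast of two means: ratio of the larger to the smaller mean.
m_large, m_small = 14.4, 12.0                # hypothetical group means
print("ratio of means:", round(m_large / m_small, 2),
      ">= 1.2?", m_large / m_small >= 1.2)

def standardized_increment(p1, p2):
    """Rate difference divided by the SD of the pooled rate (one common formulation)."""
    p = (p1 + p2) / 2
    return abs(p1 - p2) / math.sqrt(p * (1 - p))

def odds_ratio(p1, p2):
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Contrast of two rates: 30% vs 18%, hypothetical values.
p1, p2 = 0.30, 0.18
print("standardized increment:", round(standardized_increment(p1, p2), 2),
      ">= 0.28?", standardized_increment(p1, p2) >= 0.28)
print("odds ratio:", round(odds_ratio(p1, p2), 2),
      ">= 2.2?", odds_ratio(p1, p2) >= 2.2)
```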
Abstract:
Laser desorption ionisation mass spectrometry (LDI-MS) has been shown to be an excellent analytical method for the forensic analysis of inks on a questioned document. The ink can be analysed directly on its substrate (paper), offering a fast method of analysis: sample preparation is kept to a minimum and, more importantly, damage to the document is minimised. LDI-MS has also previously been reported to provide a high power of discrimination in the statistical comparison of ink samples and has the potential to be introduced as part of routine ink analysis. This paper examines the methodology further and statistically evaluates the reproducibility of black gel pen ink LDI-MS spectra and the influence of paper on them, by comparing spectra of three different black gel pen inks on three different paper substrates. Although generally minimal, the influences of sample homogeneity and paper type were found to be sample dependent. This should be taken into account to avoid the risk of false differentiation of black gel pen ink samples. Other statistical approaches, such as principal component analysis (PCA), proved to be a good alternative to correlation coefficients for the comparison of whole mass spectra.
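As a rough sketch of the two comparison approaches mentioned (correlation of whole spectra and PCA), the Python below applies both to synthetic stand-in spectra rather than real LDI-MS data; only the 3 inks x 3 papers design is mirrored in the array shape.

```python
# Minimal sketch: Pearson correlation between whole mass spectra and a PCA
# of the same spectra. The intensities are random stand-ins, not LDI-MS data.
import numpy as np

rng = np.random.default_rng(0)
# 9 spectra (3 inks x 3 papers), 500 m/z bins, synthetic intensities.
spectra = rng.random((9, 500))

# Pairwise Pearson correlation of whole spectra (9 x 9 matrix).
corr = np.corrcoef(spectra)
print("correlation, spectrum 0 vs spectrum 1:", round(corr[0, 1], 3))

# PCA via SVD of the mean-centred spectra; scores on the first two
# components can then be inspected for clustering by ink or paper.
centred = spectra - spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
scores = U * S                     # sample scores in principal-component space
print("PC1/PC2 scores of first spectrum:", scores[0, :2])
```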
Abstract:
Neuropathic pain syndromes are characterized by intense and long-lasting pain that is resistant to usual analgesics. Patients are therefore at high risk of decreased quality of life and impaired well-being. Using a case report, we consider in this article the diagnosis and treatment of neuropathic pain as well as its impact on quality of life, including psychological consequences such as depression and anxiety. We present simple and reliable scales that can help the general practitioner evaluate the neuropathic component of a pain syndrome and its related psychiatric co-morbidities. This comprehensive approach to pain management should facilitate communication with the patient and help the practitioner select the most appropriate therapeutic strategy, notably the prescription of antidepressants, whose efficacy is discussed at the end of the article.
Abstract:
Bio-nano interactions can be defined as the study of interactions between nanoscale entities and biological systems such as, but not limited to, peptides, proteins, lipids, DNA and other biomolecules, cells and cellular receptors, and organisms including humans. Studying bio-nano interactions is particularly useful for understanding engineered materials that have at least one dimension in the nanoscale. Such materials may consist of discrete particles or nanostructured surfaces. Much of biology functions at the nanoscale; therefore, our ability to manipulate materials so that they are taken up at the nanoscale, and engage biological machinery in a designed and purposeful manner, opens new vistas for more efficient diagnostics, therapeutics (treatments) and tissue regeneration, so-called nanomedicine. Additionally, the ability of nanomaterials to interact with and be taken up by cells allows them to be used as probes and tools to advance our understanding of cellular functioning. Yet, because this is a new technology, the safety of nanomaterials and the applicability of existing regulatory frameworks to them must be investigated in parallel with the development of novel applications. The Royal Society meeting 'Bio-nano interactions: new tools, insights and impacts' provided an important platform for open dialogue on the current state of knowledge on these issues, bringing together scientists and industry, regulatory and legal experts to concretize existing discourse in science, law and policy. This paper summarizes these discussions and the insights that emerged.
Abstract:
The ultimate aim of the presented project is to use genetically modified T cells or mesenchymal stem cells to locally overexpress the two chemokines CXCL13 and CCL2, together or each one alone, inside a solid tumor. CXCL13 is expected to induce ectopic lymphoid structures, and a high level of CCL2 is intended to trigger acute inflammation. The combination of these two effects represents a new model for studying the mechanisms that regulate peripheral tolerance and tumor immunity. The insights gained may help to develop or improve immunotherapy of cancer. The primary goal of the work was the establishment of a genetic mouse model that allows tumor-specific expression of high levels of the two chemokines of interest. To accomplish this task, which represents gene therapy of solid tumors, two types of potentially useful carrier cells were evaluated: CD8+ T cells and mesenchymal bone marrow cells, to be used in adoptive cell transfers into tumor-bearing mice. Irrespective of the envisaged immunotherapy, meeting these so far unmet needs of gene therapy would provide a highly valuable tool that could be employed by many other therapeutic approaches as well. Several transgenic mouse lines were generated as a source of CD8+ T cells modified to express the chemokines of interest. In a double-transgenic approach, the properties of two T cell-specific promoters were combined using Cre-loxP technology: the granzyme B promoter confers activation dependency, and the lck distal promoter assures strong constitutive expression once the CD8+ T cell has been activated. The constructed transgenes performed well in vivo, and mice expressing CCL2 in activated CD8+ T cells were obtained. These cells can now be used with different protocols for adoptively transferring cytotoxic T cells (CTL) into tumor-bearing recipients, allowing their capacity as tumor-infiltrating chemokine carriers to be studied. The establishment of transgenic mice likewise expressing CXCL13 is expected in the near future. In addition, T cells from the single transgenic mice generated, which show high expression of an EGFP reporter in both CD4+ and CD8+ cells, can easily be traced in vivo when setting up adoptive transfer conditions. The evaluation of mesenchymal bone marrow cells demonstrated that these cells can efficiently engraft into tumor stroma upon local co-injection with tumor cells. This represents a valuable research tool, as it allows manipulated stromal cells to be introduced into a tumor model; the established engraftment model is therefore suited for studying the envisaged immunotherapy. These results confirm, to some extent, previously reported results in an improved model; however, the suggested efficiency and specificity of systemic tumor homing by mesenchymal bone marrow cells were not observed in our model, indicating that these cells may not be suited for therapeutic use. Another major result of this work is the establishment of tumor-conditioned in vitro culture of mesenchymal bone marrow cells, which allowed these cells to be expanded more rapidly while maintaining their tumor-homing and engrafting capacities. This offers another valuable tool, as in vitro culture is a necessary step for therapeutic manipulation.
Abstract:
Purpose: To evaluate the diagnostic value and image quality of CT with filtered back projection (FBP) compared with adaptive statistical iterative reconstruction (ASIR) in body stuffers with ingested cocaine-filled packets. Methods and Materials: Twenty-nine body stuffers (mean age 31.9 years, 3 women) suspected of having ingested cocaine-filled packets underwent routine-dose 64-row multidetector CT with FBP (120 kV, pitch 1.375, 100-300 mA with automatic tube current modulation (auto mA), rotation time 0.7 s, collimation 2.5 mm), secondarily reconstructed with 30% and 60% ASIR. In 13 (44.83%) of the body stuffers, cocaine-filled packets were detected, confirmed by exact analysis of the faecal content including verification of the number of packets (range 1-25). Three radiologists independently and blindly evaluated the anonymized CT examinations (29 FBP-CT and 68 ASIR-CT) for the presence and number of cocaine-filled packets, indicating their confidence, and graded them for diagnostic quality, image noise and sharpness. Sensitivity, specificity, the area under the receiver operating characteristic (ROC) curve (Az) and interobserver agreement among the 3 radiologists were calculated for FBP-CT and ASIR-CT. Results: Increasing the ASIR percentage significantly reduced objective image noise (p < 0.001). Overall sensitivity and specificity for the detection of cocaine-filled packets were 87.72% and 76.15%, respectively. The difference in ROC area (Az) between the reconstruction techniques was significant (p = 0.0101): 0.938 for FBP-CT, 0.916 for 30% ASIR-CT and 0.894 for 60% ASIR-CT. Conclusion: Despite the evident image-noise reduction obtained with ASIR, the diagnostic value for detecting cocaine-filled packets decreases, depending on the ASIR percentage applied.
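The reader-study metrics reported above can be illustrated with a small sketch: the Python below computes sensitivity, specificity and Az (here obtained as the Mann-Whitney statistic over confidence scores) from invented truth labels and reader ratings, not from the study data.

```python
# Minimal sketch of reader-study metrics: sensitivity, specificity and the
# area under the ROC curve (Az) computed from confidence scores.
# Truth labels and scores below are invented.

def sens_spec(truth, positive_calls):
    tp = sum(t and c for t, c in zip(truth, positive_calls))
    tn = sum((not t) and (not c) for t, c in zip(truth, positive_calls))
    fp = sum((not t) and c for t, c in zip(truth, positive_calls))
    fn = sum(t and (not c) for t, c in zip(truth, positive_calls))
    return tp / (tp + fn), tn / (tn + fp)

def auc(truth, scores):
    """Az as the probability that a positive case outscores a negative one."""
    pos = [s for t, s in zip(truth, scores) if t]
    neg = [s for t, s in zip(truth, scores) if not t]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

truth  = [1, 1, 1, 0, 0, 0, 0, 1]      # packets present (1) or absent (0)
scores = [5, 4, 3, 2, 1, 3, 2, 5]      # reader confidence, 1-5
calls  = [s >= 3 for s in scores]      # threshold for a positive call
print("sensitivity, specificity:", sens_spec(truth, calls))
print("Az:", auc(truth, scores))
```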
Abstract:
Uncertainty quantification of petroleum reservoir models is one of the present challenges, and it is usually approached with a wide range of geostatistical tools linked with statistical optimisation and/or inference algorithms. Recent advances in machine learning offer a novel approach to modelling the spatial distribution of petrophysical properties in complex reservoirs as an alternative to geostatistics. The approach is based on semi-supervised learning, which handles both "labelled" observed data and "unlabelled" data that have no measured value but describe prior knowledge and other relevant information in the form of manifolds in the input space where the modelled property is continuous. The proposed semi-supervised Support Vector Regression (SVR) model has demonstrated its capability to represent realistic geological features and to describe the stochastic variability and non-uniqueness of spatial properties. It is also able to capture and preserve key spatial dependencies, such as the connectivity of high-permeability geo-bodies, which is often difficult in contemporary petroleum reservoir studies. As a data-driven algorithm, semi-supervised SVR is designed to integrate various kinds of conditioning information and learn dependencies from them. The semi-supervised SVR model is able to balance signal and noise levels and to control the prior belief in the available data. In this work, the stochastic semi-supervised SVR geomodel is integrated into a Bayesian framework to quantify the uncertainty of reservoir production with multiple models fitted to past dynamic observations (production history). Multiple history-matched models are obtained using stochastic sampling and/or MCMC-based inference algorithms, which evaluate the posterior probability distribution. The uncertainty of the model is described by the posterior probability of the model parameters that represent key geological properties: spatial correlation size, continuity strength, and smoothness/variability of the spatial property distribution. The developed approach is illustrated with a fluvial reservoir case. The resulting probabilistic production forecasts are described by uncertainty envelopes. The paper compares the performance of models with different combinations of unknown parameters and discusses sensitivity issues.
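As a minimal illustration of the Bayesian history-matching step described above, the sketch below runs a Metropolis sampler over a single hypothetical geological parameter with a Gaussian misfit to an invented production history. The toy decline-curve "simulator" merely stands in for the semi-supervised SVR geomodel and flow simulation used in the study.

```python
# Minimal Metropolis sketch: sample the posterior of one geological
# parameter (a correlation-length value) given a Gaussian misfit between
# simulated and observed production. All numbers are hypothetical.
import math, random

observed = [10.0, 9.2, 8.5, 7.9]            # invented production history

def simulate(corr_length):
    """Toy forward model: a decline curve controlled by the parameter."""
    return [10.0 * math.exp(-0.02 * corr_length * t) for t in range(4)]

def log_likelihood(corr_length, sigma=0.3):
    mis = sum((o - s) ** 2 for o, s in zip(observed, simulate(corr_length)))
    return -mis / (2 * sigma ** 2)

random.seed(1)
current, samples = 5.0, []
for _ in range(5000):
    proposal = current + random.gauss(0.0, 0.5)      # symmetric random walk
    if proposal > 0:
        log_alpha = log_likelihood(proposal) - log_likelihood(current)
        if log_alpha >= 0 or random.random() < math.exp(log_alpha):
            current = proposal
    samples.append(current)

burned = samples[1000:]                              # discard burn-in
print("posterior mean correlation length:", sum(burned) / len(burned))
```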
Abstract:
Familial searching consists of searching a National DNA Database (NDNAD) for a full profile left at a crime scene. In this paper we are interested in the circumstance where no full match is returned, but a partial match is found between a database member's profile and the crime stain. Because close relatives share more of their DNA than unrelated persons, this partial match may indicate that the crime stain was left by a close relative of the person with whom the partial match was found. This approach has successfully solved important crimes in the UK and the USA. In a previous paper, a model that takes into account substructure and siblings was used to simulate a NDNAD. In this paper, we have used this model to test the usefulness of familial searching and to offer guidelines for pre-assessment of cases based on the likelihood ratio. Siblings of "persons" present in the simulated Swiss NDNAD were created. These profiles (N = 10,000) were used as traces and were then compared to the whole database (N = 100,000). The statistical results obtained show that the technique has great potential, confirming the findings of previous studies. However, the effectiveness of the technique is only one part of the story. Familial searching has juridical and ethical aspects that should not be ignored. In Switzerland, for example, there are no specific guidelines on the legality or otherwise of familial searching. This article both presents statistical results and addresses criminological and civil-liberties aspects, taking into account the risks and benefits of familial searching.
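As a sketch of the likelihood-ratio pre-assessment mentioned above, the Python below computes a single-locus full-sibling likelihood ratio (IBD probabilities 1/4, 1/2, 1/4) versus an unrelated donor for hypothetical allele frequencies and genotypes. It illustrates the standard kinship calculation only; it is not the model used to simulate the Swiss NDNAD.

```python
# Minimal sketch of a single-locus full-sibling likelihood ratio for
# familial-searching pre-assessment. Frequencies and genotypes are invented.

def hw(genotype, freqs):
    """Hardy-Weinberg probability of an unordered genotype."""
    a, b = genotype
    return freqs[a] ** 2 if a == b else 2 * freqs[a] * freqs[b]

def p_given_one_ibd(g_trace, g_candidate, freqs):
    """P(trace genotype | candidate genotype, exactly one allele shared IBD)."""
    c, d = g_trace
    total = 0.0
    for shared in g_candidate:          # each candidate allele is the shared one with prob 1/2
        if shared == c == d:
            total += 0.5 * freqs[c]
        elif shared == c:
            total += 0.5 * freqs[d]
        elif shared == d:
            total += 0.5 * freqs[c]
    return total

def sibling_lr(g_trace, g_candidate, freqs):
    """LR for 'full sibling' vs 'unrelated' with IBD weights 1/4, 1/2, 1/4."""
    same = sorted(g_trace) == sorted(g_candidate)
    p_trace = hw(g_trace, freqs)
    return (0.25
            + 0.5 * p_given_one_ibd(g_trace, g_candidate, freqs) / p_trace
            + 0.25 * (1.0 / p_trace if same else 0.0))

freqs = {12: 0.10, 13: 0.25, 14: 0.30, 15: 0.35}   # hypothetical STR allele frequencies
print(sibling_lr((12, 13), (12, 13), freqs))        # fully shared genotype -> LR >> 1
print(sibling_lr((14, 15), (12, 13), freqs))        # no shared alleles -> LR = 0.25
```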