45 results for statistical physics


Relevance: 20.00%

Abstract:

Purpose: To evaluate the diagnostic value and image quality of CT with filtered back projection (FBP) compared with adaptive statistical iterative reconstruction (ASIR) in body stuffers with ingested cocaine-filled packets.

Methods and Materials: Twenty-nine body stuffers (mean age 31.9 years, 3 women) suspected of having ingested cocaine-filled packets underwent routine-dose 64-row multidetector CT with FBP (120 kV, pitch 1.375, 100-300 mA with automatic tube current modulation (auto mA), rotation time 0.7 s, collimation 2.5 mm), secondarily reconstructed with 30% and 60% ASIR. In 13 (44.83%) of the body stuffers, cocaine-filled packets were detected and confirmed by exact analysis of the faecal content, including verification of their number (range 1-25). Three radiologists independently and blindly evaluated the anonymized CT examinations (29 FBP-CT and 68 ASIR-CT) for the presence and number of cocaine-filled packets, indicating their confidence, and graded them for diagnostic quality, image noise, and sharpness. Sensitivity, specificity, area under the receiver operating characteristic (ROC) curve (Az), and interobserver agreement among the three radiologists were calculated for FBP-CT and ASIR-CT.

Results: Increasing the ASIR percentage significantly diminished objective image noise (p < 0.001). Overall sensitivity and specificity for the detection of cocaine-filled packets were 87.72% and 76.15%, respectively. The difference in ROC area Az among the reconstruction techniques was significant (p = 0.0101): 0.938 for FBP-CT, 0.916 for 30% ASIR-CT, and 0.894 for 60% ASIR-CT.

Conclusion: Despite the evident image-noise reduction obtained with ASIR, the diagnostic value for detecting cocaine-filled packets decreases with increasing ASIR percentage.
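As a hedged illustration of how such reader-study metrics can be computed, the sketch below derives sensitivity, specificity, and the ROC area (Az) from per-case ground truth and reader confidence scores; the arrays, the 1-5 confidence scale, and the decision threshold are invented for the example, and scikit-learn is assumed to be available.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    # Hypothetical reader data: truth = 1 if packets were actually present,
    # scores = reader confidence on a 1-5 scale (values invented).
    truth = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 1])
    scores = np.array([5, 4, 2, 5, 1, 3, 4, 2, 1, 5])

    pred = scores >= 3                              # call "positive" at confidence >= 3
    sensitivity = (pred & (truth == 1)).sum() / (truth == 1).sum()
    specificity = (~pred & (truth == 0)).sum() / (truth == 0).sum()
    az = roc_auc_score(truth, scores)               # area under the ROC curve (Az)
    print(sensitivity, specificity, az)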

Relevance: 20.00%

Abstract:

Over the last decade, the development of statistical models in support of forensic fingerprint identification has attracted increasing research attention, spurred on recently by commentators who claim that the scientific basis for fingerprint identification has not been adequately demonstrated. Such models are increasingly seen as useful tools in support of the fingerprint identification process, within or in addition to the ACE-V framework. This paper provides a critical review of recent statistical models from both a practical and a theoretical perspective, covering models of two different methodologies: Probability of Random Correspondence (PRC) models, which calculate the probability of occurrence of fingerprint configurations in a given population, and Likelihood Ratio (LR) models, which analyze corresponding features of fingerprints to derive a likelihood ratio representing the evidential weight for a potential source.
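For reference, the likelihood ratio at the core of LR models takes the standard form below (a generic formulation, not tied to any particular model in the review), where E denotes the observed correspondence between mark and print, H_p the proposition of a common source, and H_d the proposition of different sources:

    \[
    \mathrm{LR} = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}
    \]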

Relevance: 20.00%

Abstract:

Synchrotron radiation X-ray tomographic microscopy is a nondestructive method providing ultra-high-resolution 3D digital images of rock microstructures. We describe this method and, to demonstrate its wide applicability, we present 3D images of very different rock types: Berea sandstone, Fontainebleau sandstone, dolomite, calcitic dolomite, and three-phase magmatic glasses. For some samples, full and partial saturation scenarios are considered using oil, water, and air. The rock images precisely reveal the 3D rock microstructure, the pore space morphology, and the interfaces between fluids saturating the same pore. We provide the raw image data sets as online supplementary material, along with laboratory data describing the rock properties. By making these data sets available to other research groups, we aim to stimulate work based on digital rock images of high quality and high resolution. We also discuss and suggest possible applications and research directions that can be pursued on the basis of our data.
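One elementary use of such data sets is computing porosity, the pore-voxel fraction of the total volume, from a segmented image. The sketch below is a minimal example in NumPy; the binary volume and its label convention (True = pore, False = grain) are assumptions for illustration, since real data sets first require segmentation.

    import numpy as np

    # Stand-in for a segmented tomographic volume: True = pore, False = grain.
    seg = np.random.default_rng(0).random((200, 200, 200)) < 0.18

    porosity = seg.mean()          # fraction of pore voxels in the volume
    print(f"porosity = {porosity:.3f}")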

Relevance: 20.00%

Abstract:

Familial searching consists of searching a National DNA Database (NDNAD) with a full profile left at a crime scene. In this paper we are interested in the circumstance where no full match is returned, but a partial match is found between a database member's profile and the crime stain. Because close relatives share more of their DNA than unrelated persons, such a partial match may indicate that the crime stain was left by a close relative of the person with whom the partial match was found. This approach has successfully solved important crimes in the UK and the USA. In a previous paper, a model taking substructure and siblings into account was used to simulate a NDNAD. Here we use this model to test the usefulness of familial searching and to offer guidelines for case pre-assessment based on the likelihood ratio. Siblings of "persons" present in the simulated Swiss NDNAD were created; these profiles (N = 10,000) were used as traces and compared to the whole database (N = 100,000). The statistical results show that the technique has great potential, confirming the findings of previous studies. However, effectiveness is only one part of the story: familial searching has legal and ethical aspects that should not be ignored. In Switzerland, for example, there are no specific guidelines as to the legality or otherwise of familial searching. This article therefore presents statistical results and also addresses the criminological and civil-liberties aspects, weighing the risks and benefits of familial searching.
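In the sibling scenario, the likelihood ratio used for pre-assessment compares the two competing kinship hypotheses (a generic formulation, independent of the simulation details), with G_c the crime-stain profile and G_m the partially matching database member's profile:

    \[
    \mathrm{LR} = \frac{\Pr(G_c, G_m \mid \text{siblings})}{\Pr(G_c, G_m \mid \text{unrelated})}
    \]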

Relevance: 20.00%

Abstract:

In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration, and voxel-wise statistical analysis of autoradiographs of mouse brain sections. The tool is developed within the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image-processing chain from raw data acquisition to 3D statistical group analysis. Results of a group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.
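As a minimal sketch of the voxel-wise statistical analysis step, and not of JULIDE's actual C++/ITK implementation, a two-sample t-test can be run independently at every voxel of the registered, standardized volumes; the group arrays below are hypothetical.

    import numpy as np
    from scipy import stats

    # Hypothetical registered autoradiograph volumes: shape (subjects, x, y, z).
    group_a = np.random.default_rng(1).normal(100, 10, (8, 40, 50, 30))
    group_b = np.random.default_rng(2).normal(105, 10, (8, 40, 50, 30))

    # Independent two-sample t-test at each voxel (axis 0 indexes subjects).
    t_map, p_map = stats.ttest_ind(group_a, group_b, axis=0)
    print((p_map < 0.001).sum(), "voxels below the uncorrected threshold")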

Relevance: 20.00%

Abstract:

Owing to their performance-enhancing properties, anabolic steroids (e.g. testosterone, nandrolone) are banned in elite sports. Doping control laboratories accredited by the World Anti-Doping Agency (WADA) therefore screen urine for these prohibited substances, among others. Detecting the misuse of naturally occurring anabolic steroids such as testosterone (T), a popular ergogenic agent in sports and society, is particularly challenging. To screen for misuse of these compounds, drug-testing laboratories monitor the urinary concentrations of endogenous steroid metabolites and their ratios, which constitute the steroid profile, and compare them with reference ranges to detect unnaturally high values. Interpretation of the steroid profile is difficult, however, because of large inter-individual variance, numerous confounding factors, and the variety of marketed endogenous steroids, each of which influences the profile differently. A support vector machine (SVM) algorithm was developed to statistically evaluate urinary steroid profiles composed of an extended range of steroid-profile metabolites. This model makes the interpretation of the analytical data feasible in the search for deviating steroid profiles and is versatile towards different kinds of misused endogenous steroids. The SVM model outperforms the current biomarkers in detection sensitivity and accuracy, particularly when coupled to individual data as stored in the Athlete Biological Passport.
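A minimal sketch of such a classifier is shown below, assuming a feature matrix X of steroid-profile metabolites and ratios and labels y (1 = deviating, 0 = normal); the synthetic data, feature count, and kernel choice are illustrative assumptions, not the published model.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 12))                   # 12 hypothetical profile features
    y = (X[:, 0] + 0.5 * X[:, 3] > 1).astype(int)    # synthetic labels

    # Standardize features, then fit an RBF-kernel SVM with probability outputs.
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    model.fit(X, y)
    print(model.predict_proba(X[:1])[0, 1])          # P(deviating profile) for one sample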

Relevance: 20.00%

Abstract:

Over the last two decades, CT arthrography has benefited from numerous technological advances and now represents an excellent alternative to magnetic resonance imaging (MRI) and/or MR arthrography for evaluating hip pathology. However, it remains limited by its substantial exposure to ionizing radiation. Iterative reconstruction (IR) techniques have recently been implemented successfully in CT; the literature shows that, in spine CT, their use reduces dose by about 40-55% compared with current protocols using filtered back projection (FBP). To our knowledge, the use of IR techniques in hip CT arthrography has not been evaluated so far. The aim of our study was to assess the impact of the ASIR technique (GE Healthcare) on objective and subjective image quality in hip CT arthrography and to evaluate its dose-reduction potential. Thirty-seven patients undergoing hip CT arthrography were randomized into three groups: standard dose (CTDIvol = 38.4 mGy) and two reduced-dose groups (CTDIvol = 24.6 or 15.4 mGy). Images were reconstructed with filtered back projection (FBP) and with increasing percentages of ASIR (30, 50, 70, and 90%). Noise and the contrast-to-noise ratio (CNR) were measured. Two musculoskeletal radiologists independently assessed image quality for several anatomical structures using a four-point scale; they also evaluated labral and articular cartilage lesions. The results show that noise increases (p = 0.0009) and CNR decreases (p = 0.001) significantly as dose decreases. Conversely, noise decreases (p = 0.0001) and CNR increases (p < 0.003) significantly as the ASIR percentage increases; image quality scores also increase significantly for the labrum, cartilage, subchondral bone, overall image quality (above 50% ASIR), and noise (p < 0.04), with a significant reduction for trabecular bone and muscle (p < 0.03). Regardless of dose level, there was no significant difference in the detection and characterization of labral lesions (n = 24, p = 1) or cartilage lesions (n = 40, p > 0.89) as a function of ASIR percentage. Our work shows that using more than 50% ASIR significantly reduces the radiation dose delivered to the patient during hip CT arthrography while maintaining diagnostic image quality comparable to a standard-dose protocol using filtered back projection.
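For reference, the contrast-to-noise ratio reported in the study is conventionally defined as follows, where S_A and S_B are the mean attenuation values in two regions of interest and σ is the image noise (the standard deviation measured in a homogeneous region):

    \[
    \mathrm{CNR} = \frac{\lvert S_A - S_B \rvert}{\sigma}
    \]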

Relevance: 20.00%

Abstract:

Computed tomography (CT) is an imaging technique in which interest has been growing steadily since its introduction in the early 1970s. In the clinical environment, this imaging system has become a gold-standard modality because of its high sensitivity in producing accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To ensure that the benefit-risk balance remains in the patient's favor, it is important to balance image quality against dose and to avoid unnecessary patient exposure.

If this balance is important for adults, it should be an absolute priority for children undergoing CT examinations, especially for patients suffering from diseases that require several follow-up examinations over their lifetime. Indeed, children and young adults are more sensitive to ionizing radiation and have a longer life expectancy than adults; for this population, the risk of developing a radiation-induced cancer, whose latency period can exceed twenty years, is significantly higher. Assuming that each examination is justified, it becomes a priority to optimize CT acquisition protocols so as to minimize the dose delivered to the patient. CT technology is advancing at a rapid pace, and since 2009 new iterative image reconstruction techniques, known as statistical iterative reconstructions, have been introduced to decrease patient exposure and improve image quality.

The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce the dose delivered during CT examinations of children and young adults as much as possible while preserving diagnostic image quality, in order to propose optimized protocols.

Optimizing a CT protocol requires evaluating both the delivered dose and the image quality needed for diagnosis. While the dose is estimated using CT indices (CTDIvol and DLP), the particularity of this work was to use two radically different approaches to evaluate image quality. The first, "physical" approach computes physical metrics (SD, MTF, NPS, etc.) measured under well-defined conditions, most often on phantoms. Although this approach is limited because it does not take the radiologist's perception into account, it characterizes certain image properties quickly and simply. The second, "clinical" approach is based on the evaluation of anatomical structures (diagnostic criteria) present in patient images: radiologists involved in the assessment score the diagnostic quality of these structures using a simple rating scale. This approach is complicated and time-consuming to implement, but it has the advantage of being very close to the radiologist's practice and can be considered the reference method.

Among the main results of this work, it was shown that the statistical iterative reconstruction algorithms studied clinically (ASIR, VEO) have a strong potential to reduce CT dose (by up to 90%). However, by their very mechanism, they modify the appearance of the image, producing a change in texture that may affect diagnostic quality. By comparing the results of the "clinical" and "physical" approaches, it was shown that this change in texture corresponds to a modification of the noise power spectrum, whose analysis makes it possible to anticipate or avoid a loss of diagnostic quality. This work also demonstrates that integrating these new reconstruction techniques into clinical practice cannot be done simply on the basis of protocols designed for conventional reconstructions. The conclusions of this work, together with the image quality tools developed, can guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
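Because the noise power spectrum (NPS) is central to the texture analysis described above, here is a minimal sketch of the standard 2D NPS estimator computed from an ensemble of noise-only regions of interest; the function name and ROI handling are illustrative, not the thesis's actual tooling.

    import numpy as np

    def noise_power_spectrum(rois, pixel_size_mm):
        # 2D NPS from noise-only ROIs: ensemble-averaged squared magnitude
        # of the Fourier transform, scaled by pixel area over ROI size.
        nx, ny = rois[0].shape
        acc = np.zeros((nx, ny))
        for roi in rois:
            detrended = roi - roi.mean()               # remove the DC component
            acc += np.abs(np.fft.fft2(detrended)) ** 2
        return (pixel_size_mm ** 2) / (nx * ny) * acc / len(rois)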

Relevance: 20.00%

Abstract:

This paper presents and discusses the use of Bayesian procedures - introduced through the use of Bayesian networks in Part I of this series of papers - for 'learning' probabilities from data. The discussion will relate to a set of real data on characteristics of black toners commonly used in printing and copying devices. Particular attention is drawn to the incorporation of the proposed procedures as an integral part in probabilistic inference schemes (notably in the form of Bayesian networks) that are intended to address uncertainties related to particular propositions of interest (e.g., whether or not a sample originates from a particular source). The conceptual tenets of the proposed methodologies are presented along with aspects of their practical implementation using currently available Bayesian network software.
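As a minimal illustration of 'learning' a probability from data in the Bayesian way, consider the generic Dirichlet-multinomial update (a textbook scheme, not the paper's specific networks): the posterior mean of each category probability combines observed counts with prior pseudo-counts.

    import numpy as np

    def dirichlet_posterior_mean(counts, alpha=1.0):
        # Posterior mean under a symmetric Dirichlet(alpha) prior;
        # alpha = 1 corresponds to Laplace smoothing.
        counts = np.asarray(counts, dtype=float)
        return (counts + alpha) / (counts.sum() + alpha * len(counts))

    # e.g. a binary toner characteristic observed in 73 of 100 samples (invented counts):
    print(dirichlet_posterior_mean([73, 27]))        # -> [0.7255, 0.2745]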

Relevance: 20.00%

Abstract:

PURPOSE: In the radiopharmaceutical therapy approach to the fight against cancer, in particular when it comes to translating laboratory results to the clinical setting, modeling has served as an invaluable tool for guidance and for understanding the processes operating at the cellular level and how these relate to macroscopic observables. Tumor control probability (TCP) is the dosimetric end-point quantity of choice for relating to experimental and clinical data: it requires knowledge of individual cellular absorbed doses, since it depends on assessing the treatment's ability to kill each and every cell. Macroscopic tumors, seen in both clinical and experimental studies, contain too many cells to be modeled individually in Monte Carlo simulation; yet, particularly at low ratios of decays to cells, a cell-based model that does not smooth away the statistical fluctuations associated with low activity is a necessity. The authors present an adaptation of the simple sphere-based model from which cellular-level dosimetry for macroscopic tumors, and end-point quantities such as TCP, may be extrapolated more reliably.

METHODS: Ten homogeneous spheres representing tumors of different sizes were constructed in GEANT4. Decays of the radionuclide 131I were simulated at random locations for each model size and for seven ratios of the number of decays to the number of cells, Nr: 1000, 500, 200, 100, 50, 20, and 10 decays per cell. The deposited energy was collected in radial bins and divided by the bin mass to obtain the average bin absorbed dose. To simulate a cellular model, the number of cells present in each bin was calculated, and each cell was attributed an absorbed dose equal to the bin average absorbed dose plus a randomly determined adjustment drawn from a Gaussian probability distribution whose width equals the statistical uncertainty consistent with the ratio of decays to cells, i.e., Nr^(-1/2). From dose-volume histograms, the surviving fraction of cells, the equivalent uniform dose (EUD), and the TCP were calculated for the different scenarios. Comparably sized spherical models containing individual spherical cells (15 µm diameter) in hexagonal lattices were constructed, and Monte Carlo simulations were executed for all the same scenarios. The dosimetric quantities were calculated and compared to the adjusted simple-sphere model results. The model was then applied to the Bortezomib-induced enzyme-targeted radiotherapy (BETR) strategy of targeting Epstein-Barr virus (EBV)-expressing cancers.

RESULTS: The TCP values agreed to within 2% between the adjusted simple-sphere and full cellular models. Models were also generated for a nonuniform distribution of activity, and the adjusted spherical and cellular models showed similar agreement. The TCP values predicted for macroscopic tumors were consistent with experimental observations for BETR-treated 1 g EBV-expressing lymphoma tumors in mice.

CONCLUSIONS: The adjusted spherical model presented here provides more accurate TCP values than simple spheres, on par with full cellular Monte Carlo simulations, while maintaining the simplicity of the simple-sphere model. It provides a basis for complementing and understanding laboratory and clinical results pertaining to radiopharmaceutical therapy.
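A compact sketch of the described adjustment and the resulting TCP is given below; the linear-quadratic survival parameters and the example doses are invented placeholders, not the paper's values.

    import numpy as np

    rng = np.random.default_rng(0)

    def cell_doses(bin_dose_gy, cells_per_bin, decays_per_cell):
        # Each cell gets its bin's mean dose plus Gaussian noise of relative
        # width 1/sqrt(Nr), mimicking the statistical adjustment above.
        rel_sigma = 1.0 / np.sqrt(decays_per_cell)
        doses = [d * (1.0 + rel_sigma * rng.standard_normal(n))
                 for d, n in zip(bin_dose_gy, cells_per_bin)]
        return np.clip(np.concatenate(doses), 0.0, None)

    def tcp(doses_gy, alpha=0.3, beta=0.03):
        # Poisson TCP with linear-quadratic survival (illustrative alpha, beta).
        surviving = np.exp(-alpha * doses_gy - beta * doses_gy ** 2)
        return float(np.exp(-surviving.sum()))

    doses = cell_doses([50.0, 40.0], [1000, 800], decays_per_cell=100)
    print(tcp(doses))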

Relevance: 20.00%

Abstract:

Swain (1975) corrects the chi-square overidentification test (i.e., the likelihood-ratio test of fit) for structural equation models with or without latent variables. The chi-square statistic is asymptotically correct; however, it does not behave as expected in small samples and/or when the model is complex (cf. Herzog, Boomsma, & Reinecke, 2007). Thus, particularly when the ratio of sample size (n) to the number of estimated parameters (p) is relatively small (i.e., the p to n ratio is large), the chi-square test tends to overreject correctly specified models. To obtain a closer approximation to the chi-square distribution, Swain developed a correction: a scaling factor, which converges to 1 asymptotically, is multiplied with the chi-square statistic. The corrected statistic better approximates the chi-square distribution, resulting in more appropriate Type I error rates (see Herzog & Boomsma, 2009; Herzog et al., 2007).
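A small function implementing the correction is sketched below. The constants follow the form reported by Herzog and Boomsma (2009); treat them as an assumption to verify against the original sources before use. Here n is the sample size, p the number of observed variables, and k the number of free parameters.

    from math import sqrt

    def swain_corrected_chisq(chisq, n, p, k):
        d = p * (p + 1) / 2 - k                       # model degrees of freedom
        q = (sqrt(1 + 8 * k) - 1) / 2
        s = 1 - (p * (2 * p**2 + 3 * p - 1)
                 - q * (2 * q**2 + 3 * q - 1)) / (12 * d * n)
        return s * chisq                              # s -> 1 as n grows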

Relevance: 20.00%

Abstract:

Particle physics studies highly complex processes which cannot be directly observed. Scientific realism claims that we are nevertheless warranted in believing that these processes really occur and that the objects involved in them really exist. This dissertation defends a version of scientific realism, called causal realism, in the context of particle physics. I start by introducing the central theses and arguments in the recent philosophical debate on scientific realism (chapter 1), with a special focus on an important presupposition of the debate, namely common sense realism. Chapter 2 then discusses entity realism, which introduces a crucial element into the debate by emphasizing the importance of experiments in defending scientific realism. Most of the chapter is concerned with Ian Hacking's position, but I also argue that Nancy Cartwright's version of entity realism is ultimately preferable as a basis for further development. In chapter 3, I take a step back and consider the question whether the realism debate is worth pursuing at all. Arthur Fine has given a negative answer to that question, proposing his natural ontological attitude as an alternative to both realism and antirealism. I argue that the debate (in particular the realist side of it) is in fact less vicious than Fine presents it. The second part of my work (chapters 4-6) develops, illustrates and defends causal realism. The key idea is that inference to the best explanation is reliable in some cases, but not in others. Chapter 4 characterizes the difference between these two kinds of cases in terms of three criteria which distinguish causal from theoretical warrant. In order to flesh out this distinction, chapter 5 then applies it to a concrete case from the history of particle physics, the discovery of the neutrino. This case study shows that the distinction between causal and theoretical warrant is crucial for understanding what it means to "directly detect" a new particle. But the distinction is also an effective tool against what I take to be the presently most powerful objection to scientific realism: Kyle Stanford's argument from unconceived alternatives. I respond to this argument in chapter 6, and I illustrate my response with a discussion of Jean Perrin's experimental work concerning the atomic hypothesis. In the final part of the dissertation, I turn to the specific challenges posed to realism by quantum theories. One of these challenges comes from the experimental violations of Bell's inequalities, which indicate a failure of locality in the quantum domain. I show in chapter 7 how causal realism can further our understanding of quantum non-locality by taking account of some recent experimental results. Another challenge to realism in quantum mechanics comes from delayed-choice experiments, which seem to imply that certain aspects of what happens in an experiment can be influenced by later choices of the experimenter. Chapter 8 analyzes these experiments and argues that they do not warrant the antirealist conclusions which some commentators draw from them. It pays particular attention to the case of delayed-choice entanglement swapping and the corresponding question whether entanglement is a real physical relation. In chapter 9, I finally address relativistic quantum theories. It is often claimed that these theories are incompatible with a particle ontology, and this calls into question causal realism's commitment to localizable and countable entities. I defend the commitments of causal realism against these objections, and I conclude with some remarks connecting the interpretation of quantum field theory to more general metaphysical issues confronting causal realism.