14 results for gold standard
at Duke University
Abstract:
INTRODUCTION: The characterization of urinary calculi using noninvasive methods has the potential to affect clinical management. CT remains the gold standard for diagnosis of urinary calculi, but has not reliably differentiated varying stone compositions. Dual-energy CT (DECT) has emerged as a technology to improve CT characterization of anatomic structures. This study aims to assess the ability of DECT to accurately discriminate between different types of urinary calculi in an in vitro model using novel postimage acquisition data processing techniques. METHODS: Fifty urinary calculi were assessed, of which 44 had ≥60% composition of one component. DECT was performed utilizing 64-slice multidetector CT. The attenuation profiles of the lower-energy (DECT-Low) and higher-energy (DECT-High) datasets were used to investigate whether differences could be seen between different stone compositions. RESULTS: Postimage acquisition processing allowed for identification of the main different chemical compositions of urinary calculi: brushite, calcium oxalate-calcium phosphate, struvite, cystine, and uric acid. Statistical analysis demonstrated that this processing identified all stone compositions without obvious graphical overlap. CONCLUSION: Dual-energy multidetector CT with postprocessing techniques allows for accurate discrimination among the main different subtypes of urinary calculi in an in vitro model. The ability to better detect stone composition may have implications in determining the optimum clinical treatment modality for urinary calculi from noninvasive, preprocedure radiological assessment.
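As a rough illustration of the kind of post-acquisition processing described above, the low- and high-energy attenuation values can be combined into a dual-energy ratio. The function names and the cutoff below are hypothetical, for illustration only, and are not the study's actual classifier:

```python
def dual_energy_ratio(hu_low: float, hu_high: float) -> float:
    """Ratio of mean stone attenuation in the low-energy (DECT-Low)
    vs. high-energy (DECT-High) dataset."""
    return hu_low / hu_high


def classify_stone(hu_low: float, hu_high: float, cutoff: float = 1.1) -> str:
    """Toy two-way classifier: uric acid stones attenuate similarly at
    both energies (ratio near 1), while calcium-containing stones
    attenuate far more strongly at low energy.  The 1.1 cutoff is a
    hypothetical value, not taken from the study."""
    if dual_energy_ratio(hu_low, hu_high) < cutoff:
        return "uric acid"
    return "calcium-containing"
```

In practice, multi-class discrimination among brushite, calcium oxalate-phosphate, struvite, cystine, and uric acid would require per-composition calibration rather than a single cutoff.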
Abstract:
BACKGROUND: The nutrient-sensing Tor pathway governs cell growth and is conserved in nearly all eukaryotic organisms from unicellular yeasts to multicellular organisms, including humans. Tor is the target of the immunosuppressive drug rapamycin, which in complex with the prolyl isomerase FKBP12 inhibits Tor functions. Rapamycin is a gold standard drug for organ transplant recipients that was approved by the FDA in 1999 and is finding additional clinical indications as a chemotherapeutic and antiproliferative agent. Capitalizing on the plethora of recently sequenced genomes, we have conducted comparative genomic studies to annotate the Tor pathway throughout the fungal kingdom and related unicellular opisthokonts, including Monosiga brevicollis, Salpingoeca rosetta, and Capsaspora owczarzaki. RESULTS: Interestingly, the Tor signaling cascade is absent in three microsporidian species with available genome sequences, the only known instance of a eukaryotic group lacking this conserved pathway. The microsporidia are obligate intracellular pathogens with highly reduced genomes, and we hypothesize that they lost the Tor pathway as they adapted and streamlined their genomes for intracellular growth in a nutrient-rich environment. Two TOR paralogs are present in several fungal species as a result of either a whole-genome duplication or independent gene/segmental duplication events. One such event was identified in the amphibian pathogen Batrachochytrium dendrobatidis, a chytrid responsible for global amphibian declines and extinctions. CONCLUSIONS: The repeated independent duplications of the TOR gene in the fungal kingdom might reflect selective pressure acting upon this kinase, which populates two proteinaceous complexes with different cellular roles. These comparative genomic analyses illustrate the evolutionary trajectory of a central nutrient-sensing cascade that enables diverse eukaryotic organisms to respond to their natural environments.
Abstract:
Illicit trade carries the potential to magnify existing tobacco-related health care costs through increased availability of untaxed and inexpensive cigarettes. What is known with respect to the magnitude of illicit trade for Vietnam is produced primarily by the industry, and methodologies are typically opaque. Independent assessment of the illicit cigarette trade in Vietnam is vital to tobacco control policy. This paper measures the magnitude of illicit cigarette trade for Vietnam between 1998 and 2010 using two methods: discrepancies between legitimate domestic cigarette sales and domestic tobacco consumption estimated from surveys, and trade discrepancies as recorded by Vietnam and its trade partners. The results indicate that Vietnam likely experienced net smuggling during the period studied. With the inclusion of adjustments for survey respondent under-reporting, inward illicit trade likely occurred in three of the four years for which surveys were available. Discrepancies in trade records indicate that the value of smuggled cigarettes into Vietnam ranged from $100 million to $300 million between 2000 and 2010 and that these cigarettes primarily originated in Singapore, Hong Kong, Macao, Malaysia, and Australia. Notable differences in trends over time exist between the two methods, but by comparison, the industry estimates consistently place the magnitude of illicit trade at the upper bounds of what this study shows. Several limitations apply. First, the unavailability of annual, survey-based estimates of consumption may obscure the true annual trend over time. Second, as surveys changed over time, estimates relying on them may be inconsistent with one another. Finally, these two methods measure different components of illicit trade, specifically consumption of illicit cigarettes regardless of origin and smuggling of cigarettes into a particular market.
However, absent a gold standard, comparisons of different approaches to illicit trade measurement serve efforts to refine and improve measurement approaches and estimates.
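The first of the two measurement approaches described above, the gap between survey-estimated consumption and legal sales, can be sketched as follows. The function name and the under-reporting adjustment factor are illustrative, not the paper's exact specification:

```python
def illicit_share(estimated_consumption: float, taxed_sales: float,
                  underreport_factor: float = 1.0):
    """Gap method: survey-estimated consumption (optionally scaled up
    to adjust for respondent under-reporting) minus legal, taxed sales.
    A positive gap suggests net inward smuggling; a negative gap
    suggests net outward flows.  Units: sticks (or packs) per year.
    Returns the absolute gap and its share of adjusted consumption."""
    adjusted = estimated_consumption * underreport_factor
    gap = adjusted - taxed_sales
    return gap, gap / adjusted
```

The second approach, trade-record discrepancies, instead compares exports to Vietnam reported by partner countries against imports recorded by Vietnam.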
Abstract:
Background. Thoracic epidural catheters provide the best quality postoperative pain relief for major abdominal and thoracic surgical procedures, but placement is one of the most challenging procedures in the repertoire of an anesthesiologist. Most patients presenting for a procedure that would benefit from a thoracic epidural catheter have already had high-resolution imaging that may be useful to assist placement of a catheter. Methods. This retrospective study used data from 168 patients to examine the association and predictive power of the epidural-skin distance (ESD) on computed tomography (CT) in determining the loss-of-resistance depth acquired during epidural placement. Additionally, the ability of anesthesiologists to measure this distance was compared to that of a radiologist who specializes in spine imaging. Results. There was a strong association between CT measurement and loss-of-resistance depth (P < 0.0001); the presence of morbid obesity (BMI > 35) changed this relationship (P = 0.007). The ability of anesthesiologists to make CT measurements was similar to that of a gold-standard radiologist (all individual ICCs > 0.9). Conclusions. Overall, this study supports the examination of a recent CT scan to aid in the placement of a thoracic epidural catheter. Making use of these scans may lead to faster epidural placements, fewer accidental dural punctures, and better epidural blockade.
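The association tested above is essentially a linear-regression question: predict loss-of-resistance depth from the CT-measured epidural-skin distance. A minimal sketch, with variable names assumed rather than taken from the study:

```python
import numpy as np

def fit_lor_from_esd(esd_cm, lor_cm):
    """Least-squares line predicting loss-of-resistance (LOR) depth
    from the CT epidural-skin distance (ESD), plus the Pearson
    correlation between the two measurements."""
    slope, intercept = np.polyfit(esd_cm, lor_cm, 1)
    r = np.corrcoef(esd_cm, lor_cm)[0, 1]
    return slope, intercept, r
```

The study's finding that morbid obesity changes the relationship would correspond to fitting separate lines (or adding a BMI interaction term) for the two subgroups.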
Abstract:
CD8+ T cells are associated with long-term control of virus replication to low or undetectable levels in a population of HIV+ therapy-naïve individuals known as virus controllers (VCs; <5000 RNA copies/ml and CD4+ lymphocyte counts >400 cells/µl). These subjects' ability to control viremia in the absence of therapy makes them the gold standard for the type of CD8+ T-cell response that should be induced with a vaccine. Studying the regulation of CD8+ T cell responses in these VCs provides the opportunity to discover mechanisms of durable control of HIV-1. Previous research has shown that the CD8+ T cell population in VCs is heterogeneous in its ability to inhibit virus replication and that distinct T cells are responsible for virus inhibition. Further defining both the functional properties and regulation of the specific features of the select CD8+ T cells responsible for potent control of viremia in VCs would enable better evaluation of T cell-directed vaccine strategies and may inform the design of new therapies.
Here we discuss the progress made in elucidating the features and regulation of the CD8+ T cell response in virus controllers. We first detail the development of assays to quantify CD8+ T cells' ability to inhibit virus replication. This includes the use of a multi-clade HIV-1 panel, which can subsequently be used as a tool for evaluation of T cell-directed vaccines. We used these assays to evaluate the CD8+ response among cohorts of HIV-1 seronegative, HIV-1 acutely infected, and HIV-1 chronically infected (both VC and chronic viremic) patients. Contact and soluble CD8+ T cell virus inhibition assays (VIAs) are able to distinguish these patient groups based on the presence and magnitude of the responses. When employed in conjunction with peptide stimulation, the soluble assay reveals that peptide stimulation induces CD8+ T cell responses with a prevalence of Gag p24 and Nef specificity among the virus controllers tested. Given this prevalence, we aimed to determine the gene expression profile of Gag p24-, Nef-, and unstimulated CD8+ T cells. RNA was isolated from CD8+ T-cells from two virus controllers with strong virus inhibition and one seronegative donor after a 5.5-hour stimulation period and then analyzed using the Illumina Human BeadChip platform (Duke Center for Human Genome Variation). Analysis revealed that 565 (242 Nef and 323 Gag) genes were differentially expressed in CD8+ T-cells that were able to inhibit virus replication compared to those that could not. We compared the differentially expressed genes to published data sets from other CD8+ T-cell effector function experiments, focusing our analysis on the most recurring genes with immunological, gene regulatory, apoptotic, or unknown functions. The most commonly identified gene in these studies was TNFRSF9. Using PCR in a larger cohort of virus controllers, we confirmed the up-regulation of TNFRSF9 in Gag p24- and Nef-specific CD8+ T cell mediated virus inhibition.
We also observed increases in the mRNAs encoding the antiviral cytokines macrophage inflammatory proteins (MIP-1α, MIP-1αP, MIP-1β), interferon gamma (IFN-γ), granulocyte-macrophage colony-stimulating factor (GM-CSF), and the recently identified lymphotactin (XCL1).
Our previous work suggests the CD8+ T-cell response to HIV-1 can be regulated at the level of gene expression. Because RNA abundance is modulated by transcription of new mRNAs and decay of new and existing RNA, we aimed to evaluate the net rate of transcription and mRNA decay for the cytokines we identified as differentially regulated. To estimate the rates of mRNA synthesis and decay, we stimulated isolated CD8+ T-cells with Gag p24 and Nef peptides, adding 4-thiouridine (4SU) during the final hour of stimulation, allowing for separation of RNA made during that hour. Subsequent PCR of RNA isolated from these cells allowed us to determine how much mRNA was made for our genes of interest during the final hour, which we used to calculate the rate of transcription. To assess whether stimulation caused a change in RNA stability, we calculated the decay rates of these mRNAs over time. In Gag p24- and Nef-stimulated T cells, the abundance of the mRNA of many of the cytokines examined was dependent on changes in both transcription and mRNA decay, with evidence for potential differences in the regulation of mRNA between Nef- and Gag-specific CD8+ T cells. The results were highly reproducible: for one subject measured in three independent experiments, the results were concordant.
These data suggest that mRNA stability, in addition to transcription, is key in regulating the direct anti-HIV-1 function of antigen-specific memory CD8+ T cells by enabling rapid recall of anti-HIV-1 effector functions, namely the production and increased stability of antiviral cytokines. We have started to uncover the mechanisms employed by CD8+ T cell subsets with antigen-specific anti-HIV-1 activity, in turn enhancing our ability to inhibit virus replication by informing both cure strategies and HIV-1 vaccine designs that aim to reduce transmission and can aid in blocking HIV-1 acquisition.
Abstract:
BACKGROUND: Administrative or quality improvement registries may or may not contain the elements needed for investigations by trauma researchers. International Classification of Diseases Program for Injury Categorisation (ICDPIC), a statistical program available through Stata, is a powerful tool that can extract injury severity scores from ICD-9-CM codes. We conducted a validation study for use of the ICDPIC in trauma research. METHODS: We conducted a retrospective cohort validation study of 40,418 patients with injury using a large regional trauma registry. ICDPIC-generated AIS scores for each body region were compared with trauma registry AIS scores (gold standard) in adult and paediatric populations. A separate analysis was conducted among patients with traumatic brain injury (TBI) comparing the ICDPIC tool with ICD-9-CM embedded severity codes. Performance in characterising overall injury severity, by the ISS, was also assessed. RESULTS: The ICDPIC tool generated substantial correlations in thoracic and abdominal trauma (weighted κ 0.87-0.92), and in head and neck trauma (weighted κ 0.76-0.83). The ICDPIC tool captured TBI severity better than ICD-9-CM code embedded severity and offered the advantage of generating a severity value for every patient (rather than having missing data). Its ability to produce an accurate severity score was consistent within each body region as well as overall. CONCLUSIONS: The ICDPIC tool performs well in classifying injury severity and is superior to ICD-9-CM embedded severity for TBI. Use of ICDPIC demonstrates substantial efficiency and may be a preferred tool in determining injury severity for large trauma datasets, provided researchers understand its limitations and take caution when examining smaller trauma datasets.
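The weighted κ agreement statistics reported above can be computed as follows; this is a generic linearly (or quadratically) weighted Cohen's kappa, not the study's own code:

```python
import numpy as np

def weighted_kappa(a, b, n_categories, weights="linear"):
    """Weighted Cohen's kappa between two ratings of the same items.
    a, b: integer category labels in [0, n_categories).  Disagreement
    weights grow with distance between categories (linear or squared),
    so near-misses are penalized less than gross disagreements."""
    a, b = np.asarray(a), np.asarray(b)
    observed = np.zeros((n_categories, n_categories))
    for i, j in zip(a, b):
        observed[i, j] += 1
    observed /= observed.sum()
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))
    idx = np.arange(n_categories)
    w = np.abs(idx[:, None] - idx[None, :]).astype(float)
    if weights == "quadratic":
        w = w ** 2
    return 1.0 - (w * observed).sum() / (w * expected).sum()
```

Applied to AIS scores, `a` would hold the ICDPIC-generated severities and `b` the trauma-registry (gold standard) severities for the same patients.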
Abstract:
The growing exposure to chemicals in our environment and the increasing concern over their impact on health have elevated the need for new methods for surveying the detrimental effects of these compounds. Today's gold standard for assessing the effects of toxicants on the brain is based on hematoxylin and eosin (H&E)-stained histology, sometimes accompanied by special stains or immunohistochemistry for neural processes and myelin. This approach is time-consuming and is usually limited to a fraction of the total brain volume. We demonstrate that magnetic resonance histology (MRH) can be used for quantitatively assessing the effects of central nervous system toxicants in rat models. We show that subtle and sparse changes to brain structure can be detected using magnetic resonance histology, and correspond to some of the locations in which lesions are found by traditional pathological examination. We report for the first time diffusion tensor image-based detection of changes in white matter regions, including fimbria and corpus callosum, in the brains of rats exposed to 8 mg/kg and 12 mg/kg trimethyltin. Besides detecting brain-wide changes, magnetic resonance histology provides a quantitative assessment of dose-dependent effects. These effects can be found in different magnetic resonance contrast mechanisms, providing multivariate biomarkers for the same spatial location. In this study, deformation-based morphometry detected areas where previous studies have detected cell loss, while voxel-wise analyses of diffusion tensor parameters revealed microstructural changes due to processes such as cellular swelling, apoptosis, and inflammation. Magnetic resonance histology brings a valuable addition to pathology with the ability to generate brain-wide quantitative parametric maps for markers of toxic insults in the rodent brain.
Abstract:
Background: 12-lead ECG is a critical component of the initial evaluation of cardiac ischemia, but has traditionally been limited to large, dedicated equipment in medical care environments. Smartphones provide a potential alternative platform for the extension of ECG to new care settings and to improve timeliness of care. Objective: To gain experience with smartphone electrocardiography prior to designing a larger multicenter study evaluating standard 12-lead ECG compared to smartphone ECG. Methods: Six patients for whom the hospital STEMI protocol was activated were evaluated with traditional 12-lead ECG followed immediately by a smartphone ECG using right (VnR) and left (VnL) limb leads for precordial grounding. The AliveCor™ Heart Monitor was utilized for this study. All tracings were taken prior to catheterization or immediately after revascularization while still in the catheterization laboratory. Results: The smartphone ECG had excellent correlation with the gold-standard 12-lead ECG in all patients. Four of six tracings were judged to meet STEMI criteria on both modalities as determined by three experienced cardiologists; in the remaining two, consensus indicated a non-STEMI ECG diagnosis. No significant difference was noted between VnR and VnL. Conclusions: Smartphone-based electrocardiography is a promising, developing technology intended to increase the availability and speed of electrocardiographic evaluation. This study confirmed the potential of smartphone ECG for evaluation of acute ischemia and the feasibility of studying this technology further to define its diagnostic accuracy, limitations, and appropriate use.
Abstract:
X-ray mammography has been the gold standard for breast imaging for decades, despite the significant limitations posed by two-dimensional (2D) image acquisition. Difficulty in diagnosing lesions close to the chest wall and axilla, a high amount of structural overlap, and patient discomfort due to compression are only some of these limitations. To overcome these drawbacks, three-dimensional (3D) breast imaging modalities have been developed, including dual-modality single photon emission computed tomography (SPECT) and computed tomography (CT) systems. This thesis focuses on the development and integration of the next generation of such a device for dedicated breast imaging. The goals of this dissertation work are to: [1] understand and characterize any effects of fully 3D trajectories on reconstructed image scatter correction, absorbed dose, and Hounsfield Unit accuracy, and [2] design, develop, and implement the fully flexible, third-generation hybrid SPECT-CT system, with each subsystem capable of traversing complex 3D orbits about a pendant breast volume without interference from the other. Such a system would overcome artifacts resulting from incompletely sampled divergent cone beam imaging schemes and allow imaging closer to the chest wall, which other systems currently under research and development elsewhere cannot achieve.
The dependence of x-ray scatter radiation on object shape, size, material composition, and the CT acquisition trajectory was investigated with a well-established beam stop array (BSA) scatter correction method. While the 2D scatter to primary ratio (SPR) was the main metric used to characterize total system scatter, a new metric called ‘normalized scatter contribution’ was developed to compare the results of scatter correction on 3D reconstructed volumes. Scatter estimation studies were undertaken with a sinusoidal saddle (±15° polar tilt) orbit and a traditional circular (AZOR) orbit. Clinical studies to acquire data for scatter correction were used to evaluate the 2D SPR on a small set of patients scanned with the AZOR orbit. Clinical SPR results showed clear dependence of scatter on breast composition and glandular tissue distribution, otherwise consistent with the overall phantom-based size and density measurements. Additionally, SPR dependence was also observed on the acquisition trajectory, where 2D scatter increased with an increase in the polar tilt angle of the system.
The dose delivered by any imaging system is of primary importance from the patient’s point of view, and therefore trajectory-related differences in the dose distribution in a target volume were evaluated. Monte Carlo simulations as well as physical measurements using radiochromic film were undertaken using saddle and AZOR orbits. Results illustrated that both orbits deliver comparable dose to the target volume, and only slightly differ in distribution within the volume. Simulations and measurements showed similar results, and all measured dose values were within the standard screening mammography-specific 6 mGy dose limit, which is used as a benchmark for dose comparisons.
Hounsfield Units (HU) are used clinically in differentiating tissue types in a reconstructed CT image, and therefore the HU accuracy of a system is very important, especially when using non-traditional trajectories. Uniform phantoms filled with various uniform density fluids were used to investigate differences in HU accuracy between saddle and AZOR orbits. Results illustrate the considerably better performance of the saddle orbit, especially close to the chest and nipple region of what would clinically be a pendant breast volume. The AZOR orbit causes shading artifacts near the nipple, due to insufficient sampling, rendering a major portion of the scanned phantom unusable, whereas the saddle orbit performs exceptionally well and provides a tighter distribution of HU values in reconstructed volumes.
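For reference, the Hounsfield Unit scale discussed here maps a voxel's linear attenuation coefficient onto a water-anchored scale; a minimal sketch of the standard definition:

```python
def hounsfield(mu: float, mu_water: float) -> float:
    """CT number in Hounsfield Units: water maps to 0 HU and air
    (mu ~ 0) to -1000 HU, so tissues are read relative to water."""
    return 1000.0 * (mu - mu_water) / mu_water
```

The "tighter distribution of HU values" reported for the saddle orbit means reconstructed attenuation in a uniform fluid stays closer to the single HU value this formula predicts for that fluid.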
Finally, the third-generation, fully-suspended SPECT-CT system was designed and developed in our lab. A novel mechanical method using a linear motor was developed for tilting the CT system. A new x-ray source and a custom-made 40 × 30 cm² detector were integrated onto this system. The SPECT system was nested in the center of the gantry, orthogonal to the CT source-detector pair. The SPECT system tilts on a goniometer, and the newly developed CT tilting mechanism allows ±15° maximum polar tilting of the CT system. The entire gantry is mounted on a rotation stage, allowing complex arbitrary trajectories for each system, without interference from the other, while having a common field of view. This hybrid system shows potential to be used clinically as a diagnostic tool for dedicated breast imaging.
Abstract:
The goal of modern radiotherapy is to precisely deliver a prescribed radiation dose to delineated target volumes that contain a significant amount of tumor cells while sparing the surrounding healthy tissues/organs. Precise delineation of treatment and avoidance volumes is the key to precision radiation therapy. In recent years, considerable clinical and research efforts have been devoted to integrating MRI into the radiotherapy workflow, motivated by its superior soft tissue contrast and functional imaging possibilities. Dynamic contrast-enhanced MRI (DCE-MRI) is a noninvasive technique that measures properties of tissue microvasculature. Its sensitivity to radiation-induced vascular pharmacokinetic (PK) changes has been preliminarily demonstrated. In spite of its great potential, two major challenges have limited DCE-MRI’s clinical application in radiotherapy assessment: the technical limitations of accurate DCE-MRI imaging implementation and the need for novel DCE-MRI data analysis methods for richer functional heterogeneity information.
This study aims at improving current DCE-MRI techniques and developing new DCE-MRI analysis methods for particular radiotherapy assessment. Thus, the study is naturally divided into two parts. The first part focuses on DCE-MRI temporal resolution as one of the key DCE-MRI technical factors, and some improvements regarding DCE-MRI temporal resolution are proposed; the second part explores the potential value of image heterogeneity analysis and multiple PK model combination for therapeutic response assessment, and several novel DCE-MRI data analysis methods are developed.
I. Improvement of DCE-MRI temporal resolution. First, the feasibility of improving DCE-MRI temporal resolution via image undersampling was studied. Specifically, a novel MR image iterative reconstruction algorithm was studied for DCE-MRI reconstruction. This algorithm was built on the recently developed compressed sensing (CS) theory. By utilizing a limited k-space acquisition with shorter imaging time, images can be reconstructed in an iterative fashion under the regularization of a newly proposed total generalized variation (TGV) penalty term. In the retrospective study of brain radiosurgery patient DCE-MRI scans under IRB approval, the clinically obtained image data was selected as reference data, and the simulated accelerated k-space acquisition was generated via undersampling the reference image full k-space with designed sampling grids. Two undersampling strategies were proposed: 1) a radial multi-ray grid with a special angular distribution was adopted to sample each slice of the full k-space; 2) a Cartesian random sampling grid series with spatiotemporal constraints from adjacent frames was adopted to sample the dynamic k-space series at a slice location. Two sets of PK parameter maps were generated from the undersampled data and from the fully-sampled data, respectively. Multiple quantitative measurements and statistical studies were performed to evaluate the accuracy of PK maps generated from the undersampled data in reference to the PK maps generated from the fully-sampled data. Results showed that at a simulated acceleration factor of four, PK maps could be faithfully calculated from the DCE images that were reconstructed using undersampled data, and no statistically significant differences were found between the regional PK mean values from the undersampled and fully-sampled data sets. DCE-MRI acceleration using the investigated image reconstruction method thus appears feasible and promising.
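A toy version of the first (radial multi-ray) undersampling strategy might look like the following; the geometry here is a simplified stand-in for the specially distributed angular grid used in the study:

```python
import numpy as np

def radial_mask(n: int, n_rays: int) -> np.ndarray:
    """Boolean n-by-n mask selecting the Cartesian k-space samples
    nearest to n_rays equally spaced radial spokes through the
    k-space center.  Retained samples are True; the acceleration
    factor is mask.size / mask.sum()."""
    mask = np.zeros((n, n), dtype=bool)
    c = (n - 1) / 2.0
    radii = np.linspace(-c, c, 2 * n)  # dense sampling along each spoke
    for angle in np.linspace(0.0, np.pi, n_rays, endpoint=False):
        x = np.clip(np.round(c + radii * np.cos(angle)).astype(int), 0, n - 1)
        y = np.clip(np.round(c + radii * np.sin(angle)).astype(int), 0, n - 1)
        mask[y, x] = True
    return mask
```

Applying such a mask to a fully-sampled k-space simulates the accelerated acquisition; the TGV-regularized iterative reconstruction then recovers the image from the retained samples.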
Second, for high temporal resolution DCE-MRI, a new PK model fitting method was developed to solve for PK parameters with better calculation accuracy and efficiency. This method is based on a derivative-based deformation of the commonly used Tofts PK model, which is presented as an integrative expression. This method also includes an advanced Kolmogorov-Zurbenko (KZ) filter to remove the potential noise effect in data and solve the PK parameters as a linear problem in matrix format. In the computer simulation study, PK parameters representing typical intracranial values were selected as references to simulate DCE-MRI data for different temporal resolutions and different data noise levels. Results showed that at both high temporal resolutions (<1 s) and a clinically feasible temporal resolution (~5 s), this new method was able to calculate PK parameters more accurately than current calculation methods at clinically relevant noise levels; at high temporal resolutions, the calculation efficiency of this new method was superior to current methods by roughly two orders of magnitude (10²). In a retrospective study of clinical brain DCE-MRI scans, the PK maps derived from the proposed method were comparable with the results from current methods. Based on these results, it can be concluded that this new method can be used for accurate and efficient PK model fitting for high temporal resolution DCE-MRI.
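The linearization idea described above resembles the well-known linear least-squares formulation of the Tofts model, Ct(t) = Ktrans·∫Cp dτ − kep·∫Ct dτ. The sketch below shows that generic formulation (without the study's Kolmogorov-Zurbenko filtering step), with variable names assumed:

```python
import numpy as np

def fit_tofts_linear(t, cp, ct):
    """Linear least-squares fit of the Tofts model in integral form,
    Ct(t) = Ktrans * int(Cp) - kep * int(Ct), which turns the PK fit
    into a matrix problem and avoids nonlinear iteration.
    t: time points; cp: arterial input function; ct: tissue curve."""
    def cumtrapz(y):
        dt = np.diff(t)
        return np.concatenate(([0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * dt)))
    design = np.column_stack([cumtrapz(cp), -cumtrapz(ct)])
    ktrans, kep = np.linalg.lstsq(design, ct, rcond=None)[0]
    return ktrans, kep
```

Because the design matrix uses cumulative integrals of noisy curves, denoising (such as the KZ filter mentioned above) matters in practice at high temporal resolution.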
II. Development of DCE-MRI analysis methods for therapeutic response assessment. This part aims at methodology developments in two approaches. The first one is to develop a model-free analysis method for DCE-MRI functional heterogeneity evaluation. This approach is inspired by the rationale that radiotherapy-induced functional change could be heterogeneous across the treatment area. The first effort was spent on a translational investigation of classic fractal dimension theory for DCE-MRI therapeutic response assessment. In a small-animal anti-angiogenesis drug therapy experiment, the randomly assigned treatment/control groups received multiple fraction treatments with one pre-treatment and multiple post-treatment high-spatiotemporal-resolution DCE-MRI scans. In the post-treatment scan two weeks after the start, the investigated Rényi dimensions of the classic PK rate constant map demonstrated significant differences between the treatment and the control groups; when Rényi dimensions were adopted for treatment/control group classification, the achieved accuracy was higher than the accuracy from using conventional PK parameter statistics. Following this pilot work, two novel texture analysis methods were proposed. First, a new technique called the Gray Level Local Power Matrix (GLLPM) was developed. It addresses the lack of temporal information and poor calculation efficiency of the commonly used Gray Level Co-Occurrence Matrix (GLCOM) techniques. In the same small animal experiment, the dynamic curves of Haralick texture features derived from the GLLPM had an overall better performance than the corresponding curves derived from current GLCOM techniques in treatment/control separation and classification. The second developed method is dynamic Fractal Signature Dissimilarity (FSD) analysis. Inspired by classic fractal dimension theory, this method measures the dynamics of tumor heterogeneity during contrast agent uptake in a quantitative fashion on DCE images.
In the small animal experiment mentioned before, the selected parameters from dynamic FSD analysis showed significant differences between treatment/control groups as early as after one treatment fraction; in contrast, metrics from conventional PK analysis showed significant differences only after three treatment fractions. When using dynamic FSD parameters, the treatment/control group classification after the first treatment fraction was improved compared with using conventional PK statistics. These results suggest the promising application of this novel method for capturing early therapeutic response.
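The classic box-counting estimate underlying fractal-dimension analyses like those above can be sketched as follows (this is the generic textbook algorithm, not the study's Rényi or FSD implementation):

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Box-counting fractal dimension of a 2-D binary map: count the
    occupied s-by-s boxes N(s) at each box size s, then fit
    log N(s) = -D log s + c and return D."""
    counts = []
    n = mask.shape[0]
    for s in sizes:
        trimmed = mask[: n - n % s, : n - n % s]
        blocks = trimmed.reshape(trimmed.shape[0] // s, s, -1, s).any(axis=(1, 3))
        counts.append(blocks.sum())
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope
```

A filled region yields a dimension near 2 and a thin curve near 1; heterogeneity measures such as the Rényi dimensions generalize this counting by weighting boxes by their occupancy.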
The second approach to developing novel DCE-MRI methods is to combine PK information from multiple PK models. Currently, the classic Tofts model or its alternative version has been widely adopted for DCE-MRI analysis as a gold-standard approach for therapeutic response assessment. Previously, a shutter-speed (SS) model was proposed to incorporate the transcytolemmal water exchange effect into contrast agent concentration quantification. In spite of its richer biological assumptions, its application in therapeutic response assessment is limited. It might be intriguing to combine the information from the SS model and from the classic Tofts model to explore potential new biological information for treatment assessment. The feasibility of this idea was investigated in the same small animal experiment. The SS model was compared against the Tofts model for therapeutic response assessment using PK parameter regional mean value comparison. Based on the modeled transcytolemmal water exchange rate, a biological subvolume was proposed and was automatically identified using histogram analysis. Within the biological subvolume, the PK rate constant derived from the SS model was shown to be superior to the one from the Tofts model in treatment/control separation and classification. Furthermore, novel biomarkers were designed to integrate the PK rate constants from these two models. When evaluated in the biological subvolume, these biomarkers were able to reflect significant treatment/control differences in both post-treatment evaluations. These results confirm the potential value of the SS model, as well as its combination with the Tofts model, for therapeutic response assessment.
In summary, this study addressed two problems of DCE-MRI application in radiotherapy assessment. In the first part, a method of accelerating DCE-MRI acquisition for better temporal resolution was investigated, and a novel PK model fitting algorithm was proposed for high temporal resolution DCE-MRI. In the second part, two model-free texture analysis methods and a multiple-model analysis method were developed for DCE-MRI therapeutic response assessment. The presented works could benefit the future DCE-MRI routine clinical application in radiotherapy assessment.
Abstract:
Purpose
The objective of our study was to test a new approach to approximating organ dose using the effective energy of the combined 80 kV/140 kV beam used in fast-kV-switch dual-energy (DE) computed tomography (CT). The study had two primary aims: first, to experimentally validate the dose equivalency between MOSFET detectors and an ion chamber (as a gold standard) in a fast-kV-switch DE environment, and second, to estimate the effective dose (ED) of DECT scans using MOSFET detectors and an anthropomorphic phantom.
Materials and Methods
A GE Discovery 750 CT scanner was employed using a fast-kV switch abdomen/pelvis protocol alternating between 80 kV and 140 kV. The specific aims of our study were to (1) Characterize the effective energy of the dual energy environment; (2) Estimate the f-factor for soft tissue; (3) Calibrate the MOSFET detectors using a beam with effective energy equal to the combined DE environment; (4) Validate our calibration by using MOSFET detectors and ion chamber to measure dose at the center of a CTDI body phantom; (5) Measure ED for an abdomen/pelvis scan using an anthropomorphic phantom and applying ICRP 103 tissue weighting factors; and (6) Estimate ED using AAPM Dose Length Product (DLP) method. The effective energy of the combined beam was calculated by measuring dose with an ion chamber under varying thicknesses of aluminum to determine half-value layer (HVL).
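The HVL step described above can be sketched as follows. The measurement values and the log-linear interpolation between aluminum thicknesses are illustrative assumptions; once the HVL is known, the effective energy is found by matching the implied attenuation coefficient (ln 2 / HVL) against tabulated values for aluminum, which is not reproduced here.

```python
import numpy as np

def half_value_layer(thicknesses_mm, doses):
    """Estimate the half-value layer (HVL) from ion-chamber dose readings
    taken under increasing aluminum filtration.

    Assumes approximately exponential attenuation between measurement
    points, so dose is log-linearly interpolated versus thickness.
    """
    doses = np.asarray(doses, dtype=float)
    thicknesses = np.asarray(thicknesses_mm, dtype=float)
    log_rel = np.log(doses / doses[0])  # ln of dose relative to no added Al

    # Find the thickness at which relative dose falls to 0.5.
    # np.interp needs increasing x, so reverse the (decreasing) log values.
    return float(np.interp(np.log(0.5), log_rel[::-1], thicknesses[::-1]))
```

The effective energy is then the monoenergetic photon energy whose aluminum attenuation coefficient equals ln 2 divided by this HVL.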
Results
The effective energy of the combined dual-energy beam was found to be 42.8 keV. After calibration, tissue dose at the center of the CTDI body phantom was measured at 1.71 ± 0.01 cGy with the ion chamber, and at 1.73 ± 0.04 cGy and 1.69 ± 0.09 cGy with two separate MOSFET detectors, a difference of -0.93% and 1.40%, respectively, between the ion chamber and MOSFET measurements. ED from the dual-energy scan was calculated as 16.49 ± 0.04 mSv by the MOSFET method and 14.62 mSv by the DLP method.
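The DLP method referenced above is a simple linear conversion, ED ≈ k × DLP, where k is a body-region-specific coefficient (0.015 mSv/(mGy·cm) for an adult abdomen/pelvis). A minimal sketch, with the DLP value back-calculated from the reported 14.62 mSv purely for illustration:

```python
def effective_dose_dlp(dlp_mgy_cm, k_factor=0.015):
    """Estimate effective dose (mSv) from dose-length product (mGy·cm)
    using the standard adult abdomen/pelvis conversion coefficient
    k = 0.015 mSv/(mGy·cm)."""
    return dlp_mgy_cm * k_factor

# A scanner-reported DLP of roughly 975 mGy·cm (illustrative, not from
# the study) would reproduce the 14.62 mSv figure: 975 * 0.015 ≈ 14.6 mSv.
```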
Abstract:
Background: Depression-screening tools exist and are widely used in Western settings, but few studies have explored whether these existing tools are valid and effective in sub-Saharan Africa. Our study aimed to develop and validate a perinatal depression-screening tool in rural Kenya.
Methods: We conducted free listing and card sorting exercises with a purposive sample of 12 women and 38 community health volunteers (CHVs) living in a rural community to explore the manifestations of perinatal depression in that setting. We used the information obtained to produce a locally relevant depression-screening tool comprising existing Western psychiatric concepts and locally derived items. Subsequently, we administered the novel depression-screening tool and two existing screening tools (the Edinburgh Postnatal Depression Scale and the Patient Health Questionnaire-9) to 193 women and compared the results of each screening tool with those of a gold-standard structured clinical interview to determine validity.
Results: The free listing and card sorting exercises produced a set of 60 screening items, from which we identified the 10 items that most accurately classified cases and non-cases. This 10-item scale had a sensitivity of 100.0 and a specificity of 81.2, compared with 90.0 and 31.5 for the EPDS and 90.0 and 49.7 for the PHQ-9. Overall, we found a depression prevalence of 5.2 percent.
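Sensitivity and specificity against the gold-standard clinical interview are computed as below. The function, cutoff, and data are illustrative; the study's item-selection procedure itself (choosing the 10 best of 60 items) is not shown.

```python
def sensitivity_specificity(scores, gold, cutoff):
    """Sensitivity and specificity (in percent) of a screening scale,
    scored against gold-standard diagnoses (True = case).

    Respondents scoring at or above `cutoff` screen positive; the cutoff
    value is a hypothetical choice for illustration.
    """
    tp = sum(1 for s, g in zip(scores, gold) if s >= cutoff and g)
    fn = sum(1 for s, g in zip(scores, gold) if s < cutoff and g)
    tn = sum(1 for s, g in zip(scores, gold) if s < cutoff and not g)
    fp = sum(1 for s, g in zip(scores, gold) if s >= cutoff and not g)
    sensitivity = 100.0 * tp / (tp + fn)  # cases correctly screened positive
    specificity = 100.0 * tn / (tn + fp)  # non-cases correctly screened negative
    return sensitivity, specificity
```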
Conclusions: The new scale performs very well in terms of diagnostic validity, achieving the highest scores in this domain compared to the EPDS, EPDS-R, and PHQ-9. The adapted scale also performs very well with regard to convergent validity, showing a clear distinction between mean scores across the different categories. It does well with regard to discriminant validity, internal consistency reliability, and test-retest reliability: it does not secure top scores in those domains, but still yields satisfactory results.
Abstract:
Surveys can collect important data that inform policy decisions and drive social science research. Large government surveys collect information from the U.S. population on a wide range of topics, including demographics, education, employment, and lifestyle. Analysis of survey data presents unique challenges. In particular, one needs to account for missing data, for complex sampling designs, and for measurement error. Conceptually, a survey organization could invest substantial resources in obtaining high-quality responses from a simple random sample, resulting in survey data that are easy to analyze. In practice, however, this scenario is rarely realistic. To address these practical issues, survey organizations can leverage information available from other sources of data. For example, in longitudinal studies that suffer from attrition, they can use information from refreshment samples to correct for potential attrition bias. They can use information from known marginal distributions or from the survey design to improve inferences. They can use information from gold-standard sources to correct for measurement error.
This thesis presents novel approaches to combining information from multiple sources that address the three problems described above.
The first method addresses nonignorable unit nonresponse and attrition in a panel survey with a refreshment sample. Panel surveys typically suffer from attrition, which can lead to biased inference when the analysis is based only on cases that complete all waves of the panel. Unfortunately, the panel data alone cannot inform the extent of the bias due to attrition, so analysts must make strong and untestable assumptions about the missing data mechanism. Many panel studies also include refreshment samples: data collected from a random sample of new individuals during some later wave of the panel. Refreshment samples offer information that can be used to correct for biases induced by nonignorable attrition while reducing reliance on strong assumptions about the attrition process. To date, these bias correction methods have not dealt with two key practical issues in panel studies: unit nonresponse in the initial wave of the panel and in the refreshment sample itself. As we illustrate, nonignorable unit nonresponse can significantly compromise the analyst's ability to use the refreshment samples for attrition bias correction. Thus, it is crucial for analysts to assess how sensitive their attrition-corrected inferences are to different assumptions about the nature of the unit nonresponse. We present an approach that facilitates such sensitivity analyses, both for suspected nonignorable unit nonresponse in the initial wave and in the refreshment sample. We illustrate the approach using simulation studies and an analysis of data from the 2007-2008 Associated Press/Yahoo News election panel study.
The second method incorporates informative prior beliefs about marginal probabilities into Bayesian latent class models for categorical data. The basic idea is to append synthetic observations to the original data such that (i) the empirical distributions of the desired margins match those of the prior beliefs, and (ii) the values of the remaining variables are left missing. The degree of prior uncertainty is controlled by the number of augmented records. Posterior inferences can be obtained via typical MCMC algorithms for latent class models, tailored to deal efficiently with the missing values in the concatenated data.
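The data-augmentation step itself is mechanically simple, as the sketch below shows. The function name, column names, and pandas representation are illustrative assumptions; the subsequent latent class MCMC, which treats the synthetic rows' missing values as parameters, is not shown.

```python
import numpy as np
import pandas as pd

def augment_with_prior_margins(df, margin_col, prior_probs, n_synthetic):
    """Append synthetic records encoding prior beliefs about the marginal
    distribution of `margin_col`.

    Each synthetic row carries one value of the margin variable, drawn in
    proportion to `prior_probs`; all other variables are left missing
    (NaN) for the latent class model's MCMC to handle. `n_synthetic`
    controls the strength of the prior: more rows, less prior uncertainty.
    """
    rows = []
    for level, p in prior_probs.items():
        for _ in range(int(round(p * n_synthetic))):
            row = {col: np.nan for col in df.columns}  # remaining vars missing
            row[margin_col] = level
            rows.append(row)
    synthetic = pd.DataFrame(rows, columns=df.columns)
    return pd.concat([df, synthetic], ignore_index=True)
```

For example, a 50/50 prior belief about a binary margin with `n_synthetic=100` appends 50 records of each level, with every other column set to NaN.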
We illustrate the approach using a variety of simulations based on data from the American Community Survey, including an example of how augmented records can be used to fit latent class models to data from stratified samples.
The third method leverages the information from a gold standard survey to model reporting error. Survey data are subject to reporting error when respondents misunderstand the question or accidentally select the wrong response. Sometimes survey respondents knowingly select the wrong response, for example, by reporting a higher level of education than they actually have attained. We present an approach that allows an analyst to model reporting error by incorporating information from a gold standard survey. The analyst can specify various reporting error models and assess how sensitive their conclusions are to different assumptions about the reporting error process. We illustrate the approach using simulations based on data from the 1993 National Survey of College Graduates. We use the method to impute error-corrected educational attainments in the 2010 American Community Survey using the 2010 National Survey of College Graduates as the gold standard survey.
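One building block of such a reporting-error model can be sketched as an empirical misclassification matrix estimated from the gold-standard survey, where both true and reported values are available. The function and data below are illustrative assumptions, not the thesis's actual model, which embeds reporting-error specifications in a fuller imputation framework.

```python
import numpy as np

def reporting_error_matrix(true_vals, reported_vals, levels):
    """Estimate a misclassification matrix P(reported = r | true = t)
    from a gold-standard survey in which both the true and the reported
    category are observed for each respondent.

    Row i of the result gives the distribution of reported categories
    among respondents whose true category is levels[i].
    """
    idx = {level: i for i, level in enumerate(levels)}
    counts = np.zeros((len(levels), len(levels)))
    for t, r in zip(true_vals, reported_vals):
        counts[idx[t], idx[r]] += 1
    # Normalize each row so it sums to one
    return counts / counts.sum(axis=1, keepdims=True)
```

A sensitivity analysis would then perturb this matrix (for example, increasing the probability of over-reporting education) and re-run the error-corrected imputation under each variant.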
Abstract:
Knowing one's HIV status is particularly important in the setting of recent tuberculosis (TB) exposure. Blood tests for assessment of tuberculosis infection, such as the QuantiFERON-TB Gold In-Tube test (QFT; Cellestis Limited, Carnegie, Victoria, Australia), offer the possibility of simultaneous screening for TB and HIV with a single blood draw. We performed a cross-sectional analysis of all contacts of a highly infectious TB case in a large meatpacking factory. Twenty-two percent were foreign-born and 73% were black. Contacts were tested with both tuberculin skin testing (TST) and QFT. HIV testing was offered on an opt-out basis. Persons with TST ≥10 mm, a positive QFT, and/or a positive HIV test were offered latent TB treatment. Three hundred twenty-six contacts were screened: TST results were available for 266 people, and an additional 24 reported a prior positive TST, for a total of 290 persons with any TST result (89.0%). Adequate QFT specimens were obtained for 312 (95.7%) of persons. Thirty-two persons had QFT results but did not return for TST reading. Twenty-two percent met the criteria for latent TB infection. Eighty-eight percent accepted HIV testing. Two (0.7%) were HIV seropositive; both individuals were already aware of their HIV status, but one had stopped care a year previously. None of the HIV-seropositive persons had latent TB, but all were offered latent TB treatment per standard guidelines. This demonstrates that opt-out HIV testing combined with QFT in a large TB contact investigation was feasible and useful. HIV testing was also widely accepted. Pairing QFT with opt-out HIV testing should be strongly considered when possible.