7 results for Advent.
at Duke University
Abstract:
Since at least the early 1990s, stage and risk migration have been observed in patients with prostate cancer, likely corresponding to the institution of prostate-specific antigen (PSA) screening in health systems. Preoperative risk factors, including PSA level and clinical stage, have decreased significantly. These improved prognostic variables have led to a larger proportion of men being stratified with low-risk disease, per the classification of D'Amico and associates. This, in turn, has corresponded with more favorable postoperative outcomes, including decreased extraprostatic tumor extension and improved biochemical recurrence-free survival. The advent of focal therapy is bolstered by findings of increased unilateral disease with decreased tumor volume. Increasingly, targeted or delayed therapies may be possible in the current era of lower-risk disease.
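The D'Amico classification referenced above stratifies patients by PSA level, biopsy Gleason score, and clinical T stage. A minimal sketch, assuming the commonly cited thresholds (PSA in ng/mL); the function name and stage encoding are illustrative, not from the abstract:

```python
def damico_risk(psa, gleason, clinical_stage):
    """Classify prostate cancer risk per the commonly cited
    D'Amico thresholds (illustrative sketch)."""
    # High risk: PSA > 20, Gleason 8-10, or clinical stage >= T2c
    if psa > 20 or gleason >= 8 or clinical_stage in ("T2c", "T3", "T4"):
        return "high"
    # Intermediate risk: PSA 10-20, Gleason 7, or stage T2b
    if psa > 10 or gleason == 7 or clinical_stage == "T2b":
        return "intermediate"
    # Low risk: PSA <= 10, Gleason <= 6, and stage T1c-T2a
    return "low"

print(damico_risk(4.5, 6, "T1c"))   # low
print(damico_risk(12.0, 6, "T1c"))  # intermediate
print(damico_risk(25.0, 9, "T3"))   # high
```

A patient is assigned the highest tier triggered by any single factor, which is why the checks cascade from high to low.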
Abstract:
The advent of digital microfluidic lab-on-a-chip (LoC) technology offers a platform for developing diagnostic applications with the advantages of portability, reduction of the volumes of the sample and reagents, faster analysis times, increased automation, low power consumption, compatibility with mass manufacturing, and high throughput. Moreover, digital microfluidics is being applied in other areas such as airborne chemical detection, DNA sequencing by synthesis, and tissue engineering. In most diagnostic and chemical-detection applications, a key challenge is the preparation of the analyte for presentation to the on-chip detection system. Thus, in diagnostics, raw physiological samples must be introduced onto the chip and then further processed by lysing blood cells and extracting DNA. For massively parallel DNA sequencing, sample preparation can be performed off chip, but the synthesis steps must be performed in a sequential on-chip format by automated control of buffers and nucleotides to extend the read lengths of DNA fragments. In airborne particulate-sampling applications, the sample collection from an air stream must be integrated into the LoC analytical component, which requires a collection droplet to scan an exposed impacted surface after its introduction into a closed analytical section. Finally, in tissue-engineering applications, the challenge for LoC technology is to build high-resolution (less than 10 microns) 3D tissue constructs with embedded cells and growth factors by manipulating and maintaining live cells in the chip platform. This article discusses these applications and their implementation in digital-microfluidic LoC platforms. © 2007 IEEE.
Abstract:
High-throughput analysis of animal behavior requires software to analyze videos. Such software typically depends on the experiments' being performed in good lighting conditions, but this ideal is difficult or impossible to achieve for certain classes of experiments. Here, we describe techniques that allow long-duration positional tracking in difficult lighting conditions with strong shadows or recurring "on"/"off" changes in lighting. The latter condition will likely become increasingly common, e.g., for Drosophila, due to the advent of red-shifted channelrhodopsins. The techniques enabled tracking with good accuracy in three types of experiments with difficult lighting conditions in our lab. Our technique for handling shadows relies on single-animal tracking and on shadows' and flies' being accurately distinguishable by distance to the center of the arena (or a similar geometric rule); the other techniques should be broadly applicable. We implemented the techniques as extensions of the widely-used tracking software Ctrax; however, they are relatively simple, not specific to Drosophila, and could be added to other trackers as well.
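The geometric rule mentioned above, distinguishing flies from shadows by distance to the arena center, can be sketched as follows; the function name, coordinates, and radius threshold are hypothetical illustrations, not the Ctrax extension itself:

```python
import math

def classify_detection(x, y, arena_center, fly_radius):
    """Label a detected blob as 'fly' or 'shadow' using a geometric
    rule: in this hypothetical arena, flies stay within a known
    radius of the center, while shadows fall outside it."""
    cx, cy = arena_center
    dist = math.hypot(x - cx, y - cy)
    return "fly" if dist <= fly_radius else "shadow"

# Example: arena centered at (100, 100); shadows appear beyond radius 80
print(classify_detection(110, 95, (100, 100), 80))   # fly
print(classify_detection(190, 100, (100, 100), 80))  # shadow
```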
Abstract:
Foundational cellular immunology research of the 1960s and 1970s, together with the advent of monoclonal antibodies and flow cytometry, provided the knowledge base and the technological capability that enabled the elucidation of the role of CD4 T cells in HIV infection. Research identifying the sources and magnitude of variation in CD4 measurements, standardized reagents and protocols, and the development of clinical flow cytometers all contributed to the feasibility of widespread CD4 testing. Cohort studies and clinical trials provided the context for establishing the utility of CD4 for prognosis in HIV-infected persons, initial assessment of in vivo antiretroviral drug activity, and as a surrogate marker for clinical outcome in antiretroviral therapeutic trials. Even with sensitive HIV viral load measurement, CD4 cell counting is still utilized in determining antiretroviral therapy eligibility and time to initiate therapy. New point-of-care technologies are helping both to lower the cost of CD4 testing and to enable its use in HIV test-and-treat programs around the world.
Abstract:
'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption.
This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications.
Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the coded aperture compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level.
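A toy forward model of the temporal coding described above, in which frames are modulated by a translating binary mask and summed into one coded snapshot, might look like the following; the array sizes, shift rule, and variable names are illustrative assumptions, not the actual CACTI implementation:

```python
import numpy as np

# Toy forward model of coded-aperture temporal compression:
# T video frames are modulated by a shifting binary mask and
# summed into a single coded snapshot.
rng = np.random.default_rng(0)
T, H, W = 8, 32, 32
video = rng.random((T, H, W))                     # (x, y, t) video volume
mask = (rng.random((H, W)) > 0.5).astype(float)   # binary coded aperture

snapshot = np.zeros((H, W))
for t in range(T):
    # physical translation of the mask: shift by t pixels per frame
    shifted = np.roll(mask, t, axis=0)
    snapshot += shifted * video[t]

print(snapshot.shape)  # (32, 32): T frames embedded in one image
```

Reconstruction (not shown) then inverts this known, frame-dependent coding to recover the T frames from the single measurement.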
Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,λ) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively. The CACTI camera's ability to embed video volumes into images leads to exploration of other information within that video; namely, focal and spectral information. The next part of the thesis demonstrates derivative works of CACTI: compressive extended depth of field and compressive spectral-temporal imaging. These works successfully show the technique's extension of temporal coding to improve sensing performance in these other dimensions.
Geometrical optics-related tradeoffs, such as the classic challenges of wide-field-of-view and high resolution photography, have motivated the development of multiscale camera arrays. The advent of such designs less than a decade ago heralds a new era of research- and engineering-related challenges. One significant challenge is that of managing the focal volume (x,y,z) over wide fields of view and resolutions. The fourth chapter shows advances on focus and image quality assessment for a class of multiscale gigapixel cameras developed at Duke.
Along the same line of work, we have explored methods for dynamic and adaptive addressing of focus via point spread function engineering. We demonstrate another form of temporal coding in the form of physical translation of the image plane from its nominal focal position. We demonstrate this technique's capability to generate arbitrary point spread functions.
Abstract:
The twenty-first century has seen a further increase in the industrialization of Earth's resources, as society aims to meet the needs of a growing population while still protecting our environmental and natural resources. The advent of the industrial bioeconomy, which encompasses the production of renewable biological resources and their conversion into food, feed, and bio-based products, is seen as an important step in the transition towards sustainable development and away from fossil fuels. One sector of the industrial bioeconomy that is rapidly expanding is the use of bio-based feedstocks in electricity production as an alternative to coal, especially in the European Union.
As bioeconomy policies and objectives increasingly appear on political agendas, there is a growing need to quantify the impacts of transitioning from fossil fuel-based feedstocks to renewable biological feedstocks. Specifically, there is a growing need to conduct a systems analysis of the potential risks of expanding the industrial bioeconomy, given that the flows within it are inextricably linked. Furthermore, greater analysis is needed into the consequences of shifting from fossil fuels to renewable feedstocks, in part through the use of life cycle assessment modeling to analyze impacts along the entire value chain.
To assess the emerging nature of the industrial bioeconomy, three objectives are addressed: (1) quantify the global industrial bioeconomy, linking the use of primary resources with the ultimate end product; (2) quantify the impacts of the expanding wood pellet energy export market of the Southeastern United States; (3) conduct a comparative life cycle assessment, incorporating the use of dynamic life cycle assessment, of replacing coal-fired electricity generation in the United Kingdom with wood pellets that are produced in the Southeastern United States.
To quantify the emergent industrial bioeconomy, an empirical analysis was undertaken. Existing databases from multiple domestic and international agencies were aggregated and analyzed in Microsoft Excel to produce a harmonized dataset of the bioeconomy. First-person interviews, existing academic literature, and industry reports were then utilized to delineate the various intermediate and end use flows within the bioeconomy. The results indicate that within a decade, the industrial use of agriculture has risen ten percent, given increases in the production of bioenergy and bioproducts. The underlying resources supporting the emergent bioeconomy (i.e., land, water, and fertilizer use) were also quantified and included in the database.
Following the quantification of the existing bioeconomy, an in-depth analysis of the bioenergy sector was conducted. Specifically, the focus was on quantifying the impacts of the emergent wood pellet export sector that has rapidly developed in recent years in the Southeastern United States. A cradle-to-gate life cycle assessment was conducted in order to quantify supply chain impacts from two wood pellet production scenarios: roundwood and sawmill residues. For each of the nine impact categories assessed, wood pellet production from sawmill residues resulted in values 10-31% higher than production from roundwood.
The analysis of the wood pellet sector was then expanded to include the full life cycle (i.e., cradle-to-grave). In doing so, the combustion of biogenic carbon and the subsequent timing of emissions were assessed by incorporating dynamic life cycle assessment modeling. Assuming immediate carbon neutrality of the biomass, the results indicated an 86% reduction in global warming potential when utilizing wood pellets as compared to coal for electricity production in the United Kingdom. When incorporating the timing of emissions, wood pellets equated to a 75% or 96% reduction in carbon dioxide emissions, depending upon whether the forestry feedstock was considered to be harvested or planted in year one, respectively.
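The sensitivity to emission timing described above can be illustrated with a toy carbon balance; all numbers here (one unit of biogenic CO2, a 25-year rotation, a 100-year horizon) are assumptions for illustration, not values from the dissertation:

```python
import numpy as np

# Toy carbon balance showing why emission timing matters in dynamic LCA.
horizon = 100                         # years in the assessment window
combustion = np.zeros(horizon)
combustion[0] = 1.0                   # 1 unit biogenic CO2 emitted at year 0

rotation = 25                         # assumed regrowth period (years)

# "Harvested in year one": regrowth absorbs the carbon gradually
uptake_harvested = np.zeros(horizon)
uptake_harvested[:rotation] = -1.0 / rotation

# "Planted in year one": the uptake is credited up front
uptake_planted = np.zeros(horizon)
uptake_planted[0] = -1.0

net_harvested = np.cumsum(combustion + uptake_harvested)
net_planted = np.cumsum(combustion + uptake_planted)

# Both balances close by the end of the horizon, but the cumulative
# atmospheric burden along the way differs, which changes any
# time-weighted warming metric.
print(round(net_harvested[-1], 6), round(net_planted[-1], 6))
```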
Finally, a policy analysis of renewable energy in the United States was conducted. Existing coal-fired power plants in the Southeastern United States were assessed for their potential to incorporate the co-firing of wood pellets. Co-firing wood pellets with coal in existing Southeastern United States power stations would result in a nine percent reduction in global warming potential.
Abstract:
The advent of next-generation sequencing, now nearing a decade in age, has enabled, among other capabilities, measurement of genome-wide sequence features at unprecedented scale and resolution.
In this dissertation, I describe work to understand the genetic underpinnings of non-Hodgkin’s lymphoma through exploration of the epigenetics of its cell of origin, initial characterization and interpretation of driver mutations, and finally, a larger-scale, population-level study that incorporates mutation interpretation with clinical outcome.
In the first research chapter, I describe genomic characteristics of lymphomas through the lens of their cells of origin. Just as many other cancers, such as breast cancer or lung cancer, are categorized based on their cell of origin, lymphoma subtypes can be examined in the context of their normal B cells of origin: naïve, germinal center, and post-germinal center. By applying integrative analysis of the epigenetics of normal B cells of origin through chromatin immunoprecipitation sequencing, we find that differences in normal B cell subtypes are reflected in the mutational landscapes of the cancers that arise from them, namely Mantle Cell Lymphoma, Burkitt Lymphoma, and Diffuse Large B-Cell Lymphoma.
In the next research chapter, I describe our first endeavor into understanding the genetic heterogeneity of Diffuse Large B-Cell Lymphoma, the most common form of non-Hodgkin's lymphoma, which affects roughly 100,000 patients worldwide. Through whole-genome sequencing of one case as well as whole-exome sequencing of 94 cases, we characterize the most recurrent genetic features of DLBCL and lay the groundwork for a larger study.
In the last research chapter, I describe work to characterize and interpret the whole exomes of 1001 cases of DLBCL in the largest single-cancer study to date. This highly-powered study enabled sub-gene, gene-level, and gene-network level understanding of driver mutations within DLBCL. Moreover, matched genomic and clinical data enabled the connection of these driver mutations to clinical features such as treatment response or overall survival. As sequencing costs continue to drop, whole-exome sequencing will become a routine clinical assay, and another diagnostic dimension in addition to existing methods such as histology. However, to unlock the full utility of sequencing data, we must be able to interpret it. This study undertakes a first step in developing the understanding necessary to uncover the genomic signals of DLBCL hidden within its exomes. However, beyond the scope of this one disease, the experimental and analytical methods can be readily applied to other cancer sequencing studies.
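A toy version of the gene-level recurrence analysis described above, counting how many cases carry a mutation in each gene, can be sketched as follows; real driver discovery additionally models background mutation rates and covariates, and the gene lists here are illustrative examples of known DLBCL-associated genes, not data from the study:

```python
from collections import Counter

def recurrent_genes(case_mutations, min_cases=2):
    """Count, per gene, how many cases carry at least one mutation,
    and return genes mutated in >= min_cases cases. A toy stand-in
    for driver-gene discovery, which in practice must also account
    for background mutation rates."""
    counts = Counter()
    for mutated_genes in case_mutations:
        counts.update(set(mutated_genes))  # count each case once per gene
    return {gene: n for gene, n in counts.items() if n >= min_cases}

# Illustrative cohort: one list of mutated genes per case
cases = [["MYD88", "CD79B"], ["MYD88"], ["EZH2", "MYD88"], ["CREBBP"]]
print(recurrent_genes(cases))  # {'MYD88': 3}
```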
Thus, this dissertation leverages next-generation sequencing analysis to understand the genetic underpinnings of lymphoma, both by examining its normal cells of origin as well as through a large-scale study to sensitively identify recurrently mutated genes and their relationship to clinical outcome.