12 results for Kähler Metrics

at Duke University


Relevance:

20.00%

Publisher:

Abstract:

As many as 20-70% of patients undergoing breast-conserving surgery require repeat surgeries due to a close or positive surgical margin diagnosed post-operatively [1]. There are currently no widely accepted tools for intra-operative margin assessment, which is a significant unmet clinical need. Our group has developed a first-generation optical visible spectral imaging platform to image the molecular composition of breast tumor margins and has tested it clinically in 48 patients in a previously published study [2]. The goal of this paper is to report the performance metrics of the system and compare them to clinical criteria for intra-operative tumor margin assessment. The system was found to have an average signal-to-noise ratio (SNR) >100 and <15% error in the extraction of optical properties, indicating that there is sufficient SNR to leverage the differences in optical properties between negative and close/positive margins. The probe had a sensing depth of 0.5-2.2 mm over the wavelength range of 450-600 nm, which is consistent with the pathologic criterion for clear margins of 0-2 mm. There was <1% cross-talk between adjacent channels of the multi-channel probe, which shows that multiple sites can be measured simultaneously with negligible cross-talk between adjacent sites. Lastly, the system and measurement procedure were found to be reproducible when evaluated with repeated measures, with a low coefficient of variation (<0.11). The only aspect of the system not optimized for intra-operative use was the imaging time. The manuscript includes a discussion of how the speed of the system can be improved to work within the time constraints of an intra-operative setting.
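As a rough illustration of the reported metrics, the sketch below computes a per-wavelength SNR and a coefficient of variation from repeated measurements; the array shapes, wavelength count, and noise level are hypothetical, not taken from the paper.

```python
import numpy as np

def snr(spectra):
    """Per-wavelength SNR from repeated measurements: mean / std across repeats.

    spectra: array of shape (n_repeats, n_wavelengths).
    """
    spectra = np.asarray(spectra, dtype=float)
    return spectra.mean(axis=0) / spectra.std(axis=0, ddof=1)

def coefficient_of_variation(values):
    """CV = std / mean; the paper reports CV < 0.11 for repeated measures."""
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean()

# Hypothetical example: 10 repeat scans over 151 wavelengths (450-600 nm)
rng = np.random.default_rng(0)
repeats = 1000.0 + rng.normal(0.0, 5.0, size=(10, 151))  # signal >> noise
print("median SNR:", np.median(snr(repeats)))
print("CV of total counts:", coefficient_of_variation(repeats.sum(axis=1)))
```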

Relevance:

20.00%

Publisher:

Relevance:

20.00%

Publisher:

Abstract:

I discuss geometry and normal forms for pseudo-Riemannian metrics with parallel spinor fields in some interesting dimensions. I also discuss the interaction of these conditions for parallel spinor fields with the condition that the Ricci tensor vanish (which, for pseudo-Riemannian manifolds, is not an automatic consequence of the existence of a nontrivial parallel spinor field).
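For context, the standard integrability computation behind that parenthetical remark can be written as follows (Riemannian case; the notation here is ours, not the paper's):

```latex
% A parallel spinor forces the curvature to annihilate it,
\nabla\psi = 0 \;\Longrightarrow\; \mathcal{R}(X,Y)\cdot\psi = 0,
% and Clifford contraction (via the first Bianchi identity) yields
\operatorname{Ric}(X)\cdot\psi = 0 .
% In Riemannian signature v \cdot v \cdot \psi = -|v|^2 \psi, so a nonzero
% parallel \psi forces \operatorname{Ric}(X) = 0 for every X. In indefinite
% signature \operatorname{Ric}(X) may merely be a null vector, so
% Ricci-flatness is not automatic, as the abstract notes.
```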

Relevance:

20.00%

Publisher:

Abstract:

The central idea of this dissertation is to interpret certain invariants constructed from Laplace spectral data on a compact Riemannian manifold as regularized integrals of closed differential forms on the space of Riemannian metrics, or more generally on a space of metrics on a vector bundle. We apply this idea to both the Ray-Singer analytic torsion and the eta invariant, explaining their dependence on the metric used to define them with a Stokes' theorem argument. We also introduce analytic multi-torsion, a generalization of analytic torsion, in the context of certain manifolds with local product structure; we prove that it is metric independent in a suitable sense.
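For reference, the Ray-Singer analytic torsion mentioned above is conventionally defined through zeta-regularized determinants of the form Laplacians; a standard formulation is:

```latex
% Ray-Singer analytic torsion of (M, g) with coefficients in a flat bundle F:
\log T_{\mathrm{RS}}(M,g)
  \;=\; \frac{1}{2} \sum_{q=0}^{\dim M} (-1)^{q}\, q\, \zeta_q'(0),
\qquad
\zeta_q(s) \;=\; \sum_{\lambda_j > 0} \lambda_j^{-s},
% where \zeta_q is the zeta function of the Laplacian \Delta_q on F-valued
% q-forms, continued meromorphically past s = 0, so that -\zeta_q'(0) is the
% regularized \log\det\Delta_q.
```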

Relevance:

10.00%

Publisher:

Abstract:

As more diagnostic testing options become available to physicians, it becomes more difficult to combine the various types of medical information to optimize the overall diagnosis. To improve diagnostic performance, here we introduce an approach to optimize a decision-fusion technique to combine heterogeneous information, such as from different modalities, feature categories, or institutions. For classifier comparison we used two performance metrics: the area under the receiver operating characteristic (ROC) curve (AUC) and the normalized partial area under the curve (pAUC). This study used four classifiers: linear discriminant analysis (LDA), an artificial neural network (ANN), and two variants of our decision-fusion technique, AUC-optimized (DF-A) and pAUC-optimized (DF-P) decision fusion. We applied each of these classifiers with 100-fold cross-validation to two heterogeneous breast cancer data sets: one of mass lesion features and a much more challenging one of microcalcification lesion features. For the calcification data set, DF-A outperformed the other classifiers in terms of AUC (p < 0.02) and achieved AUC = 0.85 +/- 0.01. DF-P surpassed the other classifiers in terms of pAUC (p < 0.01) and reached pAUC = 0.38 +/- 0.02. For the mass data set, DF-A outperformed both the ANN and the LDA (p < 0.04) and achieved AUC = 0.94 +/- 0.01. Although for this data set there were no statistically significant differences among the classifiers' pAUC values (pAUC = 0.57 +/- 0.07 to 0.67 +/- 0.05, p > 0.10), DF-P did significantly improve specificity versus the LDA at both 98% and 100% sensitivity (p < 0.04). In conclusion, decision fusion directly optimized clinically significant performance measures, such as AUC and pAUC, and sometimes outperformed two well-known machine-learning techniques when applied to two different breast cancer data sets.
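As a loose illustration, the sketch below computes an AUC and a partial AUC on synthetic scores using scikit-learn. Note that sklearn's max_fpr variant restricts the false-positive range and applies McClish standardization, which is related to, but not necessarily the same as, the paper's normalized pAUC defined over a high-sensitivity region.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=500)             # hypothetical labels
y_score = y_true + rng.normal(0, 0.8, size=500)   # hypothetical classifier output

auc = roc_auc_score(y_true, y_score)
# Partial AUC restricted to FPR <= 0.1, McClish-standardized to [0.5, 1].
pauc = roc_auc_score(y_true, y_score, max_fpr=0.1)
print(f"AUC = {auc:.3f}, partial AUC (FPR <= 0.1) = {pauc:.3f}")
```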

Relevance:

10.00%

Publisher:

Abstract:

This paper analyzes a class of common-component allocation rules, termed no-holdback (NHB) rules, in continuous-review assemble-to-order (ATO) systems with positive lead times. The inventory of each component is replenished following an independent base-stock policy. In contrast to the first-come-first-served (FCFS) component allocation rule usually assumed in the literature, an NHB rule allocates a component to a product demand only if doing so yields immediate fulfillment of that demand. We identify metrics, as well as cost and product structures, under which NHB rules outperform all other component allocation rules. For systems with certain product structures, we obtain key performance expressions and compare them to those under FCFS. For general product structures, we present performance bounds and approximations. Finally, we discuss the applicability of these results to more general ATO systems. © 2010 INFORMS.
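A minimal sketch of the allocation decision that distinguishes NHB from FCFS might look like the following; the data structures are hypothetical, and the sketch omits the lead times and base-stock replenishment that the paper's model includes.

```python
from dataclasses import dataclass, field

@dataclass
class ATOSystem:
    """Toy assemble-to-order system: per-component on-hand inventory and a
    bill of materials (bom) mapping each product to required components."""
    on_hand: dict
    bom: dict
    backorders: list = field(default_factory=list)

    def demand_nhb(self, product):
        # No-holdback rule: commit components only if the demand can be
        # fulfilled immediately from stock; otherwise hold nothing back.
        need = self.bom[product]
        if all(self.on_hand[c] >= n for c, n in need.items()):
            for c, n in need.items():
                self.on_hand[c] -= n
            return "fulfilled"
        self.backorders.append(product)
        return "backordered (no components committed)"

sys = ATOSystem(on_hand={"a": 1, "b": 0}, bom={"p": {"a": 1, "b": 1}})
print(sys.demand_nhb("p"))   # backordered; under FCFS, "a" would be committed
print(sys.on_hand)           # {'a': 1, 'b': 0} -- 'a' stays available
```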

Relevance:

10.00%

Publisher:

Abstract:

Nations around the world are considering strategies to mitigate the severe impacts of climate change predicted to occur in the twenty-first century. Many countries, however, lack the wealth, technology, and government institutions to effectively cope with climate change. This study investigates the varying degrees to which developing and developed nations will be exposed to changes in three key variables: temperature, precipitation, and runoff. We use Geographic Information Systems (GIS) analysis to compare current and future climate model predictions on a country level. We then compare our calculations of climate change exposure for each nation to several metrics of political and economic well-being. Our results indicate that the impacts of changes in precipitation and runoff are distributed relatively equally between developed and developing nations. In contrast, we confirm research suggesting that developing nations will be affected far more severely by changes in temperature than developed nations. Our results also suggest that this unequal impact will persist throughout the twenty-first century. Our analysis further indicates that the most significant temperature changes will occur in politically unstable countries, creating an additional motivation for developed countries to actively engage with developing nations on climate mitigation strategies. © 2011, Mary Ann Liebert, Inc.
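A schematic of the country-level comparison could look like the pandas sketch below; the table, column names, and values are invented placeholders for the kind of data the study derives from GIS zonal statistics over climate-model output joined to political and economic indicators.

```python
import pandas as pd

# Hypothetical per-country table; real inputs would come from zonal
# statistics over climate-model rasters merged with well-being metrics.
df = pd.DataFrame({
    "country":   ["A", "B", "C", "D"],
    "developed": [True, True, False, False],
    "dT_2100":   [1.8, 2.1, 3.0, 3.4],    # projected temperature change (degC)
    "stability": [0.9, 0.7, -0.5, -1.1],  # governance/stability index
})

print(df.groupby("developed")["dT_2100"].mean())  # exposure by development status
print(df["dT_2100"].corr(df["stability"]))        # warming vs. political stability
```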

Relevance:

10.00%

Publisher:

Abstract:

PURPOSE: To investigate the dosimetric effects of adaptive planning on lung stereotactic body radiation therapy (SBRT). METHODS AND MATERIALS: Forty of 66 consecutive lung SBRT patients were selected for a retrospective adaptive planning study. CBCT images acquired at each fraction were used for treatment planning. Adaptive plans were created using the same planning parameters as the original CT-based plan, with the goal of achieving a comparable conformality index (CI). For each patient, 2 cumulative plans, a nonadaptive plan (PNON) and an adaptive plan (PADP), were generated and compared for the following organs at risk (OARs): cord, esophagus, chest wall, and the lungs. Dosimetric comparison was performed between PNON and PADP for all 40 patients. Correlations were evaluated between changes in dosimetric metrics induced by adaptive planning and potential impacting factors, including tumor-to-OAR distances (dT-OAR), initial internal target volume (ITV1), ITV change (ΔITV), and effective ITV diameter change (ΔdITV). RESULTS: Thirty-four (85%) patients showed an ITV decrease and 6 (15%) showed an ITV increase over the course of lung SBRT. Percentage ITV change ranged from -59.6% to 13.0%, with a mean (±SD) of -21.0% (±21.4%). Averaged over all patients, PADP resulted in significantly (P=0 to .045) lower values for all dosimetric metrics. ΔdITV/dT-OAR was found to correlate with changes in dose to 5 cc (ΔD5cc) of the esophagus (r=0.61) and dose to 30 cc (ΔD30cc) of the chest wall (r=0.81). Stronger correlations between ΔdITV/dT-OAR and ΔD30cc of the chest wall were found for peripheral (r=0.81) and central (r=0.84) tumors, respectively. CONCLUSIONS: The dosimetric effects of adaptive lung SBRT planning depend on target volume changes and tumor-to-OAR distances. Adaptive lung SBRT can potentially reduce dose to adjacent OARs if the tumor volume shrinks substantially during treatment.
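As an illustration of the correlation analysis, the sketch below computes ΔdITV/dT-OAR, assuming "effective diameter" means the diameter of a volume-equivalent sphere (our assumption; the abstract does not define it), and correlates it with a dosimetric change. All numbers are hypothetical.

```python
import numpy as np
from math import pi

def effective_diameter(volume_cc):
    """Diameter of a sphere with the same volume (assumed definition of
    'effective ITV diameter'; not spelled out in the abstract)."""
    return 2.0 * (3.0 * volume_cc / (4.0 * pi)) ** (1.0 / 3.0)

# Hypothetical per-patient data: initial/final ITV (cc), tumor-to-OAR
# distance (cm), and change in chest-wall D30cc (Gy)
itv1 = np.array([30.0, 12.0, 50.0, 8.0])
itv2 = np.array([20.0, 11.0, 35.0, 9.0])
d_t_oar = np.array([1.0, 2.5, 0.8, 3.0])
delta_d30cc = np.array([-3.1, -0.4, -4.8, 0.2])

predictor = (effective_diameter(itv2) - effective_diameter(itv1)) / d_t_oar
r = np.corrcoef(predictor, delta_d30cc)[0, 1]   # Pearson r, as in the study
print(f"r = {r:.2f}")
```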

Relevance:

10.00%

Publisher:

Abstract:

To maintain a strict balance between demand and supply in US power systems, the Independent System Operators (ISOs) schedule power plants and determine electricity prices using a market clearing model. For each time period and power plant, this model determines startup and shutdown times, the amount of power production, and the provisioning of spinning and non-spinning generation reserves. This deterministic optimization model takes as input the characteristics of all generating units, such as installed power generation capacity, ramp rates, minimum up- and down-time requirements, and marginal production costs, as well as forecasts of intermittent energy such as wind and solar, along with the minimum reserve requirement of the whole system. The reserve requirement is determined based on the likelihood of outages on the supply side and on the forecast error levels in demand and intermittent generation. With increased installed capacity of intermittent renewable energy, determining the appropriate level of reserve requirements has become harder. Stochastic market clearing models have been proposed as an alternative to deterministic ones: rather than taking fixed reserve targets as an input, they consider different wind power scenarios and produce a reserve schedule as an output. Using a scaled version of the power generation system of PJM, a regional transmission organization (RTO) that coordinates the movement of wholesale electricity in all or parts of 13 states and the District of Columbia, together with wind scenarios generated from Bonneville Power Administration (BPA) data, this paper compares the performance of stochastic and deterministic market clearing models. The two models are compared in their ability to contribute to the affordability, reliability, and sustainability of the electricity system, measured in terms of total operational costs, load shedding, and air emissions. The process of building and testing the models indicates that a fair comparison is difficult to obtain, owing to the multi-dimensional performance metrics considered here and the difficulty of setting the models' parameters in a way that does not advantage or disadvantage either modeling framework. Along these lines, this study explores the effect that model assumptions such as reserve requirements, the value of lost load (VOLL), and wind spillage costs have on the comparison of stochastic vs. deterministic market clearing models.
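The following toy example contrasts the two clearing philosophies on a two-unit, two-scenario system: the deterministic model commits against mean wind, while the stochastic model minimizes expected cost over wind scenarios and so implicitly procures reserve. All parameters are invented, and the model is far simpler than the PJM-scale systems studied here.

```python
import itertools

DEMAND, VOLL = 120.0, 1000.0                  # load (MW), value of lost load ($/MWh)
units = {"coal": (100.0, 20.0, 500.0),        # capacity MW, marginal $/MWh, commit $
         "gas":  (50.0, 40.0, 100.0)}
wind = [(0.5, 0.0), (0.5, 80.0)]              # (probability, wind MW) scenarios

def cost(committed, w):
    """Commitment cost plus merit-order dispatch; unserved load priced at VOLL."""
    need = DEMAND - w
    total = sum(units[u][2] for u in committed)
    for u in sorted(committed, key=lambda u: units[u][1]):  # cheapest first
        take = min(units[u][0], max(need, 0.0))
        total += take * units[u][1]
        need -= take
    return total + max(need, 0.0) * VOLL

def expected(committed):
    return sum(p * cost(committed, w_) for p, w_ in wind)

commitments = [set(c) for r in range(len(units) + 1)
               for c in itertools.combinations(units, r)]
mean_wind = sum(p * w_ for p, w_ in wind)
det = min(commitments, key=lambda c: cost(c, mean_wind))  # deterministic clearing
sto = min(commitments, key=expected)                      # stochastic clearing
print(det, expected(det))  # under-committed: pays VOLL in the low-wind scenario
print(sto, expected(sto))  # also commits the gas unit as implicit reserve
```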

Relevance:

10.00%

Publisher:

Abstract:

Despite a large and multifaceted effort to understand the vast landscape of phenotypic data, their current form inhibits productive data analysis. The lack of a community-wide, consensus-based, human- and machine-interpretable language for describing phenotypes and their genomic and environmental contexts is perhaps the most pressing scientific bottleneck to integration across many key fields in biology, including genomics, systems biology, development, medicine, evolution, ecology, and systematics. Here we survey the current phenomics landscape, including data resources and handling, and the progress that has been made to accurately capture relevant data descriptions for phenotypes. We present an example of the kind of integration across domains that computable phenotypes would enable, and we call upon the broader biology community, publishers, and relevant funding agencies to support efforts to surmount today's data barriers and facilitate analytical reproducibility.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Parrots belong to a group of behaviorally advanced vertebrates and have advanced vocal-learning abilities relative to other vocal-learning birds. They can imitate human speech, synchronize their body movements to a rhythmic beat, and understand complex concepts of referential meaning in sounds. However, little is known about the genetics of these traits. Elucidating the genetic bases would require whole-genome sequencing and a robust assembly of a parrot genome. FINDINGS: We present a genomic resource for the budgerigar, an Australian parakeet (Melopsittacus undulatus) -- the most widely studied parrot species in neuroscience and behavior. We present genomic sequence data that include over 300× raw read coverage from multiple sequencing technologies and chromosome optical maps from a single male animal. The reads and optical maps were used to create three hybrid assemblies representing some of the largest genomic scaffolds to date for a bird; two of these were annotated based on similarities to reference sets of non-redundant human, zebra finch, and chicken proteins, and to budgerigar transcriptome sequence assemblies. The sequence reads for this project were in part generated for, and used in, both the Assemblathon 2 competition and the first de novo assembly of a giga-scale vertebrate genome utilizing PacBio single-molecule sequencing. CONCLUSIONS: Across several quality metrics, these budgerigar assemblies are comparable to or better than the chicken and zebra finch genome assemblies built from traditional Sanger sequencing reads, and they are sufficient to analyze regions that are difficult to sequence and assemble, including regions not assembled in prior bird genomes and promoter regions of genes differentially regulated in vocal-learning brain regions. This work provides valuable data and material for genome technology development and for investigating the genomics of complex behavioral traits.
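Contiguity comparisons like the one above typically rest on metrics such as N50; a minimal sketch (with made-up scaffold lengths) is:

```python
def n50(scaffold_lengths):
    """N50: the length L such that scaffolds of length >= L cover at least
    half of the total assembly size (a standard contiguity metric)."""
    total = sum(scaffold_lengths)
    running = 0
    for length in sorted(scaffold_lengths, reverse=True):
        running += length
        if running >= total / 2:
            return length

# Hypothetical scaffold lengths (bp)
print(n50([10_000_000, 5_000_000, 2_000_000, 1_000_000, 500_000]))  # 10000000
```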

Relevance:

10.00%

Publisher:

Abstract:

PURPOSE: X-ray computed tomography (CT) is widely used, both clinically and preclinically, for fast, high-resolution anatomic imaging; however, compelling opportunities exist to expand its use in functional imaging applications. For instance, spectral information combined with nanoparticle contrast agents enables quantification of tissue perfusion levels, while temporal information details cardiac and respiratory dynamics. The authors propose and demonstrate a projection acquisition and reconstruction strategy for 5D CT (3D + dual energy + time) which recovers spectral and temporal information without substantially increasing radiation dose or sampling time relative to anatomic imaging protocols. METHODS: The authors approach the 5D reconstruction problem within the framework of low-rank and sparse matrix decomposition. Unlike previous work on rank-sparsity constrained CT reconstruction, the authors establish an explicit rank-sparse signal model to describe the spectral and temporal dimensions. The spectral dimension is represented as a well-sampled time- and energy-averaged image plus regularly undersampled principal components describing the spectral contrast. The temporal dimension is represented as the same time- and energy-averaged reconstruction plus contiguous, spatially sparse, and irregularly sampled temporal contrast images. Using a nonlinear, image-domain filtration approach, which the authors refer to as rank-sparse kernel regression, they transfer image structure from the well-sampled time- and energy-averaged reconstruction to the spectral and temporal contrast images. This regularization strategy strictly constrains the reconstruction problem while approximately separating the temporal and spectral dimensions. Separability results in a highly compressed representation of the 5D data in which projections are shared between the temporal and spectral reconstruction subproblems, enabling substantial undersampling. The authors solved the 5D reconstruction problem using the split Bregman method and GPU-based implementations of backprojection, reprojection, and kernel regression. Using a preclinical mouse model, the authors apply the proposed algorithm to study myocardial injury following radiation treatment of breast cancer. RESULTS: Quantitative 5D simulations were performed using the MOBY mouse phantom. Twenty data sets (ten cardiac phases, two energies) were reconstructed with 88 μm isotropic voxels from 450 total projections acquired over a single 360° rotation. In vivo 5D myocardial injury data sets acquired in two mice injected with gold and iodine nanoparticles were also reconstructed with 20 data sets per mouse using the same acquisition parameters (dose: ∼60 mGy). For both the simulations and the in vivo data, the reconstruction quality is sufficient to perform material decomposition into gold and iodine maps, to localize the extent of myocardial injury (gold accumulation), and to measure cardiac functional metrics (vascular iodine). The 5D CT imaging protocol represents a 95% reduction in radiation dose per cardiac phase and energy and a 40-fold decrease in projection sampling time relative to the authors' standard imaging protocol. CONCLUSIONS: The 5D CT data acquisition and reconstruction protocol efficiently exploits the rank-sparse nature of spectral and temporal CT data to provide high-fidelity reconstruction results without increased radiation dose or sampling time.
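The rank-sparse decomposition at the heart of the method can be illustrated generically: the sketch below splits a matrix into low-rank and sparse parts via alternating singular-value and soft thresholding. This is a simplified, heuristic stand-in for, not a reproduction of, the authors' split Bregman reconstruction with rank-sparse kernel regression, and the thresholds are ad hoc.

```python
import numpy as np

def low_rank_plus_sparse(M, lam=None, tau=None, n_iter=100):
    """Decompose M ≈ L + S with L low-rank and S sparse via alternating
    singular-value thresholding (for L) and soft thresholding (for S)."""
    m, n = M.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))        # common RPCA-style weight
    tau = tau or 0.5 * np.linalg.norm(M, 2)      # SVT threshold (heuristic)
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(n_iter):
        # Low-rank step: shrink the singular values of the residual
        U, sig, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = U @ np.diag(np.maximum(sig - tau, 0.0)) @ Vt
        # Sparse step: soft-threshold the remaining residual entrywise
        R = M - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam * tau, 0.0)
    return L, S

# Toy data: a rank-1 background plus a few sparse spikes
rng = np.random.default_rng(0)
base = np.outer(rng.normal(size=40), rng.normal(size=30))
spikes = np.zeros((40, 30))
spikes[rng.integers(0, 40, 10), rng.integers(0, 30, 10)] = 5.0
L, S = low_rank_plus_sparse(base + spikes)
print(np.linalg.svd(L, compute_uv=False)[:3].round(2))  # energy in rank 1
```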