5 results for Dynamic search fireworks algorithm with covariance mutation
at Duke University
Abstract:
Angelman syndrome (AS) is a neurobehavioral disorder associated with mental retardation, absence of language development, characteristic electroencephalography (EEG) abnormalities and epilepsy, happy disposition, movement or balance disorders, and autistic behaviors. The molecular defects underlying AS are heterogeneous, including large maternal deletions of chromosome 15q11-q13 (70%), paternal uniparental disomy (UPD) of chromosome 15 (5%), imprinting mutations (rare), and mutations in the E6-AP ubiquitin ligase gene UBE3A (15%). Although patients with UBE3A mutations have a wide spectrum of neurological phenotypes, their features are usually milder than those of AS patients with deletions of 15q11-q13. Using a chromosomal engineering strategy, we generated mutant mice with a 1.6-Mb chromosomal deletion from Ube3a to Gabrb3, which inactivated the Ube3a and Gabrb3 genes and deleted the Atp10a gene. Homozygous deletion mutant mice died in the perinatal period due to a cleft palate resulting from the null mutation in the Gabrb3 gene. Mice with a maternal deletion (m-/p+) were viable and did not have any obvious developmental defects. Expression analysis of the maternal and paternal deletion mice confirmed that the Ube3a gene is maternally expressed in brain, and showed that the Atp10a and Gabrb3 genes are biallelically expressed in all brain sub-regions studied. Maternal (m-/p+), but not paternal (m+/p-), deletion mice had increased spontaneous seizure activity and abnormal EEG. Extensive behavioral analyses revealed significant impairment in motor function, learning and memory tasks, and anxiety-related measures assayed in the light-dark box in maternal deletion but not paternal deletion mice. Ultrasonic vocalization (USV) recording in newborns revealed that maternal deletion pups emitted significantly more USVs than wild-type littermates.
The increased USV in maternal deletion mice suggests abnormal signaling behavior between mothers and pups that may reflect abnormal communication behaviors in human AS patients. Thus, mutant mice with a maternal deletion from Ube3a to Gabrb3 provide an AS mouse model that is molecularly more similar to the contiguous gene deletion form of AS in humans than mice with Ube3a mutation alone. These mice will be valuable for future comparative studies to mice with maternal deficiency of Ube3a alone.
Abstract:
BACKGROUND: The MitoChip v2.0 resequencing array is an array-based technique allowing for accurate and complete sequencing of the mitochondrial genome. No studies have investigated mitochondrial mutation in salivary gland adenoid cystic carcinomas. METHODOLOGY: The entire mitochondrial genome of 22 adenoid cystic carcinomas (ACC) of the salivary glands and matched leukocyte DNA was sequenced to determine the frequency and distribution of mitochondrial mutations in ACC tumors. PRINCIPAL FINDINGS: Seventeen of 22 ACCs (77%) carried mitochondrial mutations, ranging in number from 1 to 37 mutations. A disproportionate number of mutations occurred in the D-loop. Twelve of 17 tumors (70.6%) carried mutations resulting in amino acid changes of translated proteins. Nine of 17 tumors (52.9%) with a mutation carried an amino acid changing mutation in the nicotinamide adenine dinucleotide dehydrogenase (NADH) complex. CONCLUSIONS/SIGNIFICANCE: Mitochondrial mutation is frequent in salivary ACCs. The high incidence of amino acid changing mutations implicates alterations in aerobic respiration in ACC carcinogenesis. D-loop mutations are of unclear significance, but may be associated with alterations in transcription or replication.
Abstract:
Within industrial automation systems, three-dimensional (3-D) vision provides very useful feedback information in autonomous operation of various manufacturing equipment (e.g., industrial robots, material handling devices, assembly systems, and machine tools). The hardware performance in contemporary 3-D scanning devices is suitable for online utilization. However, the bottleneck is the lack of real-time algorithms for recognition of geometric primitives (e.g., planes and natural quadrics) from a scanned point cloud. One of the most important and most frequent geometric primitives in various engineering tasks is the plane. In this paper, we propose a new fast one-pass algorithm for recognition (segmentation and fitting) of planar segments from a point cloud. To effectively segment planar regions, we exploit the orthonormality of certain wavelets to polynomial functions, as well as their sensitivity to abrupt changes. After segmentation of planar regions, we estimate the parameters of corresponding planes using standard fitting procedures. For point cloud structuring, a z-buffer algorithm with mesh triangle representation in barycentric coordinates is employed. The proposed recognition method is tested and experimentally validated in several real-world case studies.
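The abstract mentions that, after segmentation, plane parameters are estimated "using standard fitting procedures." A common such procedure (not necessarily the one used in the paper) is a total-least-squares fit, where the plane normal is the direction of least variance of the centered points, obtained from an SVD. The function name and data below are illustrative only:

```python
import numpy as np

def fit_plane(points):
    """Fit a plane n . x = d to an (N, 3) point array by total least
    squares: the normal is the right singular vector of the centered
    points with the smallest singular value."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Rows of vt are right singular vectors; the last spans the
    # direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    d = float(normal @ centroid)
    return normal, d

# Noisy samples from the (made-up) plane z = 0.5x - 0.25y + 2
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(200, 2))
z = 0.5 * xy[:, 0] - 0.25 * xy[:, 1] + 2 + rng.normal(0, 1e-3, 200)
pts = np.column_stack([xy, z])
normal, d = fit_plane(pts)
```

The SVD fit is preferred over regressing z on (x, y) because it treats all three coordinates symmetrically and remains stable for near-vertical planes.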
Abstract:
Purpose: To investigate the effect of incorporating a beam spreading parameter in a beam angle optimization algorithm and to evaluate its efficacy for creating coplanar IMRT lung plans in conjunction with machine learning generated dose objectives.
Methods: Fifteen anonymized patient cases were each re-planned with ten values over the range of the beam spreading parameter, k, and analyzed with a Wilcoxon signed-rank test to determine whether any particular value resulted in significant improvement over the initially treated plan created by a trained dosimetrist. Dose constraints were generated by a machine learning algorithm and kept constant for each case across all k values. Parameters investigated for potential improvement included mean lung dose, V20 lung, V40 heart, 80% conformity index, and 90% conformity index.
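The paired comparison described in the Methods (initial plan vs. re-planned, per case) can be sketched with a minimal Wilcoxon signed-rank test; the implementation below uses the normal approximation, and the conformity-index numbers are made up for illustration, not the study's data:

```python
import math

def wilcoxon_signed_rank(x, y):
    """Two-sided Wilcoxon signed-rank test (normal approximation).
    Returns (W+, p). Zero differences are dropped; tied |differences|
    receive average ranks."""
    diffs = [b - a for a, b in zip(x, y) if b != a]
    n = len(diffs)
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank of the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return w_plus, p

# Hypothetical paired conformity indices for 15 cases:
initial = [0.72, 0.68, 0.75, 0.70, 0.66, 0.71, 0.74, 0.69,
           0.73, 0.67, 0.70, 0.72, 0.68, 0.71, 0.69]
gains = [0.05, 0.03, 0.06, 0.04, 0.07, 0.02, 0.045, 0.055,
         0.035, 0.065, 0.025, 0.052, 0.042, 0.031, 0.061]
replanned = [a + g for a, g in zip(initial, gains)]
w, p = wilcoxon_signed_rank(initial, replanned)
```

With 15 cases the exact permutation distribution would normally be used rather than the normal approximation; the sketch keeps the approximation for brevity.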
Results: At a significance level of 5%, treatment plans created with this method resulted in significantly better conformity indices. Dose coverage to the PTV was improved by an average of 12% over the initial plans. At the same time, these treatment plans showed no significant difference in mean lung dose, V20 lung, or V40 heart when compared to the initial plans; however, it should be noted that these results could be influenced by the small sample size of patient cases.
Conclusions: The beam angle optimization algorithm, with the inclusion of the beam spreading parameter k, increases the dose conformity of the automatically generated treatment plans over that of the initial plans without adversely affecting the dose to organs at risk. This parameter can be varied according to physician preference in order to control the tradeoff between dose conformity and OAR sparing without compromising the integrity of the plan.
Abstract:
Fitting statistical models is computationally challenging when the sample size or the dimension of the dataset is huge. An attractive approach for down-scaling the problem size is to first partition the dataset into subsets and then fit using distributed algorithms. The dataset can be partitioned either horizontally (in the sample space) or vertically (in the feature space), and the challenge arises in defining an algorithm with low communication, theoretical guarantees, and excellent practical performance in general settings. For sample space partitioning, I propose a MEdian Selection Subset AGgregation Estimator ({\em message}) algorithm for solving these issues. The algorithm applies feature selection in parallel for each subset using regularized regression or a Bayesian variable selection method, calculates the `median' feature inclusion index, estimates coefficients for the selected features in parallel for each subset, and then averages these estimates. The algorithm is simple, involves very minimal communication, scales efficiently in sample size, and has theoretical guarantees. I provide extensive experiments to show excellent performance in feature selection, estimation, prediction, and computation time relative to usual competitors.
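The message pipeline described above (subset-wise selection, median inclusion, averaged estimates) can be sketched as follows. This is an illustrative simplification: the thesis uses regularized regression or Bayesian variable selection per subset, whereas the sketch substitutes a simple correlation-screening step, and all data and sizes are made up:

```python
import numpy as np

def message_sketch(X, y, rng, n_subsets=4, n_keep=2):
    """Sketch of the message algorithm: split rows into subsets, select
    features on each subset independently, keep features whose median
    inclusion indicator is 1, then average per-subset estimates."""
    n, p = X.shape
    subsets = np.array_split(rng.permutation(n), n_subsets)
    inclusion = np.zeros((n_subsets, p))
    for s, rows in enumerate(subsets):
        # Stand-in selection step: keep the n_keep features with the
        # largest absolute correlation with y on this subset.
        Xs, ys = X[rows], y[rows] - y[rows].mean()
        scores = np.abs(Xs.T @ ys)
        inclusion[s, np.argsort(scores)[-n_keep:]] = 1
    # Median inclusion index across subsets decides the final support.
    selected = np.where(np.median(inclusion, axis=0) >= 0.5)[0]
    # Re-estimate coefficients on each subset, then average.
    betas = [np.linalg.lstsq(X[rows][:, selected], y[rows], rcond=None)[0]
             for rows in subsets]
    return selected, np.mean(betas, axis=0)

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 20))
y = 2.0 * X[:, 2] - 1.5 * X[:, 7] + rng.normal(0, 0.1, 400)
selected, beta = message_sketch(X, y, rng)
```

Only the binary inclusion vectors and short coefficient vectors cross subset boundaries, which is the source of the algorithm's minimal communication cost.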
While sample space partitioning is useful in handling datasets with large sample size, feature space partitioning is more effective when the data dimension is high. Existing methods for partitioning features, however, are either vulnerable to high correlations or inefficient in reducing the model dimension. In the thesis, I propose a new embarrassingly parallel framework named {\em DECO} for distributed variable selection and parameter estimation. In {\em DECO}, variables are first partitioned and allocated to m distributed workers. The decorrelated subset data within each worker are then fitted via any algorithm designed for high-dimensional problems. We show that by incorporating the decorrelation step, DECO can achieve consistent variable selection and parameter estimation on each subset with (almost) no assumptions. In addition, the convergence rate is nearly minimax optimal for both sparse and weakly sparse models and does NOT depend on the partition number m. Extensive numerical experiments are provided to illustrate the performance of the new framework.
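The decorrelation step that DECO adds before distributing column blocks can be sketched as a whitening transform. The exact transform used in the thesis is not reproduced here; the sketch below uses one common form, multiplying X and y by a ridge-regularized inverse square root of XX^T, and the design, sizes, and ridge value are illustrative assumptions:

```python
import numpy as np

def decorrelate(X, y, ridge=1e-2):
    """DECO-style decorrelation (sketch): transform X and y with
    F = (X X^T / p + ridge * I)^{-1/2} via an eigendecomposition, so
    that columns handed to different workers are approximately
    uncorrelated for random designs."""
    n, p = X.shape
    G = X @ X.T / p + ridge * np.eye(n)
    w, V = np.linalg.eigh(G)
    F = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    return F @ X, F @ y

rng = np.random.default_rng(0)
n, p, rho = 50, 100, 0.7
# Equicorrelated design: every pair of columns has correlation ~0.7,
# the setting where naive feature partitioning breaks down.
Sigma = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)
C = np.linalg.cholesky(Sigma)
X = rng.normal(size=(n, p)) @ C.T
y = X[:, 0] - X[:, 1] + rng.normal(0, 0.1, n)
Xt, yt = decorrelate(X, y)
```

After the transform, the strong common factor driving the 0.7 correlations is heavily downweighted (it lies along the largest eigenvalues of XX^T), so each worker's column block can be fitted almost independently.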
For datasets with both large sample sizes and high dimensionality, I propose a new "divide-and-conquer" framework {\em DEME} (DECO-message) by leveraging both the {\em DECO} and the {\em message} algorithm. The new framework first partitions the dataset in the sample space into row cubes using {\em message} and then partitions the feature space of the cubes using {\em DECO}. This procedure is equivalent to partitioning the original data matrix into multiple small blocks, each with a feasible size that can be stored and fitted in a computer in parallel. The results are then synthesized via the {\em DECO} and {\em message} algorithm in reverse order to produce the final output. The whole framework is extremely scalable.
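The two-level partition that DEME performs, row cubes first, then column blocks within each cube, amounts to tiling the data matrix. A minimal sketch of the blocking itself, with made-up sizes (the fitting and synthesis steps are omitted):

```python
import numpy as np

# DEME first splits the data matrix into row cubes (the message step),
# then splits each cube's columns across workers (the DECO step).
X = np.arange(24.0).reshape(6, 4)
row_cubes = np.array_split(X, 3, axis=0)      # sample-space partition
blocks = [np.array_split(cube, 2, axis=1)     # feature-space partition
          for cube in row_cubes]
# blocks[i][j] is the (i, j) sub-matrix; each fits on one worker.
# Recombining the blocks in reverse order recovers the original matrix,
# mirroring how DEME synthesizes results from the two levels.
X_back = np.vstack([np.hstack(b) for b in blocks])
```

Each block has a bounded size regardless of how large n and p grow, which is what makes the framework scalable.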