8 results for Open space residential design (OSRD)
at Duke University
Abstract:
Proteins are essential components of cells and are crucial for catalyzing reactions, signaling, recognition, motility, recycling, and structural stability. This diversity of function suggests that nature is only scratching the surface of protein functional space. Protein function is determined by structure, which in turn is determined predominantly by amino acid sequence. Protein design aims to explore protein sequence and conformational space to design novel proteins with new or improved function. The vast number of possible protein sequences makes exploring the space a challenging problem.
Computational structure-based protein design (CSPD) allows for the rational design of proteins. Because of the large search space, CSPD methods must balance search accuracy against modeling simplifications. We have developed algorithms that allow for the accurate and efficient search of protein conformational space. Specifically, we focus on algorithms that maintain provability, account for protein flexibility, and use ensemble-based rankings. We present several novel algorithms for incorporating improved flexibility into CSPD with continuous rotamers. We applied these algorithms to two biomedically important design problems. We designed peptide inhibitors of the cystic fibrosis agonist CAL that were able to restore function of the vital cystic fibrosis protein CFTR. We also designed improved HIV antibodies and nanobodies to combat HIV infections.
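For orientation only, the sketch below illustrates the kind of provable search this abstract refers to: a branch-and-bound over discrete rotamer choices that is guaranteed to return the global minimum-energy conformation (GMEC) under a pairwise energy model. The positions, rotamers, and energy values are invented toy numbers, and the thesis's actual algorithms (which handle continuous rotamers and ensemble-based rankings) are more sophisticated than this.

```python
import itertools

# Toy instance: 3 design positions, each with a few candidate rotamers.
rotamers = {0: [0, 1], 1: [0, 1, 2], 2: [0, 1]}

# Hypothetical precomputed one-body and pairwise energies (all >= 0 here,
# which is what makes the simple pruning bound below admissible).
E1 = {(p, r): 0.1 * (p + r) for p in rotamers for r in rotamers[p]}
E2 = {((pi, ri), (pj, rj)): 0.05 * abs(ri - rj)
      for pi, pj in itertools.combinations(sorted(rotamers), 2)
      for ri in rotamers[pi] for rj in rotamers[pj]}

def branch_and_bound():
    """Assign rotamers position by position, pruning any branch whose partial
    energy already meets or exceeds the best complete energy found so far."""
    positions = sorted(rotamers)
    best = [float("inf"), None]          # [best energy, best assignment]

    def recurse(i, partial, energy):
        if energy >= best[0]:
            return                        # prune: cannot beat the incumbent
        if i == len(positions):
            best[0], best[1] = energy, dict(partial)
            return
        p = positions[i]
        for r in rotamers[p]:
            delta = E1[(p, r)] + sum(E2[((q, partial[q]), (p, r))]
                                     for q in positions[:i])
            partial[p] = r
            recurse(i + 1, partial, energy + delta)
            del partial[p]

    recurse(0, {}, 0.0)
    return best

energy, conformation = branch_and_bound()
print(f"GMEC energy {energy:.3f} for rotamer assignment {conformation}")
```

The pruning step is what makes this toy search provable: every term it has yet to add is non-negative, so discarding a branch whose partial energy already exceeds the incumbent can never discard the true optimum.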
Abstract:
In our continuing study of triterpene derivatives as potent anti-HIV agents, different C-3 conformationally restricted betulinic acid (BA, 1) derivatives were designed and synthesized to explore the conformational space of the C-3 pharmacophore. 3-O-Monomethylsuccinyl-betulinic acid (MSB) analogues were also designed to better understand the contribution of the C-3' dimethyl group of bevirimat (2), the first-in-class HIV maturation inhibitor, which is currently in phase IIb clinical trials. In addition, another triterpene skeleton, moronic acid (MA, 3), was also employed to study the influence of the backbone and the C-3 modification on the anti-HIV activity of this compound class. This study enabled us to better understand the structure-activity relationships (SAR) of triterpene-derived anti-HIV agents and led to the design and synthesis of compound 12 (EC50: 0.0006 μM), which displayed slightly better activity than 2 as an HIV-1 maturation inhibitor.
Abstract:
Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive neurons whose firing rates vary roughly proportionately with sound azimuth. Within frontal space, maps and such rate codes involve different response patterns at the level of individual neurons. Maps consist of neurons exhibiting circumscribed receptive fields, whereas rate codes involve open-ended response patterns that peak in the periphery. This coding format discrepancy therefore poses a potential problem for brain regions responsible for representing both visual and auditory information. Here, we investigated the coding of auditory space in the primate superior colliculus (SC), a structure known to contain visual and oculomotor maps for guiding saccades. We report that, for visual stimuli, neurons showed circumscribed receptive fields consistent with a map, but for auditory stimuli, they had open-ended response patterns consistent with a rate or level-of-activity code for location. The discrepant response patterns were not segregated into different neural populations but occurred in the same neurons. We show that a read-out algorithm in which the site and level of SC activity both contribute to the computation of stimulus location is successful at evaluating the discrepant visual and auditory codes, and can account for subtle but systematic differences in the accuracy of auditory compared to visual saccades. This suggests that a given population of neurons can use different codes to support appropriate multimodal behavior.
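As a toy illustration of the contrast described above (with invented tuning curves and an assumed hybrid read-out, not the study's actual model or data), the sketch below gives one population circumscribed Gaussian tuning and another open-ended sigmoidal tuning, then reads out both the site of activity (a rate-weighted centroid of preferred azimuths) and its overall level.

```python
import numpy as np

pref = np.linspace(-90, 90, 25)              # units' preferred azimuths (deg)

def visual_response(stim, sigma=15.0):
    """Map-like coding: circumscribed Gaussian receptive fields."""
    return np.exp(-0.5 * ((stim - pref) / sigma) ** 2)

def auditory_response(stim, slope=0.05):
    """Rate-like coding: open-ended, monotonic tuning that grows toward
    one side of space rather than peaking at a circumscribed location."""
    return 1.0 / (1.0 + np.exp(-slope * (stim - pref)))

def read_out(rates):
    """Hybrid read-out: the site of activity (rate-weighted centroid of
    preferred azimuths) together with its overall level."""
    site = float(np.sum(rates * pref) / np.sum(rates))
    level = float(np.mean(rates))
    return site, level

for stim in (-60, 0, 60):
    v_site, v_level = read_out(visual_response(stim))
    a_site, a_level = read_out(auditory_response(stim))
    print(f"stim {stim:+3d} deg | visual site {v_site:+6.1f}, level {v_level:.2f}"
          f" | auditory site {a_site:+6.1f}, level {a_level:.2f}")
```

The point of the toy is only that, for the open-ended tuning, the centroid alone is a poor estimate while the overall level carries systematic information about azimuth, so a read-out that uses both can still recover location.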
Abstract:
Given a probability distribution on an open book (a metric space obtained by gluing a disjoint union of copies of a half-space along their boundary hyperplanes), we define a precise concept of when the Fréchet mean (barycenter) is sticky. This nonclassical phenomenon is quantified by a law of large numbers (LLN) stating that the empirical mean eventually almost surely lies on the (codimension 1 and hence measure 0) spine that is the glued hyperplane, and a central limit theorem (CLT) stating that the limiting distribution is Gaussian and supported on the spine. We also state versions of the LLN and CLT for the cases where the mean is nonsticky (i.e., not lying on the spine) and partly sticky (i.e., lying on the spine but not sticky). © Institute of Mathematical Statistics, 2013.
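For reference, the Fréchet mean discussed here is the standard minimizer of expected squared distance; the display below uses generic notation (a distribution μ on a metric space (M, d) and a sample Y_1, ..., Y_n), not the paper's own.

```latex
% Frechet (barycentric) mean and its empirical counterpart; generic notation.
\[
  \bar{\mu} \;=\; \arg\min_{x \in M} \int_{M} d(x, y)^{2} \, \mu(\mathrm{d}y),
  \qquad
  \bar{\mu}_{n} \;=\; \arg\min_{x \in M} \frac{1}{n} \sum_{i=1}^{n} d(x, Y_{i})^{2}.
\]
```

Stickiness is then the statement that the empirical minimizer eventually lies on the spine almost surely, even though the spine has measure zero.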
Abstract:
The Duke University Medical Center Library and Archives is located in the heart of the Duke Medicine campus, surrounded by Duke Hospital, ambulatory clinics, and numerous research facilities. Its location is considered prime real estate, given its adjacency to patient care, research, and educational activities. In 2005, the Duke University Library Space Planning Committee had recommended creating a learning center in the library that would support a variety of educational activities. However, the health system needed to convert the library's top floor into office space to make way for expansion of the hospital and cancer center. The library had only five months to plan the storage and consolidation of its journal and book collections, while working with the facilities design office and architect on the replacement of key user spaces on the top floor. Library staff worked together to develop plans for storing, weeding, and consolidating the collections and provided input into renovation plans for user spaces on its mezzanine level. The library lost 15,238 square feet (29%) of its net assignable square footage and a total of 16,897 (30%) gross square feet. This included 50% of the total space allotted to collections and over 15% of user spaces. The top-floor space now houses offices for Duke Medicine oncology faculty and staff. By storing a large portion of its collection off-site, the library was able to remove more stacks on the remaining stack level and convert them to user spaces, a long-term goal for the library. Additional space on the mezzanine level had to be converted to replace lost study and conference room spaces. While this project did not match the recommended space plans for the library, it underscored the need for the library to think creatively about the future of its facility and to work toward a more cohesive master plan.
Abstract:
Scheduling a set of jobs over a collection of machines to optimize a certain quality-of-service measure is one of the most important research topics in both computer science theory and practice. In this thesis, we design algorithms that optimize flow-time (or delay) of jobs for scheduling problems that arise in a wide range of applications. We consider the classical model of unrelated machine scheduling and resolve several long-standing open problems; we introduce new models that capture the novel algorithmic challenges of scheduling jobs in data centers or large clusters; we study the effect of selfish behavior in distributed and decentralized environments; and we design algorithms that strive to balance energy consumption and performance.
The technically interesting aspect of our work is the surprising connections we establish between approximation and online algorithms, economics, game theory, and queuing theory. It is the interplay of ideas from these different areas that lies at the heart of most of the algorithms presented in this thesis.
The main contributions of the thesis can be placed in one of the following categories.
1. Classical Unrelated Machine Scheduling: We give the first polylogarithmic approximation algorithms for minimizing the average flow-time and minimizing the maximum flow-time in the offline setting. In the online and non-clairvoyant setting, we design the first non-clairvoyant algorithm for minimizing the weighted flow-time in the resource augmentation model. Our work introduces the iterated rounding technique for offline flow-time optimization and gives the first framework to analyze non-clairvoyant algorithms for unrelated machines (a toy non-clairvoyant baseline is sketched after this list).
2. Polytope Scheduling Problem: To capture the multidimensional nature of the scheduling problems that arise in practice, we introduce the Polytope Scheduling Problem (PSP). The PSP generalizes almost all classical scheduling models and also captures hitherto unstudied scheduling problems such as routing multi-commodity flows, routing multicast (video-on-demand) trees, and multi-dimensional resource allocation. We design several competitive algorithms for the PSP and its variants for the objectives of minimizing flow-time and completion time. Our work establishes many interesting connections between scheduling and market equilibrium concepts, between fairness and non-clairvoyant scheduling, and between the queuing-theoretic notion of stability and resource augmentation analysis.
3. Energy Efficient Scheduling: We give the first non-clairvoyant algorithm for minimizing the total flow-time + energy in the online and resource augmentation model for the most general setting of unrelated machines.
4. Selfish Scheduling: We study the effect of selfish behavior in scheduling and routing problems. We define a fairness index for scheduling policies called bounded stretch and show that, for the objective of minimizing the average (weighted) completion time, policies with small stretch lead to equilibrium outcomes with a small price of anarchy. Our work gives the first linear/convex programming duality-based framework for bounding the price of anarchy under general equilibrium concepts such as coarse correlated equilibrium.
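As a point of reference for the flow-time objective and the non-clairvoyant setting flagged in item 1, the sketch below simulates Round Robin / processor sharing on a single machine, the textbook non-clairvoyant policy that needs no knowledge of remaining job sizes. The job instance is invented, and this is only a baseline illustration, not the thesis's unrelated-machines algorithms.

```python
def processor_sharing_flow_time(jobs):
    """jobs: list of (arrival_time, size). Simulates equal-rate processor
    sharing (the continuous limit of Round Robin) on one machine and returns
    the total flow time, i.e. the sum over jobs of completion - arrival."""
    order = sorted(range(len(jobs)), key=lambda i: jobs[i][0])
    remaining, completion = {}, {}
    t, k = 0.0, 0                                   # current time, next arrival index
    while k < len(order) or remaining:
        next_arrival = jobs[order[k]][0] if k < len(order) else float("inf")
        if not remaining:
            t = next_arrival                        # idle until the next job arrives
        else:
            rate = 1.0 / len(remaining)             # each active job's share of the machine
            first_finish = t + min(remaining.values()) / rate
            t_next = min(first_finish, next_arrival)
            for j in list(remaining):
                remaining[j] -= rate * (t_next - t)
                if remaining[j] <= 1e-9:            # job completes at t_next
                    completion[j] = t_next
                    del remaining[j]
            t = t_next
        while k < len(order) and jobs[order[k]][0] <= t + 1e-12:
            j = order[k]                            # admit newly arrived job(s)
            remaining[j] = jobs[j][1]
            k += 1
    return sum(completion[i] - jobs[i][0] for i in range(len(jobs)))

# Invented instance of (arrival, size) pairs; prints 12.0 for this input.
print(processor_sharing_flow_time([(0.0, 3.0), (1.0, 1.0), (2.0, 2.0)]))
```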
Abstract:
BACKGROUND: Lumbar disc herniation has a prevalence of up to 58% in the athletic population. Lumbar discectomy is a common surgical procedure to alleviate pain and disability in athletes. We systematically reviewed the current clinical evidence regarding athlete return to sport (RTS) following lumbar discectomy compared to conservative treatment. METHODS: A computer-assisted literature search of MEDLINE, CINAHL, Web of Science, PEDro, OVID and PubMed databases (from inception to August 2015) was conducted using keywords related to lumbar disc herniation and surgery. The design of this systematic review followed the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). Methodological quality of individual studies was assessed using the Downs and Black scale (0-16 points). RESULTS: The search strategy revealed 14 articles. Downs and Black quality scores were generally low: no article in this review earned a high-quality rating, only 5 earned a moderate-quality rating, and 9 of the 14 earned a low-quality rating. The pooled RTS rate for surgical intervention across all included studies was 81% (95% CI 76% to 86%) with significant heterogeneity (I² = 63.4%, p < 0.001), although pooled estimates report only 59% RTS at the same level. Pooled analysis showed no difference in RTS rate between surgical (84% (95% CI 77% to 90%)) and conservative intervention (76% (95% CI 56% to 92%); p = 0.33). CONCLUSIONS: Studies comparing surgical versus conservative treatment found no significant difference between groups regarding RTS. Not all athletes who RTS return to the level of participation they had prior to surgery. Owing to the heterogeneity and low methodological quality of the included studies, rates of RTS cannot be accurately determined.
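For context on the pooled figures and the I² statistic quoted above, the sketch below shows a generic fixed-effect, inverse-variance pooling of per-study return-to-sport proportions with Cochran's Q and I². The study counts are invented, and the review's actual meta-analytic model may well differ (e.g. random effects or transformed proportions).

```python
import math

# Hypothetical per-study data: (athletes returning to sport, total athletes).
studies = [(40, 50), (25, 28), (60, 80), (18, 20)]

props, weights = [], []
for returned, total in studies:
    p = returned / total
    var = p * (1 - p) / total            # binomial variance of the proportion
    props.append(p)
    weights.append(1.0 / var)            # inverse-variance weight

pooled = sum(w * p for w, p in zip(weights, props)) / sum(weights)

# Cochran's Q and I^2 = max(0, (Q - df) / Q).
Q = sum(w * (p - pooled) ** 2 for w, p in zip(weights, props))
df = len(studies) - 1
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

print(f"pooled RTS proportion: {pooled:.1%}, Q = {Q:.2f}, I^2 = {I2:.1f}%")
```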
Abstract:
This chapter presents a model averaging approach in the M-open setting, using sample re-use methods to approximate the predictive distribution of future observations. It first reviews the standard M-closed Bayesian Model Averaging approach and decision-theoretic methods for producing inferences and decisions. It then reviews model selection from the M-complete and M-open perspectives before formulating a Bayesian solution to model averaging in the M-open perspective. It constructs optimal weights for MOMA (M-open Model Averaging) using a decision-theoretic framework in which models are treated as part of the ‘action space’ rather than as unknown states of nature. Using ‘incompatible’ retrospective and prospective models for data from a case-control study, the chapter demonstrates that MOMA gives better predictive accuracy than the proxy models. It concludes with open questions and future directions.
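By way of illustration only, the sketch below shows a stacking-style way to choose a combination weight by optimizing held-out predictive density, which shares the M-open spirit of treating the weight as the decision variable rather than as a posterior model probability; it is not the chapter's MOMA construction, and the data and candidate models are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 80
x = rng.uniform(-2, 2, n)
y = 0.5 * x + 0.3 * np.sin(3 * x) + rng.normal(0, 0.3, n)   # neither candidate model is "true"

def normal_pdf(z, mu, sigma):
    return np.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def loo_density_constant(i):
    """Leave-one-out predictive density of y[i] under a constant-mean Gaussian model."""
    mask = np.arange(n) != i
    return normal_pdf(y[i], y[mask].mean(), y[mask].std(ddof=1))

def loo_density_linear(i):
    """Leave-one-out predictive density of y[i] under a simple linear Gaussian model."""
    mask = np.arange(n) != i
    slope, intercept = np.polyfit(x[mask], y[mask], 1)
    resid = y[mask] - (slope * x[mask] + intercept)
    return normal_pdf(y[i], slope * x[i] + intercept, resid.std(ddof=2))

dens_a = np.array([loo_density_constant(i) for i in range(n)])
dens_b = np.array([loo_density_linear(i) for i in range(n)])

# Treat the mixture weight as the 'action': pick the weight on model A that
# maximizes the held-out log predictive score of the combined prediction.
weights = np.linspace(0.0, 1.0, 101)
scores = [np.sum(np.log(w * dens_a + (1.0 - w) * dens_b)) for w in weights]
print(f"optimal weight on the constant-mean model: {weights[int(np.argmax(scores))]:.2f}")
```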