10 results for Tool Paths
at Duke University
Abstract:
Timing-related defects are major contributors to test escapes and in-field reliability problems for very deep submicrometer integrated circuits. Small delay variations induced by crosstalk, process variations, power-supply noise, and resistive opens and shorts can cause timing failures in a design, leading to quality and reliability concerns. We present a test-grading technique that uses the method of output deviations for screening small-delay defects (SDDs). A new gate-delay defect probability measure is defined to model delay variations for nanometer technologies. The proposed technique intelligently selects the best set of patterns for SDD detection from an n-detect pattern set generated using timing-unaware automatic test-pattern generation (ATPG). It offers significantly lower computational complexity and excites a larger number of long paths than a current-generation commercial timing-aware ATPG tool. Our results also show that, for the same pattern count, the selected patterns provide more effective coverage ramp-up than timing-aware ATPG and a recent pattern-selection method for random SDDs potentially caused by resistive shorts, resistive opens, and process variations. © 2010 IEEE.
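The core idea of deviation-based pattern grading can be sketched in a few lines. This is an illustrative simplification, not the paper's exact algorithm: assume each pattern in the n-detect set has already been assigned an aggregate output-deviation score, and select the highest-scoring patterns up to a pattern-count budget.

```python
# Illustrative sketch (assumed names and scores, not the paper's procedure):
# rank patterns from an n-detect set by a precomputed output-deviation
# score and keep the top `budget` patterns for SDD screening.

def select_patterns(deviations, budget):
    """deviations: dict mapping pattern id -> aggregate deviation score."""
    ranked = sorted(deviations, key=deviations.get, reverse=True)
    return ranked[:budget]

scores = {"p1": 0.92, "p2": 0.35, "p3": 0.78, "p4": 0.61}
print(select_patterns(scores, 2))  # → ['p1', 'p3']
```

Because selection is a sort over precomputed scores rather than a timing-aware path search, its cost grows only as O(n log n) in the pattern count, which is consistent with the lower computational complexity the abstract claims.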
Abstract:
Excessive iron absorption is one of the main features of β-thalassemia and can lead to severe morbidity and mortality. Serial analyses of β-thalassemic mice indicate that while hemoglobin levels decrease over time, the concentration of iron in the liver, spleen, and kidneys markedly increases. Iron overload is associated with low levels of hepcidin, a peptide that regulates iron metabolism by triggering degradation of ferroportin, an iron-transport protein localized on absorptive enterocytes as well as hepatocytes and macrophages. Patients with β-thalassemia also have low hepcidin levels. These observations led us to hypothesize that more iron is absorbed in β-thalassemia than is required for erythropoiesis and that increasing the concentration of hepcidin in the body of such patients might be therapeutic, limiting iron overload. Here we demonstrate that a moderate increase in expression of hepcidin in β-thalassemic mice limits iron overload, decreases formation of insoluble membrane-bound globins and reactive oxygen species, and improves anemia. Mice with increased hepcidin expression also demonstrated an increase in the lifespan of their red cells, reversal of ineffective erythropoiesis and splenomegaly, and an increase in total hemoglobin levels. These data led us to suggest that therapeutics that could increase hepcidin levels or act as hepcidin agonists might help treat the abnormal iron absorption in individuals with β-thalassemia and related disorders.
Abstract:
BACKGROUND: Over the past two decades more than fifty thousand unique clinical and biological samples have been assayed using the Affymetrix HG-U133 and HG-U95 GeneChip microarray platforms. This substantial repository has been used extensively to characterize changes in gene expression between biological samples, but has not been previously mined en masse for changes in mRNA processing. We explored the possibility of using HG-U133 microarray data to identify changes in alternative mRNA processing in several available archival datasets. RESULTS: Data from these and other gene expression microarrays can now be mined for changes in transcript isoform abundance using a program described here, SplicerAV. Using in vivo and in vitro breast cancer microarray datasets, SplicerAV was able to perform both gene- and isoform-specific expression profiling within the same microarray dataset. Our reanalysis of Affymetrix U133 Plus 2.0 data generated by in vitro over-expression of HRAS, E2F3, beta-catenin (CTNNB1), SRC, and MYC identified several hundred oncogene-induced mRNA isoform changes, one of which revealed a previously unknown mechanism of EGFR family activation. Using clinical data, SplicerAV predicted 241 isoform changes between low- and high-grade breast tumors, with changes enriched among genes coding for guanyl-nucleotide exchange factors, metalloprotease inhibitors, and mRNA processing factors. Isoform changes in 15 genes were associated with aggressive cancer across the three breast cancer datasets. CONCLUSIONS: Using SplicerAV, we identified several hundred previously uncharacterized isoform changes induced by in vitro oncogene over-expression and revealed a previously unknown mechanism of EGFR activation in human mammary epithelial cells. We analyzed Affymetrix GeneChip data from over 400 human breast tumors in three independent studies, making this the largest clinical dataset analyzed for en masse changes in alternative mRNA processing.
The capacity to detect RNA isoform changes in archival microarray data using SplicerAV allowed us to carry out the first analysis of isoform-specific mRNA changes directly associated with cancer survival.
Abstract:
An enduring challenge for the policy and political sciences is valid and reliable depiction of policy designs. One emerging approach for dissecting policy designs is the application of Sue Crawford and Elinor Ostrom's institutional grammar tool. The grammar tool offers a method to identify, systematically, the core elements that comprise policies, including target audiences, expected patterns of behavior, and formal modes of sanctioning for noncompliance. This article provides three contributions to the study of policy designs by developing and applying the institutional grammar tool. First, we provide revised guidelines for applying the institutional grammar tool to the study of policy design. Second, an additional component to the grammar, called the oBject, is introduced. Third, we apply the modified grammar tool to four policies that shape Colorado State Aquaculture to demonstrate its effectiveness and utility in illuminating institutional linkages across levels of analysis. The conclusion summarizes the contributions of the article as well as points to future research and applications of the institutional grammar tool. © 2011 Policy Studies Organization.
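The grammar tool's decomposition of a policy statement can be made concrete with a small data structure. The sketch below uses the component names of Crawford and Ostrom's grammar (Attribute, Deontic, aIm, Condition, "Or else") plus the oBject component the article introduces; the example statement about aquaculture reporting is hypothetical, invented here for illustration.

```python
from dataclasses import dataclass
from typing import Optional

# Sketch of an institutional statement broken into grammar components.
# The field contents below are a made-up example, not taken from the
# Colorado aquaculture policies analyzed in the article.

@dataclass
class InstitutionalStatement:
    attribute: str             # to whom the statement applies
    deontic: Optional[str]     # may / must / must not
    aim: str                   # the prescribed action or outcome
    object: Optional[str]      # the oBject the action acts upon
    condition: str             # when and where the statement applies
    or_else: Optional[str]     # sanction for noncompliance

stmt = InstitutionalStatement(
    attribute="Licensed aquaculture facilities",
    deontic="must",
    aim="report",
    object="fish escapes",
    condition="within 24 hours of discovery",
    or_else="suspension of the facility license",
)
print(stmt.deontic, stmt.aim, stmt.object)  # → must report fish escapes
```

Coding statements into a uniform record like this is what makes the systematic, cross-policy comparisons described in the article possible.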
Abstract:
What is the relationship between the design of regulations and levels of individual compliance? To answer this question, Crawford and Ostrom's institutional grammar tool is used to deconstruct regulations governing the aquaculture industry in Colorado, USA. Compliance with the deconstructed regulatory components is then assessed based on the perceptions of the appropriateness of the regulations, involvement in designing the regulations, and intrinsic and extrinsic motivations. The findings suggest that levels of compliance with regulations vary across and within individuals regarding various aspects of the regulatory components. As expected, the level of compliance is affected by the perceived appropriateness of regulations, participation in designing the regulations, and feelings of guilt and fear of social disapproval. Furthermore, there is a strong degree of interdependence among the written components, as identified by the institutional grammar tool, in affecting compliance levels. The paper contributes to the regulation and compliance literature by illustrating the utility of the institutional grammar tool in understanding regulatory content, applying a new Q-Sort technique for measuring individual levels of compliance, and providing a rare exploration into feelings of guilt and fear outside of the laboratory setting. © 2012 Blackwell Publishing Asia Pty Ltd.
Abstract:
A significant challenge in environmental toxicology is that many genetic and genomic tools available in laboratory models are not developed for commonly used environmental models. The Atlantic killifish (Fundulus heteroclitus) is one of the most studied teleost environmental models, yet few genetic or genomic tools have been developed for use in this species. The advancement of genetic and evolutionary toxicology will require that many of the tools developed in laboratory models be transferred into species more applicable to environmental toxicology. Antisense morpholino oligonucleotide (MO) gene knockdown technology has been widely utilized to study development in zebrafish and has proven to be a powerful tool in toxicological investigations through direct manipulation of molecular pathways. To expand the utility of killifish as an environmental model, MO gene knockdown technology was adapted for use in Fundulus. Morpholino microinjection methods were altered to overcome the significant differences between these two species. Morpholino efficacy and functional duration were evaluated with molecular and phenotypic methods. A cytochrome P450-1A (CYP1A) MO was used to confirm effectiveness of the methodology. For CYP1A MO-injected embryos, a 70% reduction in CYP1A activity, an 86% reduction in total CYP1A protein, a significant increase in beta-naphthoflavone-induced teratogenicity, and estimates of functional duration (a 50% reduction in activity at 10 dpf and an 86% reduction in total protein at 12 dpf) conclusively demonstrated that MO technologies can be used effectively in killifish and will likely be just as informative as they have been in zebrafish.
Abstract:
Nolan and Temple Lang argue that "the ability to express statistical computations is an essential skill." A key related capacity is the ability to conduct and present data analysis in a way that another person can understand and replicate. The copy-and-paste workflow that is an artifact of antiquated user-interface design makes reproducibility of statistical analysis more difficult, especially as data become increasingly complex and statistical methods become increasingly sophisticated. R Markdown is a new technology that makes creating fully reproducible statistical analysis simple and painless. It provides a solution suitable not only for cutting-edge research, but also for use in an introductory statistics course. We present experiential and statistical evidence that R Markdown can be used effectively in introductory statistics courses, and discuss its role in the rapidly changing world of statistical computation.
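The workflow the abstract describes can be seen in a minimal R Markdown document of the kind an introductory course might assign: a YAML header followed by prose and an embedded R code chunk. The dataset and model below are illustrative choices, not examples from the paper.

````markdown
---
title: "Lab 1: Exploring the mtcars data"
output: html_document
---

The chunk below is executed every time the document is knit, so the
numbers in the rendered report always match the code that produced them,
with no copy-and-paste step in between.

```{r}
fit <- lm(mpg ~ wt, data = mtcars)
summary(fit)$coefficients
```
````

Because the analysis and the narrative live in one source file, a reader can re-knit the document and reproduce every figure and table exactly, which is the reproducibility argument the abstract makes.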
Abstract:
Family dogs and dog owners offer a potentially powerful way to conduct citizen science to answer questions about animal behavior that are difficult to answer with more conventional approaches. Here we evaluate the quality of the first data on dog cognition collected by citizen scientists using the Dognition.com website. We conducted analyses to understand whether data generated by over 500 citizen scientists replicate internally and in comparison to previously published findings. Half of the participants participated for free while the other half paid for access. The website provided each participant a temperament questionnaire and instructions on how to conduct a series of ten cognitive tests. Participation required internet access, a dog, and some common household items. Participants could record their responses on any PC, tablet, or smartphone from anywhere in the world, and data were retained on servers. Results from citizen scientists and their dogs replicated a number of previously described phenomena from conventional lab-based research. There was little evidence that citizen scientists manipulated their results. To illustrate the potential uses of relatively large samples of citizen science data, we then used factor analysis to examine individual differences across the cognitive tasks. The data were best explained by multiple factors, in support of the hypothesis that nonhumans, including dogs, can evolve multiple cognitive domains that vary independently. This analysis suggests that in the future, citizen scientists will generate useful datasets that test hypotheses and answer questions as a complement to conventional laboratory techniques used to study dog psychology.
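The multi-factor pattern the abstract describes can be illustrated with a toy simulation (invented here, not the study's data or method): two independent latent abilities each drive a pair of observed task scores, so tasks sharing a latent factor correlate strongly while tasks from different factors do not. That correlation structure is exactly what a multi-factor solution captures.

```python
import random
import statistics

# Toy simulation: two independent latent abilities ("social" and
# "memory") each drive two observed task scores plus noise.

random.seed(0)

def correlation(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

n = 2000
social = [random.gauss(0, 1) for _ in range(n)]
memory = [random.gauss(0, 1) for _ in range(n)]

def noisy(latent):
    return [v + random.gauss(0, 0.5) for v in latent]

task_a, task_b = noisy(social), noisy(social)  # two social tasks
task_c, task_d = noisy(memory), noisy(memory)  # two memory tasks

print(round(correlation(task_a, task_b), 2))  # high: shared factor
print(round(correlation(task_a, task_c), 2))  # near zero: independent factors
```

If dog cognition were a single general ability, all four tasks would correlate with each other; the independence of the cross-factor pairs is what argues for multiple domains.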
Abstract:
Software-based control of life-critical embedded systems has become increasingly complex, and to a large extent has come to determine human safety. For example, implantable cardiac pacemakers have over 80,000 lines of code which are responsible for maintaining the heart within safe operating limits. As firmware-related recalls accounted for over 41% of the 600,000 devices recalled in the last decade, there is a need for rigorous model-driven design tools to generate verified code from verified software models. To this effect, we have developed the UPP2SF model-translation tool, which facilitates automatic conversion of verified models (in UPPAAL) to models that may be simulated and tested (in Simulink/Stateflow). We describe the translation rules that ensure correct model conversion, applicable to a large class of models. We demonstrate how UPP2SF is used in the model-driven design of a pacemaker whose model is (a) designed and verified in UPPAAL (using timed automata), (b) automatically translated to Stateflow for simulation-based testing, and then (c) automatically generated into modular code for hardware-level integration testing of timing-related errors. In addition, we show how UPP2SF may be used for worst-case execution time estimation early in the design stage. Using UPP2SF, we demonstrate the value of an integrated end-to-end modeling, verification, code-generation, and testing process for complex software-controlled embedded systems. © 2014 ACM.
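The kind of timed behaviour that the pacemaker's verified model captures can be sketched in miniature. The example below is a simplified illustration written for this listing, not UPP2SF or the paper's model: a single timer enforces a lower rate interval (the assumed constant `TLRI`), resetting on each sensed beat and firing a pace when it expires.

```python
# Illustrative sketch of a timed-automaton-style pacing rule.
# TLRI and the event trace are assumed values for illustration only.

TLRI = 1000  # ms allowed between beats before the device must pace

def simulate(sensed_beats, horizon):
    """Return paced-beat times (ms) given sensed-beat times (ms)."""
    paces, timer_start = [], 0
    for t in range(horizon + 1):
        if t in sensed_beats:            # intrinsic beat: reset the timer
            timer_start = t
        elif t - timer_start >= TLRI:    # timeout: pace and reset
            paces.append(t)
            timer_start = t
    return paces

# One sensed beat at 400 ms, then silence: the device paces every TLRI.
print(simulate([400], 2500))  # → [1400, 2400]
```

A model checker such as UPPAAL would verify properties like "the interval between consecutive beats never exceeds TLRI" over all traces, rather than over the single assumed trace simulated here.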