982 results for Chip-tool interfaces


Relevance: 20.00%

Publisher:

Abstract:

This research investigates some of the reasons for the reported difficulties experienced by writers when using editing software designed for structured documents. The overall objective was to determine whether there are aspects of the software interfaces which militate against optimal document construction by writers who are not computer experts, and to suggest possible remedies. Studies were undertaken to explore the nature and extent of the difficulties, and to identify which components of the software interfaces are involved. A model of a revised user interface was tested, and some possible adaptations to the interface are proposed which may help overcome the difficulties. The methodology comprised: 1. identification and description of the nature of a ‘structured document’ and what distinguishes it from other types of document used on computers; 2. isolation of the requirements of users of such documents, and the construction of a set of personas which describe them; 3. evaluation of other work on the interaction between humans and computers, specifically in software for creating and editing structured documents; 4. estimation of the levels of adoption of the available software for editing structured documents and the reactions of existing users to it, with specific reference to difficulties encountered in using it; 5. examination of the software and identification of any mismatches between the expectations of users and the facilities provided by the software; 6. assessment of any physical or psychological factors in the reported difficulties, and determination of what (if any) changes to the software might affect these. The conclusions are that seven of the twelve modifications tested could contribute to an improvement in usability, effectiveness, and efficiency when writing structured text (new document selection; adding new sections and new lists; identifying key information typographically; the creation of cross-references and bibliographic references; and the inclusion of parts of other documents). The remaining five were seen as more applicable to editing existing material than to authoring new text (adding new elements; splitting and joining elements [before and after]; and moving block text).
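As a rough, hedged illustration of what distinguishes a "structured document" from free-form text, the sketch below builds a small section/title/paragraph hierarchy with Python's standard xml.etree.ElementTree. The element names (document, section, title, para) are illustrative assumptions, not taken from the thesis.

```python
# Minimal sketch of a "structured document": content is held in a typed
# element hierarchy (sections, titles, paragraphs) rather than flat text.
# Element names here are illustrative only.
import xml.etree.ElementTree as ET

doc = ET.Element("document")
section = ET.SubElement(doc, "section")
ET.SubElement(section, "title").text = "Introduction"
ET.SubElement(section, "para").text = "Structured editors operate on this tree, not on raw characters."

# A structure-aware editor manipulates the tree (e.g. adding a new section)
# instead of inserting markup by hand.
new_section = ET.SubElement(doc, "section")
ET.SubElement(new_section, "title").text = "Method"

print(ET.tostring(doc, encoding="unicode"))
```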

Relevance: 20.00%

Publisher:

Abstract:

Amorphous silicon has become the material of choice for many technologies, with major applications in large-area electronics: displays, image sensing, and thin-film photovoltaic cells. This technology development has occurred because amorphous silicon is a thin-film semiconductor that can be deposited on large, low-cost substrates at low temperature. In this thesis, classical molecular dynamics and first-principles DFT calculations have been performed to generate structural models of amorphous and hydrogenated amorphous silicon and of interfaces between amorphous and crystalline silicon, with the ultimate aim of understanding the photovoltaic properties of core-shell crystalline-amorphous Si nanowire structures. Unexpectedly, the simulations show that our understanding of hydrogenated bulk a-Si needs to be revisited: the robust finding is that when fully saturated with hydrogen, bulk a-Si exhibits a constant optical energy gap, irrespective of the hydrogen concentration in the sample. Unsaturated a-Si:H, with a lower than optimum hydrogen content, shows a smaller optical gap that increases with hydrogen content until saturation is reached. The mobility gaps obtained from an analysis of the electronic states show similar behaviour. We also find that the optical and mobility gaps follow a volcano curve as the H content is varied from 7% (undersaturation) to 18% (mild oversaturation). In the case of mild oversaturation, the mid-gap states arise exclusively from an increase in the density of strained Si-Si bonds. Analysis of our structures shows that the extra H atoms in this case form a bridge between neighbouring silicon atoms, which increases the corresponding Si-Si distance and promotes bond-length disorder in the sample. This has the potential to enhance the Staebler-Wronski effect. Planar interface models of amorphous-crystalline silicon have been generated for the Si (100), (110) and (111) surfaces. The interface models are characterized by their structure, radial distribution function (RDF), electronic density of states, and optical absorption spectrum. We find that the least stable (100) surface results in the formation of the thickest amorphous silicon layer, while the most stable (110) surface forms the smallest amorphous region. We calculated for the first time the band offsets of a-Si:H/c-Si heterojunctions from first principles and examined the influence of different surface orientations and amorphous layer thicknesses on the offsets, and the implications for device performance. The band offsets depend on the amorphous layer thickness and increase with thickness; by controlling the amorphous layer thickness we can potentially optimise the solar cell parameters. Finally, we have successfully generated a-Si/c-Si and a-Si:H/c-Si 5 nm nanowires with different amorphous layer thicknesses using a heat-and-quench procedure, and performed structural analysis of the a-Si/c-Si nanowires. The RDF, Si-Si bond-length distributions, and coordination-number distributions of the amorphous regions of the nanowires show behaviour similar to that of bulk amorphous silicon. In the final part of this thesis we examine different surface-terminating chemical groups, -H, -OH and -NH2, on (001) Ge nanowires (GeNW). Our work shows that the diameter of Ge nanowires and the nature of the surface-terminating groups both play a significant role in the magnitude and the nature of the nanowire band gaps, allowing tuning of the band gap by up to 1.1 eV. We also show for the first time how the nanowire diameter and surface termination shift the absorption edge in the Ge nanowires to longer wavelengths. Thus, the combination of nanowire diameter and surface chemistry can be effectively utilised to tune the band gaps, and thus the light-absorption properties, of small-diameter Ge nanowires.
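As a hedged sketch of one of the structural analyses mentioned above, the code below computes a radial distribution function from atomic positions in a cubic periodic cell. The box size, bin settings, and random coordinates are illustrative assumptions, not the simulation settings used in the thesis.

```python
# Sketch: radial distribution function g(r) for atoms in a cubic periodic box.
# Box length, bin width and the random coordinates are illustrative only.
import numpy as np

def rdf(positions, box_length, r_max, n_bins=100):
    n = len(positions)
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts = np.zeros(n_bins)
    for i in range(n - 1):
        # Minimum-image displacement from atom i to all later atoms.
        d = positions[i + 1:] - positions[i]
        d -= box_length * np.round(d / box_length)
        r = np.linalg.norm(d, axis=1)
        counts += np.histogram(r, bins=edges)[0]
    rho = n / box_length**3                              # number density
    shell_vol = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    ideal = rho * shell_vol * n / 2.0                    # expected pair counts for an ideal gas
    r_mid = 0.5 * (edges[1:] + edges[:-1])
    return r_mid, counts / ideal

# Example with random positions standing in for a relaxed a-Si cell.
rng = np.random.default_rng(0)
pos = rng.random((216, 3)) * 10.0                        # 216 atoms in a 10 Å box (illustrative)
r, g = rdf(pos, box_length=10.0, r_max=5.0)
```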

Relevance: 20.00%

Publisher:

Abstract:

This study involves two aspects of our investigations of plasmonics-active systems: (i) theoretical and simulation studies and (ii) experimental fabrication of plasmonics-active nanostructures. Two types of nanostructures are selected as model systems for their unique plasmonic properties: (1) nanoparticles and (2) nanowires on substrates. Special focus is devoted to regions where the electromagnetic field is strongly concentrated by the metallic nanostructures or between nanostructures. The theoretical investigations deal with dimers of nanoparticles and nanoshells, using a semi-analytical method based on a multipole expansion (ME) and the finite-element method (FEM), in order to determine the electromagnetic enhancement, especially at the interface areas of two adjacent nanoparticles. The experimental study involves the design of plasmonics-active nanowire arrays on substrates that can provide efficient electromagnetic enhancement in the regions around and between the nanostructures. Fabrication of these nanowire structures over large chip-scale areas (from a few millimeters to a few centimeters) as well as finite-difference time-domain (FDTD) simulations to estimate the EM fields between the nanowires are described. The application of these nanowire chips using surface-enhanced Raman scattering (SERS) for the detection of chemicals and labeled DNA molecules is described to illustrate the potential of the plasmonics chips for sensing.
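The study itself uses multipole-expansion, FEM and FDTD calculations; as a much simpler, hedged point of reference, the sketch below evaluates the textbook quasi-static field-enhancement factor |E/E0| = |3εm/(εm + 2εd)| at the pole of a single small metal sphere in a dielectric host, with a generic Drude model. The Drude parameters and host permittivity are illustrative assumptions, not values from the study.

```python
# Sketch: quasi-static field enhancement at the pole of a small metal sphere,
# |E/E0| = |3*eps_m / (eps_m + 2*eps_d)|, using a generic "silver-like" Drude
# model. All parameter values are illustrative only.
import numpy as np

hbar_wp = 9.0      # plasma energy, eV (illustrative)
hbar_gamma = 0.1   # damping, eV (illustrative)
eps_inf = 5.0      # background permittivity (illustrative)
eps_d = 1.77       # host permittivity (water)

energies = np.linspace(1.5, 4.0, 500)                          # photon energy, eV
eps_m = eps_inf - hbar_wp**2 / (energies**2 + 1j * hbar_gamma * energies)

enhancement = np.abs(3 * eps_m / (eps_m + 2 * eps_d))          # peaks where Re(eps_m) = -2*eps_d
peak = energies[np.argmax(enhancement)]
print(f"peak field enhancement {enhancement.max():.1f} at {peak:.2f} eV")
```

In the SERS context, the enhancement of the Raman signal scales roughly with the fourth power of this field enhancement, which is why hot spots between adjacent nanostructures dominate the response.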

Relevance: 20.00%

Publisher:

Abstract:

Excessive iron absorption is one of the main features of β-thalassemia and can lead to severe morbidity and mortality. Serial analyses of β-thalassemic mice indicate that while hemoglobin levels decrease over time, the concentration of iron in the liver, spleen, and kidneys markedly increases. Iron overload is associated with low levels of hepcidin, a peptide that regulates iron metabolism by triggering degradation of ferroportin, an iron-transport protein localized on absorptive enterocytes as well as hepatocytes and macrophages. Patients with β-thalassemia also have low hepcidin levels. These observations led us to hypothesize that more iron is absorbed in β-thalassemia than is required for erythropoiesis and that increasing the concentration of hepcidin in the body of such patients might be therapeutic, limiting iron overload. Here we demonstrate that a moderate increase in expression of hepcidin in β-thalassemic mice limits iron overload, decreases formation of insoluble membrane-bound globins and reactive oxygen species, and improves anemia. Mice with increased hepcidin expression also demonstrated an increase in the lifespan of their red cells, reversal of ineffective erythropoiesis and splenomegaly, and an increase in total hemoglobin levels. These data led us to suggest that therapeutics that could increase hepcidin levels or act as hepcidin agonists might help treat the abnormal iron absorption in individuals with β-thalassemia and related disorders.

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: Over the past two decades more than fifty thousand unique clinical and biological samples have been assayed using the Affymetrix HG-U133 and HG-U95 GeneChip microarray platforms. This substantial repository has been used extensively to characterize changes in gene expression between biological samples, but has not previously been mined en masse for changes in mRNA processing. We explored the possibility of using HG-U133 microarray data to identify changes in alternative mRNA processing in several available archival datasets. RESULTS: Data from these and other gene expression microarrays can now be mined for changes in transcript isoform abundance using a program described here, SplicerAV. Using in vivo and in vitro breast cancer microarray datasets, SplicerAV was able to perform both gene- and isoform-specific expression profiling within the same microarray dataset. Our reanalysis of Affymetrix U133 Plus 2.0 data generated by in vitro over-expression of HRAS, E2F3, beta-catenin (CTNNB1), SRC, and MYC identified several hundred oncogene-induced mRNA isoform changes, one of which revealed a previously unknown mechanism of EGFR family activation. Using clinical data, SplicerAV predicted 241 isoform changes between low- and high-grade breast tumors, with changes enriched among genes coding for guanyl-nucleotide exchange factors, metalloprotease inhibitors, and mRNA processing factors. Isoform changes in 15 genes were associated with aggressive cancer across the three breast cancer datasets. CONCLUSIONS: Using SplicerAV, we identified several hundred previously uncharacterized isoform changes induced by in vitro oncogene over-expression and revealed a previously unknown mechanism of EGFR activation in human mammary epithelial cells. We analyzed Affymetrix GeneChip data from over 400 human breast tumors in three independent studies, making this the largest clinical dataset analyzed for en masse changes in alternative mRNA processing. The capacity to detect RNA isoform changes in archival microarray data using SplicerAV allowed us to carry out the first analysis of isoform-specific mRNA changes directly associated with cancer survival.
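SplicerAV's internal scoring is not described in this abstract; as a hedged illustration of the general idea of isoform-level versus gene-level profiling on probe-based arrays, the sketch below computes a simple splicing-index-style statistic (probe-set intensity normalized to gene-level intensity) for two sample groups. The probe sets, intensities, and group sizes are invented.

```python
# Sketch of a splicing-index-style comparison: each probe set's log2 intensity
# is normalized by the gene-level mean in each sample, then compared between
# two groups. This is a generic illustration, not SplicerAV's actual algorithm.
import numpy as np

# rows: probe sets of one gene; columns: samples (first 3 = group A, last 3 = group B)
log2_intensity = np.array([
    [8.1, 8.0, 8.3, 8.2, 8.1, 8.4],   # probe set over a constitutive exon
    [7.5, 7.4, 7.6, 5.9, 6.0, 6.1],   # probe set over an alternative exon
])
group_a = log2_intensity[:, :3]
group_b = log2_intensity[:, 3:]

# Gene-level expression per sample = mean over the gene's probe sets.
gene_a = group_a.mean(axis=0)
gene_b = group_b.mean(axis=0)

# Splicing index: probe-set signal relative to the gene, averaged per group.
si_a = (group_a - gene_a).mean(axis=1)
si_b = (group_b - gene_b).mean(axis=1)
delta_si = si_b - si_a   # large |delta| suggests an isoform change rather than a gene-level change
print(np.round(delta_si, 2))
```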

Relevance: 20.00%

Publisher:

Abstract:

An enduring challenge for the policy and political sciences is valid and reliable depiction of policy designs. One emerging approach for dissecting policy designs is the application of Sue Crawford and Elinor Ostrom's institutional grammar tool. The grammar tool offers a method to systematically identify the core elements of policies, including target audiences, expected patterns of behavior, and formal modes of sanctioning for noncompliance. This article provides three contributions to the study of policy designs by developing and applying the institutional grammar tool. First, we provide revised guidelines for applying the institutional grammar tool to the study of policy design. Second, an additional component to the grammar, called the oBject, is introduced. Third, we apply the modified grammar tool to four policies that shape Colorado State Aquaculture to demonstrate its effectiveness and utility in illuminating institutional linkages across levels of analysis. The conclusion summarizes the contributions of the article and points to future research and applications of the institutional grammar tool. © 2011 Policy Studies Organization.
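As a hedged illustration of the kind of decomposition the grammar tool performs, the sketch below represents a single institutional statement as a record with the Attribute, Deontic, aIm, Condition, and Or else components plus the oBject component introduced in the article. The example statement and its values are invented, not taken from the Colorado aquaculture policies.

```python
# Sketch: one institutional statement decomposed into grammar-tool components
# (Attribute, Deontic, aIm, oBject, Condition, Or else). The example is invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class InstitutionalStatement:
    attribute: str            # to whom the statement applies
    deontic: Optional[str]    # may / must / must not (None for shared strategies)
    aim: str                  # the action or outcome prescribed
    object_: str              # the receiver of the action (the "oBject" component)
    condition: str            # when/where the statement applies
    or_else: Optional[str]    # sanction for noncompliance, if any

stmt = InstitutionalStatement(
    attribute="Licensed aquaculture facility operators",
    deontic="must",
    aim="report",
    object_="fish disease outbreaks",
    condition="within 48 hours of detection",
    or_else="license suspension",
)
print(stmt)
```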

Relevance: 20.00%

Publisher:

Abstract:

What is the relationship between the design of regulations and levels of individual compliance? To answer this question, Crawford and Ostrom's institutional grammar tool is used to deconstruct regulations governing the aquaculture industry in Colorado, USA. Compliance with the deconstructed regulatory components is then assessed based on the perceptions of the appropriateness of the regulations, involvement in designing the regulations, and intrinsic and extrinsic motivations. The findings suggest that levels of compliance with regulations vary across and within individuals regarding various aspects of the regulatory components. As expected, the level of compliance is affected by the perceived appropriateness of regulations, participation in designing the regulations, and feelings of guilt and fear of social disapproval. Furthermore, there is a strong degree of interdependence among the written components, as identified by the institutional grammar tool, in affecting compliance levels. The paper contributes to the regulation and compliance literature by illustrating the utility of the institutional grammar tool in understanding regulatory content, applying a new Q-Sort technique for measuring individual levels of compliance, and providing a rare exploration into feelings of guilt and fear outside of the laboratory setting. © 2012 Blackwell Publishing Asia Pty Ltd.

Relevance: 20.00%

Publisher:

Abstract:

A significant challenge in environmental toxicology is that many genetic and genomic tools available in laboratory models are not developed for commonly used environmental models. The Atlantic killifish (Fundulus heteroclitus) is one of the most studied teleost environmental models, yet few genetic or genomic tools have been developed for use in this species. The advancement of genetic and evolutionary toxicology will require that many of the tools developed in laboratory models be transferred into species more applicable to environmental toxicology. Antisense morpholino oligonucleotide (MO) gene knockdown technology has been widely utilized to study development in zebrafish and has proven to be a powerful tool in toxicological investigations through direct manipulation of molecular pathways. To expand the utility of killifish as an environmental model, MO gene knockdown technology was adapted for use in Fundulus. Morpholino microinjection methods were altered to overcome the significant differences between these two species. Morpholino efficacy and functional duration were evaluated with molecular and phenotypic methods. A cytochrome P450-1A (CYP1A) MO was used to confirm the effectiveness of the methodology. For CYP1A MO-injected embryos, a 70% reduction in CYP1A activity, an 86% reduction in total CYP1A protein, a significant increase in beta-naphthoflavone-induced teratogenicity, and estimates of functional duration (a 50% reduction in activity at 10 dpf and an 86% reduction in total protein at 12 dpf) conclusively demonstrated that MO technologies can be used effectively in killifish and will likely be just as informative as they have been in zebrafish.

Relevance: 20.00%

Publisher:

Abstract:

Gemstone Team ILL (Interactive Language Learning)

Relevance: 20.00%

Publisher:

Abstract:

Nolan and Temple Lang argue that “the ability to express statistical computations is an essential skill.” A key related capacity is the ability to conduct and present data analysis in a way that another person can understand and replicate. The copy-and-paste workflow that is an artifact of antiquated user-interface design makes reproducibility of statistical analysis more difficult, especially as data become increasingly complex and statistical methods become increasingly sophisticated. R Markdown is a new technology that makes creating fully reproducible statistical analyses simple and painless. It provides a solution suitable not only for cutting-edge research, but also for use in an introductory statistics course. We present experiential and statistical evidence that R Markdown can be used effectively in introductory statistics courses, and discuss its role in the rapidly changing world of statistical computation.

Relevance: 20.00%

Publisher:

Abstract:

cERMIT is a computationally efficient motif discovery tool based on analyzing genome-wide quantitative regulatory evidence. Instead of pre-selecting promising candidate sequences, it utilizes information across all sequence regions to search for high-scoring motifs. We apply cERMIT to a range of direct binding and overexpression datasets; it substantially outperforms state-of-the-art approaches on curated ChIP-chip datasets, and easily scales to current mammalian ChIP-seq experiments with data on thousands of non-coding regions.
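The abstract does not spell out cERMIT's scoring function; as a hedged illustration of evidence-weighted motif ranking in general, the sketch below scores each k-mer by the average regulatory evidence (e.g., a ChIP enrichment value) of the sequences that contain it. The sequences, scores, and region names are invented.

```python
# Sketch: rank k-mers by the mean regulatory evidence of the sequences that
# contain them. A generic illustration of evidence-driven motif scoring,
# not cERMIT's actual statistic. Data are invented.
from collections import defaultdict

sequences = {
    "region1": ("ACGTACGTGA", 3.2),   # (sequence, regulatory evidence score)
    "region2": ("TTACGTCCAA", 2.9),
    "region3": ("GGGTTTAAAC", 0.1),
}

k = 5
evidence = defaultdict(list)
for seq, score in sequences.values():
    kmers = {seq[i:i + k] for i in range(len(seq) - k + 1)}
    for kmer in kmers:
        evidence[kmer].append(score)

# Rank k-mers by mean evidence of the regions containing them.
ranked = sorted(evidence.items(), key=lambda kv: -sum(kv[1]) / len(kv[1]))
for kmer, scores in ranked[:3]:
    print(kmer, round(sum(scores) / len(scores), 2))
```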

Relevance: 20.00%

Publisher:

Abstract:

A lateral on-chip electron-impact ion source utilizing a carbon nanotube field emission electron source was fabricated and characterized. The device consists of a cathode with aligned carbon nanotubes, a control grid, and an ion collector electrode. The electron-impact ionization of He, Ar, and Xe was studied as a function of field emission current and pressure. The ion current was linear with respect to gas pressure from 10⁻⁴ to 10⁻¹ Torr. The device can operate as a vacuum ion gauge with a sensitivity of approximately 1 Torr⁻¹. Ion currents in excess of 1 μA were generated. © 2007 American Institute of Physics.
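As a hedged worked example of the gauge relation implied above, the sketch below uses the standard ionization-gauge definition S = I_ion / (I_e · P) to estimate the ion current expected from the reported sensitivity of roughly 1 Torr⁻¹. The emission current and pressure values are illustrative assumptions, not figures from the paper.

```python
# Sketch: standard ionization-gauge relation I_ion = S * I_e * P.
# The ~1 Torr^-1 sensitivity comes from the abstract; the emission current
# and pressure below are illustrative assumptions.
sensitivity = 1.0           # Torr^-1
emission_current = 10e-6    # A (10 uA field-emission current, assumed)
pressure = 1e-1             # Torr (upper end of the reported linear range)

ion_current = sensitivity * emission_current * pressure
print(f"expected ion current ~ {ion_current:.1e} A")   # ~1e-6 A, i.e. about 1 uA
```

Under these assumed operating conditions the estimate lands at about 1 μA, consistent in order of magnitude with the ion currents reported in the abstract.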

Relevance: 20.00%

Publisher:

Abstract:

Family dogs and dog owners offer a potentially powerful way to conduct citizen science to answer questions about animal behavior that are difficult to answer with more conventional approaches. Here we evaluate the quality of the first data on dog cognition collected by citizen scientists using the Dognition.com website. We conducted analyses to understand whether data generated by over 500 citizen scientists replicate internally and in comparison to previously published findings. Half of the participants participated for free while the other half paid for access. The website provided each participant with a temperament questionnaire and instructions on how to conduct a series of ten cognitive tests. Participation required internet access, a dog, and some common household items. Participants could record their responses on any PC, tablet, or smartphone from anywhere in the world, and data were retained on servers. Results from citizen scientists and their dogs replicated a number of previously described phenomena from conventional lab-based research. There was little evidence that citizen scientists manipulated their results. To illustrate the potential uses of relatively large samples of citizen science data, we then used factor analysis to examine individual differences across the cognitive tasks. The data were best explained by multiple factors, in support of the hypothesis that nonhumans, including dogs, can evolve multiple cognitive domains that vary independently. This analysis suggests that in the future, citizen scientists will generate useful datasets that test hypotheses and answer questions as a complement to conventional laboratory techniques used to study dog psychology.
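As a hedged sketch of the kind of individual-differences analysis described (not the authors' actual pipeline), the code below fits a two-factor model to a simulated matrix of per-dog scores on several cognitive tasks using scikit-learn. The data are randomly generated; only the general approach (factor analysis over task scores) follows the abstract.

```python
# Sketch: exploratory factor analysis on per-dog task scores.
# The score matrix is randomly generated for illustration; the real study
# used data from over 500 dogs on ten cognitive tests.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n_dogs, n_tasks = 200, 10
# Two latent abilities generating correlated task scores plus noise.
latent = rng.normal(size=(n_dogs, 2))
loadings = rng.normal(size=(2, n_tasks))
scores = latent @ loadings + rng.normal(scale=0.5, size=(n_dogs, n_tasks))

fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(scores)
print(np.round(fa.components_, 2))   # estimated factor loadings per task
```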

Relevance: 20.00%

Publisher:

Abstract:

Software-based control of life-critical embedded systems has become increasingly complex and, to a large extent, has come to determine human safety. For example, implantable cardiac pacemakers have over 80,000 lines of code which are responsible for maintaining the heart within safe operating limits. As firmware-related recalls accounted for over 41% of the 600,000 devices recalled in the last decade, there is a need for rigorous model-driven design tools to generate verified code from verified software models. To this effect, we have developed the UPP2SF model-translation tool, which facilitates automatic conversion of verified models (in UPPAAL) to models that may be simulated and tested (in Simulink/Stateflow). We describe the translation rules that ensure correct model conversion, applicable to a large class of models. We demonstrate how UPP2SF is used in the model-driven design of a pacemaker whose model is (a) designed and verified in UPPAAL (using timed automata), (b) automatically translated to Stateflow for simulation-based testing, and then (c) used to automatically generate modular code for hardware-level integration testing of timing-related errors. In addition, we show how UPP2SF may be used for worst-case execution time estimation early in the design stage. Using UPP2SF, we demonstrate the value of an integrated end-to-end modeling, verification, code-generation, and testing process for complex software-controlled embedded systems. © 2014 ACM.
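UPP2SF itself translates UPPAAL timed automata into Simulink/Stateflow charts; the sketch below is only a rough, hypothetical illustration of the kind of source structure such a translator consumes: locations with clock invariants and transitions with clock guards and resets, in the spirit of a simplified pacemaker timing fragment. The names and bounds are invented and this is not UPP2SF's internal representation.

```python
# Rough sketch of the kind of model a UPPAAL-to-Stateflow translator consumes:
# locations with clock invariants and transitions with clock guards/resets.
# Hypothetical illustration only; not UPP2SF's internal representation.
from dataclasses import dataclass
from typing import List

@dataclass
class Transition:
    source: str
    target: str
    guard: str          # e.g. "t >= LRI" (clock guard)
    resets: List[str]   # clocks reset when the transition fires

@dataclass
class Location:
    name: str
    invariant: str      # e.g. "t <= LRI" (clock invariant)

@dataclass
class TimedAutomaton:
    locations: List[Location]
    transitions: List[Transition]
    initial: str

# Simplified pacemaker timing fragment: pace the ventricle if no sensed beat
# arrives before the lower-rate interval (LRI) expires. Bounds are invented.
pacer = TimedAutomaton(
    locations=[Location("WaitSense", "t <= LRI"), Location("Pace", "t <= 0")],
    transitions=[
        Transition("WaitSense", "Pace", guard="t >= LRI", resets=["t"]),
        Transition("Pace", "WaitSense", guard="true", resets=[]),
    ],
    initial="WaitSense",
)
print(len(pacer.locations), "locations,", len(pacer.transitions), "transitions")
```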

Relevance: 20.00%

Publisher:

Abstract:

Current methods for large-scale wind energy collection are unviable in urban areas. In order to investigate the feasibility of generating power from winds in these environments, we sought to optimize the placement of small vertical-axis wind turbines in areas of artificially generated winds. We explored both vehicular transportation and architecture as sources of artificial wind, using a combination of anemometer arrays, global positioning system (GPS) data, and weather reports. We determined that transportation-generated winds were not significant enough for turbine implementation. In addition, safety and administrative concerns restricted the installation of such wind turbines along roadways for transportation-generated wind collection. Wind measurements from our architectural measurements were applied in models that can help predict other similar areas with artificial wind, as well as the optimal placement of a wind turbine in those areas.
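As a hedged back-of-the-envelope companion to the feasibility question above, the sketch below applies the standard wind power density relation P/A = ½ρv³ to a set of illustrative anemometer readings. The wind speeds, swept area, and power coefficient are assumptions, not measurements from the study.

```python
# Sketch: wind power density P/A = 0.5 * rho * v^3, averaged over readings.
# Wind speeds, swept area and power coefficient below are illustrative only.
rho = 1.225                                # air density, kg/m^3
wind_speeds = [1.8, 2.4, 3.1, 2.0, 2.7]    # m/s, assumed anemometer samples

power_density = [0.5 * rho * v**3 for v in wind_speeds]
mean_density = sum(power_density) / len(power_density)

swept_area = 0.5                           # m^2, small vertical-axis turbine (assumed)
cp = 0.2                                   # power coefficient (assumed)
mean_power = cp * mean_density * swept_area
print(f"mean power density {mean_density:.1f} W/m^2, "
      f"estimated turbine output {mean_power:.2f} W")
```

Because power scales with the cube of wind speed, the low speeds typical of artificially generated winds translate into very small outputs, which is consistent with the finding that transportation-generated winds were not significant enough for turbine implementation.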