412 results for Toolkit


Relevance:

10.00%

Publisher:

Abstract:

Fibrillar collagens provide the most fundamental platform in the vertebrate organism for the attachment of cells and matrix molecules. We have identified specific sites in collagens to which cells can attach, either directly or through protein intermediaries. Using Toolkits of triple-helical peptides, each peptide comprising 27 residues of collagen primary sequence and overlapping its neighbours by nine amino acids, we have mapped the binding of receptors and other proteins onto collagens II and III. Integrin alpha 2 beta 1 binds to several GXX'GER motifs within the collagens, the affinities of which differ sufficiently to control cell adhesion and migration independently of the cellular regulation of the integrin. The platelet receptor Gp (glycoprotein) VI binds well to GPO-containing model peptides (where O is hydroxyproline), but to very few Toolkit peptides, suggesting that sequence in addition to GPO triplets is important in defining GpVI binding. The Toolkits have been applied to the plasma protein vWF (von Willebrand factor), which binds to only a single sequence, identified by truncation and amino acid substitution within Toolkit peptides as GXRGQOGVMGFO in collagens II and III. Intriguingly, the receptor tyrosine kinase DDR2 (discoidin domain receptor 2) recognizes three sites in collagen II, including the vWF-binding site, although the amino acids that support the interaction differ slightly within this motif. Furthermore, the secreted protein BM-40 (basement membrane protein 40) also binds well to this same region. Thus the availability of extracellular collagen-binding proteins may be important in regulating and facilitating direct collagen-receptor interaction.
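The Toolkit design referred to above is, at its core, a sliding window over the collagen primary sequence: 27-residue peptides advancing in steps of 18 residues, so that each overlaps its neighbour by nine. A minimal sketch of that windowing follows; the demo sequence is invented for illustration (it is not real collagen II or III sequence), and the flanking host sequences of the actual Toolkit peptides are omitted.

def toolkit_windows(sequence, length=27, overlap=9):
    """Split a collagen-like sequence into overlapping Toolkit-style windows.

    Each window is `length` residues long and overlaps its neighbour by
    `overlap` residues (step = length - overlap = 18 for the real Toolkits).
    """
    step = length - overlap
    return [(start + 1, sequence[start:start + length])          # 1-based start position
            for start in range(0, len(sequence) - length + 1, step)]

# Invented GPO/GER-rich stretch, purely illustrative, not collagen II/III sequence.
demo = "GPOGPOGERGPOGAOGERGPO" * 5
for start, peptide in toolkit_windows(demo):
    print(f"residue {start:>3}: {peptide}")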

Relevance:

10.00%

Publisher:

Abstract:

Diffuse contaminants can make their way into rivers via a number of different pathways, including overland flow, interflow, and shallow and deep groundwater. Identification of the key pathway(s) delivering contaminants to a receptor is important for implementing effective water management strategies. The ‘Pathways Project’, funded by the Irish Environmental Protection Agency, is developing a catchment management tool that will enable practitioners to identify the critical source areas for diffuse contaminants, and the key pathways of interest in assessing contaminant problems on a catchment and sub-catchment scale.
One of the aims of the project is to quantify the flow and contaminant loadings being delivered to the stream via each of the main pathways. Chemical separation of stream event hydrographs is being used to supplement more traditional physical hydrograph separation methods. Distinct, stable chemical signatures are derived for each of the pathway end members, and the proportion of flow from each during a rainfall event can be determined using a simple mass balance approach.
Event sampling was carried out in a test catchment underlain by poorly permeable soils and bedrock, which is predominantly used for grazing with a number of one-off rural residential houses. Results show that artificial field drainage, which includes subterranean land drains and collector drains around the perimeters of the 1 to 10 ha fields, plays an important role in the delivery of flow and nutrients to the streams in these types of hydrogeological settings.
Nitrate infiltrates with recharge and is delivered to the stream primarily via the artificial drains and the shallow groundwater pathway. Longitudinal stream profiles show that the nitrate load input is relatively uniform over the 8 km length of the stream at high flows, suggesting widespread diffuse contaminant input. In contrast, phosphorus is adsorbed in the clay-rich soil and is transported mainly via the overland flow pathway and the artificial drains. Longitudinal stream profiles for phosphorus suggest a pattern of more discrete points of phosphorus inputs, which may be related to point sources of contamination.
These techniques have application elsewhere within a toolkit of methods for determining the key pathways delivering contaminants to surface water receptors.
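The chemical hydrograph separation mentioned in the abstract above reduces, for a small number of end members with distinct conservative-tracer signatures, to a linear mass balance: the pathway flow fractions sum to one and reproduce the measured in-stream concentrations. A minimal three-end-member sketch follows; the tracer signatures and stream concentrations are invented illustrative values, not Pathways Project data.

import numpy as np

# Hypothetical end-member signatures (rows: tracers, columns: pathways).
#                     overland  drains  shallow groundwater
signatures = np.array([
    [0.05, 0.40, 0.90],   # conservative tracer 1 (arbitrary units)
    [0.80, 0.35, 0.10],   # conservative tracer 2 (arbitrary units)
])
stream_conc = np.array([0.45, 0.38])   # concentrations measured in the stream during the event

# Mass balance: the pathway fractions sum to 1 and reproduce the stream chemistry.
A = np.vstack([signatures, np.ones(3)])
b = np.append(stream_conc, 1.0)
fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
print(dict(zip(["overland flow", "artificial drains", "shallow groundwater"],
               fractions.round(3))))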

Relevance:

10.00%

Publisher:

Abstract:

Although a substantial corpus of digital materials is now available to scholarship across the disciplines, objective evidence of their use, impact, and value, based on a robust assessment, is sparse. Traditional methods of assessment of impact in the humanities, notably citation in scholarly publications, are not an effective way of assessing the impact of digital content. These issues are problematic in the field of Digital Humanities, where there is a need to assess impact effectively in order to justify its continued funding and existence. A number of qualitative and quantitative methods exist that can be used to monitor the use of digital resources in various contexts, although they have yet to be applied widely. These have been made available to the creators, managers, and funders of digital content in an accessible form through the TIDSR (Toolkit for the Impact of Digital Scholarly Resources) developed by the Oxford Internet Institute. In 2011, the authors of this article developed the SPHERE project (Stormont Parliamentary Hansards: Embedded in Research and Education) specifically to use TIDSR to evaluate the use and impact of The Stormont Papers, a digital collection of the Hansards of the Stormont Northern Irish Parliament from 1921 to 1972. This article presents the methodology, findings, and analysis of the project. The authors argue that TIDSR is a useful and, critically, transferable method to understand and increase the impact of digital resources. The findings of the project are developed into a series of wider recommendations on protecting the investment in digital resources by increasing their use, value, and impact. It is reasonable to suggest that effectively showing the impact of Digital Humanities is critical to its survival.

Relevance:

10.00%

Publisher:

Abstract:

Dendritic molecules have well defined, three-dimensional branched architectures, and constitute a unique nanoscale toolkit. This review focuses on examples in which individual dendritic molecules are assembled into more complex arrays via non-covalent interactions. In particular, it illustrates how the structural information programmed into the dendritic architecture controls the assembly process, and as a consequence, the properties of the supramolecular structures which are generated. Furthermore, the review emphasises how the use of non-covalent (supramolecular) interactions provides the assembly process with reversibility, and hence a high degree of control. The review also illustrates how self-assembly offers an ideal approach for amplifying the branching of small, synthetically accessible, relatively inexpensive dendritic systems (e.g. dendrons) into highly branched, complex nanoscale assemblies.

The review begins by considering the assembly of dendritic molecules to generate discrete, well-defined supramolecular assemblies. The variety of possible assembled structures is illustrated, and the ability of an assembled structure to encapsulate a templating unit is described. The ability of both organic and inorganic building blocks to direct the assembly process is discussed. The review then describes larger discrete assemblies of dendritic molecules, which do not exist as a single well-defined species, but instead exist as statistical distributions. For example, assembly around nanoparticles, the assembly of amphiphilic dendrons and the assembly of dendritic systems in the presence of DNA will all be discussed. Finally, the review examines dendritic molecules, which assemble or order themselves into extended arrays. Such systems extend beyond the nanoscale into the microscale or even the macroscale domain, exhibiting a wide range of different architectures. The ability of these assemblies to act as gel-phase or liquid crystalline materials will be considered.

Taken as a whole, this review emphasises the control and tunability that underpins the assembly of nanomaterials using dendritic building blocks, and furthermore highlights the potential future applications of these assemblies at the interfaces between chemistry, biology and materials science. 

Relevance:

10.00%

Publisher:

Abstract:

Identifying responsibilities for classes during the object-oriented software design phase is a crucial task. This paper proposes an approach for producing high-quality, robust behavioural diagrams (e.g. sequence diagrams) through Class Responsibility Assignment (CRA). GRASP, the General Responsibility Assignment Software Pattern (or Principle), was used to direct the CRA process when deriving behavioural diagrams. A set of tools to support CRA was developed to provide designers and developers with a cognitive toolkit for analysing and designing object-oriented software. The tool developed is called Use Case Specification to Sequence Diagrams (UC2SD). UC2SD uses a new approach for developing Unified Modelling Language (UML) software designs from natural language, making use of a meta-domain oriented ontology, well-established software design principles and established Natural Language Processing (NLP) tools. UC2SD generates well-formed UML sequence diagrams as output.
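UC2SD's pipeline (NLP over use-case specifications, a meta-domain ontology and GRASP-guided CRA) is considerably richer than anything shown here, but the core GRASP heuristic it draws on, Information Expert, can be sketched very simply: assign a responsibility to the class that owns most of the data it needs. The classes, attributes and candidate responsibilities below are invented examples, not artefacts of the tool.

# Minimal illustration of the GRASP "Information Expert" heuristic behind CRA:
# assign each candidate responsibility to the class owning most of the data it needs.
classes = {
    "Order":    {"items", "total", "customer_id"},
    "Customer": {"name", "address", "customer_id"},
    "Payment":  {"amount", "method", "order_id"},
}

responsibilities = {
    "calculate total":   {"items", "total"},
    "format address":    {"name", "address"},
    "authorise payment": {"amount", "method"},
}

def information_expert(needed_data):
    # Pick the class whose attributes cover the largest share of the required data.
    return max(classes, key=lambda c: len(classes[c] & needed_data))

for responsibility, data in responsibilities.items():
    print(f"{responsibility!r} -> {information_expert(data)}")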

Relevance:

10.00%

Publisher:

Abstract:

Background: Heckman-type selection models have been used to control HIV prevalence estimates for selection bias when participation in HIV testing and HIV status are associated after controlling for observed variables. These models typically rely on the strong assumption that the error terms in the participation and the outcome equations that comprise the model are distributed as bivariate normal.
Methods: We introduce a novel approach for relaxing the bivariate normality assumption in selection models using copula functions. We apply this method to estimating HIV prevalence and new confidence intervals (CI) in the 2007 Zambia Demographic and Health Survey (DHS) by using interviewer identity as the selection variable that predicts participation (consent to test) but not the outcome (HIV status).
Results: We show in a simulation study that selection models can generate biased results when the bivariate normality assumption is violated. In the 2007 Zambia DHS, HIV prevalence estimates are similar irrespective of the structure of the association assumed between participation and outcome. For men, we estimate a population HIV prevalence of 21% (95% CI = 16%–25%) compared with 12% (11%–13%) among those who consented to be tested; for women, the corresponding figures are 19% (13%–24%) and 16% (15%–17%).
Conclusions: Copula approaches to Heckman-type selection models are a useful addition to the methodological toolkit of HIV epidemiology and of epidemiology in general. We develop the use of this approach to systematically evaluate the robustness of HIV prevalence estimates based on selection models, both empirically and in a simulation study.
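As a toy illustration of the selection problem these models address (not the authors' copula estimator), the snippet below simulates HIV status and consent-to-test whose dependence comes from a Clayton copula rather than a bivariate normal, so that refusal is associated with positive status and the naive prevalence among consenters is biased downward relative to the true prevalence. All parameter values are invented.

import numpy as np

rng = np.random.default_rng(1)
n, true_prev, consent_rate, theta = 200_000, 0.20, 0.75, 2.0   # invented illustrative values

# Draw dependent uniforms (u for status, v for consent) from a Clayton copula
# using the conditional-inverse method; theta > 0 gives positive dependence.
u = rng.uniform(size=n)
t = rng.uniform(size=n)
v = (u ** -theta * (t ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)

hiv_positive = u < true_prev            # latent HIV status
consents = v > (1 - consent_rate)       # consent to test, dependent on status via the copula

naive = hiv_positive[consents].mean()   # what a survey of consenters alone would report
print(f"true prevalence: {true_prev:.3f}   naive estimate among consenters: {naive:.3f}")

A full analysis would, as in the paper, use an exclusion restriction such as interviewer identity to model participation and estimate the dependence structure rather than assume it.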

Relevance:

10.00%

Publisher:

Abstract:

We present a simple model for a component of the radiolytic production of any chemical species due to electron emission from irradiated nanoparticles (NPs) in a liquid environment, provided the expression for the G value for product formation is known and is reasonably well characterized by a linear dependence on beam energy. This model takes nanoparticle size, composition, density and a number of other readily available parameters (such as X-ray and electron attenuation data) as inputs and therefore allows for the ready determination of this contribution. Several approximations are used; the model therefore provides an upper limit to the yield of chemical species due to electron emission, rather than a definite value, and this upper limit is compared with experimental results. After the general model is developed we provide details of its application to the generation of HO(•) through irradiation of gold nanoparticles (AuNPs), a potentially important process in nanoparticle-based enhancement of radiotherapy. This model has been constructed with the intention of making it accessible to other researchers who wish to estimate chemical yields through this process, and is shown to be applicable to NPs of single elements and mixtures. The model can be applied without the need to develop additional skills (such as using a Monte Carlo toolkit), providing a fast and straightforward method of estimating chemical yields. A simple framework for determining the HO(•) yield for different NP sizes at constant NP concentration and initial photon energy is also presented.
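The abstract does not spell out the model's equations, but the general shape of an upper-limit estimate of this kind is N_species <= sum_i G(E_i) * E_i, i.e. every emitted electron is assumed to deposit all of its energy in the surrounding water and the G value is taken to be linear in energy. The sketch below shows only that generic form; the coefficients and the electron spectrum are illustrative placeholders, not the authors' parametrisation.

# Hedged sketch of an upper-limit radiolytic yield of the generic form
# N_species <= sum_i G(E_i) * E_i, with a G value linear in energy.
# Coefficients and the emitted-electron spectrum are purely illustrative placeholders.

def g_value(e_ev, a=2.5e-2, b=1.0e-7):
    """Illustrative linear G value in molecules/eV: G(E) = a + b*E."""
    return a + b * e_ev

def upper_limit_yield(electron_energies_ev):
    """Upper bound on product molecules if each electron deposits all of its energy."""
    return sum(g_value(e) * e for e in electron_energies_ev)

# Hypothetical energies of electrons emitted per AuNP per incident photon (eV).
emitted = [250.0, 800.0, 2_000.0, 9_500.0]
print(f"upper-limit HO(.) molecules per NP per photon: {upper_limit_yield(emitted):.1f}")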

Relevance:

10.00%

Publisher:

Abstract:

Recent improvements in the speed, cost and accuracy of next generation sequencing are revolutionizing the discovery of single nucleotide polymorphisms (SNPs). SNPs are increasingly being used as an addition to the molecular ecology toolkit in nonmodel organisms, but their efficient use remains challenging. Here, we discuss common issues when employing SNP markers, including the high numbers of markers typically employed, the effects of ascertainment bias and the inclusion of nonneutral loci in a marker panel. We provide a critique of considerations specifically associated with the application and population genetic analysis of SNPs in nonmodel taxa, focusing specifically on some of the most commonly applied methods.
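One of the issues flagged above, ascertainment bias, is easy to illustrate by simulation: SNPs discovered in a small panel are skewed towards common variants, which distorts the allele frequency spectrum used in downstream analyses. The sketch below uses a toy 1/i site-frequency spectrum and an invented discovery-panel size; it is illustrative only.

import numpy as np

rng = np.random.default_rng(7)

# Toy neutral-ish site frequency spectrum: P(derived allele count = i) proportional to 1/i.
two_n = 200
counts = np.arange(1, two_n)
probs = (1 / counts) / (1 / counts).sum()
freqs = rng.choice(counts, size=50_000, p=probs) / two_n

# Ascertainment: keep only SNPs observed as polymorphic in a small discovery panel.
panel_size = 4                              # chromosomes in the discovery panel (invented)
hits = rng.binomial(panel_size, freqs)
ascertained = (hits > 0) & (hits < panel_size)

maf = np.minimum(freqs, 1 - freqs)
print(f"mean MAF, all simulated SNPs: {maf.mean():.3f}")
print(f"mean MAF, ascertained SNPs:   {maf[ascertained].mean():.3f}")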

Relevance:

10.00%

Publisher:

Abstract:

This paper builds on previous work to show how holistic and iterative design optimisation tools can be used to produce a commercially viable product that reduces a costly assembly to a single moulded structure. An assembly consisting of a structural metallic support and a compression-moulded outer shell undergoes design optimisation and analysis to remove the support from the assembly process in favour of a structural moulding. The support is analysed and a sheet moulding compound (SMC) alternative is presented; this is then combined into a manufacturable shell design, which is assessed for viability as an alternative to the original.
Alongside this, a robust material selection system is implemented that removes user bias towards particular materials. The system builds on work by the Cambridge Material Selector and by Boothroyd and Dewhurst, using a selection of applicable materials currently available for the compression moulding process. This material selection process has been linked to the design and analysis stage via scripts for use in the finite element environment, building towards an analysis toolkit intended to improve the manufacturability of design studies.
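The material selection step described above builds on Cambridge Material Selector (Ashby-style) ideas; a typical screening and ranking pass can be sketched with a performance index, here E^(1/3)/rho for a stiffness-limited panel. The candidate materials and property values below are rough illustrative placeholders, not data from the paper's selection system.

# Hedged sketch of an Ashby-style ranking step for a stiffness-limited panel:
# maximise M = E^(1/3) / rho. Property values are rough illustrative placeholders.
candidates = {
    # name: (Young's modulus E in GPa, density rho in g/cm^3)
    "SMC (glass/polyester)": (11.0, 1.85),
    "GMT (glass/PP)":        (6.0,  1.20),
    "Carbon-fibre SMC":      (38.0, 1.55),
    "Aluminium 5052":        (70.0, 2.68),
}

def panel_stiffness_index(e_gpa, rho):
    return e_gpa ** (1 / 3) / rho

ranked = sorted(candidates.items(),
                key=lambda kv: panel_stiffness_index(*kv[1]), reverse=True)
for name, (e, rho) in ranked:
    print(f"{name:<22} M = {panel_stiffness_index(e, rho):.2f}")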

Relevance:

10.00%

Publisher:

Abstract:

A detailed description of the electroluminescence process is a prerequisite for the optimisation of gaseous detectors for imaging systems, astrophysics, high-energy physics and rare-event experiments. In this work, a new and versatile platform for simulating light emission during electron drift in noble gases is presented and characterised, developed using the Magboltz and Garfield programs. Intrinsic properties of electroluminescence in noble gases are calculated and presented as a function of the applied electric field, namely the efficiencies, the yield and the associated statistical fluctuations. The results obtained are in close agreement with experimental data and with previous Monte Carlo simulations. The platform is used to determine the optimal operating conditions of detectors such as NEXT (Neutrino Experiment with a Xenon TPC) and others based on the GEM (Gas Electron Multiplier) and MHSP (Micro-Hole & Strip Plate) microstructures.
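For context on the quantities such a platform computes, the electroluminescence yield of a uniform-field xenon gap is commonly parametrised as linear in the reduced electric field above a scintillation threshold. The sketch below uses that generic parametrisation with illustrative coefficients and an illustrative operating point; it does not reproduce the simulation platform or its results.

# Hedged sketch: EL yield in a uniform-field xenon gap is often parametrised as
# Y/p ~ a*(E/p) - b  [photons / (electron * cm * bar)] above a threshold field.
# Coefficients, threshold and operating point below are illustrative placeholders.

def reduced_el_yield(e_over_p, a=140.0, b=116.0, threshold=0.83):
    """Photons per electron per cm per bar, for reduced field E/p in kV/(cm*bar)."""
    return max(0.0, a * e_over_p - b) if e_over_p > threshold else 0.0

pressure_bar = 10.0   # assumed gas pressure
gap_cm = 0.5          # assumed EL gap length
e_over_p = 2.5        # assumed reduced field, kV/(cm*bar)

photons_per_electron = reduced_el_yield(e_over_p) * pressure_bar * gap_cm
print(f"~{photons_per_electron:.0f} EL photons per drifting electron (illustrative)")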

Relevance:

10.00%

Publisher:

Abstract:

Designers today face numerous challenges, one of which is to develop products that meet the market's needs without setting aside concerns for sustainability, whether environmental, socio-ethical or economic. The race to reach new levels of prominence in the market has also driven the spread of new sustainable production technologies as a differentiating factor which, besides preserving nature, wins over a wide range of customers sensitive to these concerns. The objective of this work is the creation of fashion and decoration design products and their assessment as sustainable design. In this context, the work aims to develop fashion and decoration products made with the textile bonding technique, resulting in innovative, unique, timeless, designed products. The research was carried out in partnership with the Banco de Vestuário of Caxias do Sul, which provided data for a study of the textile sector and a survey of the region's textile-industry waste to characterise the sector, and also supplied the textile waste that served as the basis for creating more sustainable fashion and decoration products. To create the bonded textile surfaces, samples were produced with temperature and time tests to determine the combination best suited to each type of raw material. Besides making use of this waste, the research relied on the work of artisans, who in turn were trained in the textile bonding technique through two workshops. After this stage, the sustainability levels of the products were analysed with the Sustainability Design Orienting Toolkit (SDO), a set of tools that aims to guide designers in developing products in terms of environmental, socio-ethical and economic sustainability.

Relevance:

10.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2015

Relevance:

10.00%

Publisher:

Abstract:

Public Display Systems (PDS) have an increasingly strong presence in our cities. These systems provide information and advertising tailored to audiences in spaces such as airports, train stations, and shopping centers. A large number of public displays are also being deployed for entertainment. Designing and prototyping PDS can be a laborious, complex, and costly task. This dissertation focuses on the design and evaluation of PDS at early development phases, with the aim of facilitating low-effort, rapid design and evaluation of interactive PDS. The study centres on the IPED Toolkit, a tool for designing, prototyping, and evaluating public display systems by replicating real-world scenes in the lab. The research aims to identify the benefits and drawbacks of different means of placing overlays/virtual displays over panoramic video footage recorded at real-world locations. The means of interaction studied in this work are, on the one hand, keyboard and mouse and, on the other hand, a tablet with two different techniques of use. To carry out this study, an Android application was developed that allows users to interact with the IPED Toolkit using the tablet. Additionally, the toolkit has been modified and adapted to tablets using different web technologies. Finally, the user study compares the different means of interaction.

Relevance:

10.00%

Publisher:

Abstract:

Existing parking simulations, like most simulations, are intended to give insight into a system or to make predictions. The knowledge they provide has built up over the years, and several research works have devised detailed parking-system models. This thesis describes the use of an agent-based parking simulation in the context of a larger parking system development. It focuses more on flexibility than on fidelity, showing a case where it is relevant for a parking simulation to consume dynamically changing GIS data from external, online sources, and how to address this case. The simulation generates the parking occupancy information that sensing technologies should eventually produce and supplies it to the larger parking system. It is built as a Java application based on the MASON toolkit and consumes GIS data from an ArcGIS Server. The application context of the implemented parking simulation is a university campus with free, on-street parking places.
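The thesis work itself is a Java application built on the MASON toolkit and fed by ArcGIS Server data; as a language-agnostic illustration of the underlying agent-based occupancy loop only, here is a toy time-stepped sketch with invented arrival rates, stay times and street capacity.

import random

# Toy sketch of an agent-based on-street parking occupancy loop. This is not the
# thesis implementation (Java/MASON with ArcGIS data); all parameters are invented.
random.seed(42)
capacity = 120                 # on-street places on the simulated campus streets
arrival_rate = 8               # expected car arrivals per time step
occupied = []                  # remaining parking durations of currently parked cars

for step in range(50):
    occupied = [t - 1 for t in occupied if t > 1]               # cars whose stay has ended leave
    arrivals = sum(1 for _ in range(60) if random.random() < arrival_rate / 60)
    for _ in range(arrivals):
        if len(occupied) < capacity:                            # park only if a place is free
            occupied.append(random.randint(5, 30))              # stay between 5 and 30 steps
    if step % 10 == 0:
        print(f"step {step:>2}: occupancy {len(occupied)}/{capacity}")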