379 results for Toolkit
Abstract:
Stable isotope geochemistry is a valuable toolkit for addressing a broad range of problems in the geosciences. Recent technical advances provide information that was previously unattainable or offer unprecedented precision and accuracy. Two such techniques are site-specific stable isotope mass spectrometry and clumped isotope thermometry. In this thesis, I use site-specific isotope and clumped isotope data to explore natural gas development and carbonate reaction kinetics. In the first chapter, I develop an equilibrium thermodynamics model to calculate equilibrium constants for isotope exchange reactions in small organic molecules. These equilibrium data provide a framework for interpreting the more complex data in the later chapters. In the second chapter, I demonstrate a method for measuring site-specific carbon isotopes in propane using high-resolution gas source mass spectrometry. The method relies on the characteristic fragments created during electron ionization, from which I measure the relative isotopic enrichment of separate parts of the molecule; it will be applicable to a range of organic compounds in the future. In the third chapter, I use this technique to explore diffusion, mixing, and other natural processes in natural gas basins. As time progresses and the mixture matures, different components such as kerogen and oil contribute to the propane in a natural gas sample. Each component imparts a distinct fingerprint on the site-specific isotope distribution within propane, which I can observe to understand the source composition and maturation of the basin. Finally, in Chapter Four, I study the reaction kinetics of clumped isotopes in aragonite. Although aragonite is frequently used as a clumped isotope thermometer, its blocking temperature is not known. Using laboratory heating experiments, I determine that the aragonite clumped isotope thermometer has a blocking temperature of 50-100 °C. I compare this result to natural samples from the San Juan Islands that exhibit a maximum clumped isotope temperature matching this blocking temperature. This thesis presents a framework for measuring site-specific carbon isotopes in organic molecules and new constraints on aragonite reaction kinetics, laying the foundation for a future generation of geochemical tools for the study of complex geologic systems.
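Hedged illustration: the abstract reports a 50-100 °C blocking temperature but gives no kinetic parameters, so the sketch below uses a generic first-order Arrhenius reordering model with hypothetical values of the activation energy and pre-exponential factor (chosen only so the output lands in the reported range), not the thesis's fitted kinetics.

```python
# Minimal sketch: first-order Arrhenius model for clumped isotope
# reordering, used to estimate a blocking temperature. The kinetic
# parameters are hypothetical placeholders, not fitted values.
import numpy as np

R = 8.314            # gas constant, J/(mol*K)
Ea = 150e3           # activation energy, J/mol (hypothetical)
A = 1e9              # pre-exponential factor, 1/s (hypothetical)

def rate(T_kelvin):
    """First-order reordering rate constant k(T) = A * exp(-Ea / RT)."""
    return A * np.exp(-Ea / (R * T_kelvin))

def blocking_temperature(t_hold_seconds):
    """Temperature at which the reordering timescale 1/k equals the
    holding time, i.e. where k(T) * t_hold = 1."""
    return Ea / (R * np.log(A * t_hold_seconds))

t_hold = 1e6 * 365.25 * 24 * 3600    # ~1 Myr holding time, in seconds
Tb = blocking_temperature(t_hold)
print(f"Blocking temperature ~ {Tb - 273.15:.0f} C for a 1 Myr hold")
```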
Abstract:
Geotechnical engineering is one of the major fields of civil engineering; it studies the interaction of man-made structures and natural phenomena with the geological environment, which in the great majority of cases consists of partially saturated soils. The performance of works such as stabilization, dam containment, retaining walls, foundations, and roads therefore depends on a correct prediction of water flow within the soil. However, since the regions to be studied for water-flow prediction commonly span areas on the order of square kilometers, solving the mathematical models demands very large computational meshes, leading to serious limitations in computer memory and processing time. To overcome these limitations, efficient numerical methods must be employed, namely iterative methods for large sparse nonlinear and linear systems. In short, given the relevance of the topic, this research approximated a solution of the Richards partial differential equation by the finite volume method in two dimensions, employing the Picard and Newton methods with improved computational efficiency. To that end, Krylov-subspace iterative techniques for linear systems with preconditioning matrices were used through the numerical library Portable, Extensible Toolkit for Scientific Computation (PETSc). The results indicate that, when the Richards equation is solved with the Picard-Krylov method, regardless of the soil evaluation model, the best combination for solving the linear systems is the stabilized biconjugate gradient method together with the SOR preconditioner. On the other hand, when the van Genuchten equations are used, the combination of the conjugate gradient method with the SOR preconditioner should be chosen. When the Newton-Krylov method is adopted, the stabilized biconjugate gradient method is the most efficient for solving the linear system of the Newton step, and with respect to the preconditioner, preference should be given to block Jacobi. Finally, there is evidence that the Picard-Krylov method can be more advantageous than the Newton-Krylov method when applied to the solution of the Richards partial differential equation.
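A minimal sketch of the Picard-Krylov loop the abstract describes, written with SciPy instead of PETSc so it stays self-contained: one implicit time step of the 1D head-based Richards equation with van Genuchten-Mualem properties, with each Picard iteration solved by BiCGStab. An ILU preconditioner stands in for the SOR and block-Jacobi preconditioners discussed in the abstract; grid, time step, boundary heads, and soil parameters are illustrative.

```python
# One implicit time step of the 1D head-based Richards equation,
# solved by Picard iteration with a Krylov method (BiCGStab).
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import bicgstab, spilu, LinearOperator

# van Genuchten-Mualem soil functions (sample loam-like parameters)
alpha, nv, Ks, th_r, th_s = 3.6, 1.56, 2.9e-6, 0.078, 0.43
mv = 1.0 - 1.0 / nv

def Se(h):                     # effective saturation
    return (1.0 + (alpha * np.maximum(-h, 0.0)) ** nv) ** (-mv)

def theta(h):                  # water content
    return th_r + (th_s - th_r) * Se(h)

def K(h):                      # hydraulic conductivity
    s = Se(h)
    return Ks * np.sqrt(s) * (1.0 - (1.0 - s ** (1.0 / mv)) ** mv) ** 2

def C(h):                      # specific moisture capacity d(theta)/dh
    return (theta(h + 1e-6) - theta(h - 1e-6)) / 2e-6

N, dz, dt = 101, 0.01, 60.0    # nodes, spacing (m), time step (s)
h_old = np.full(N, -10.0)      # initially dry column
h_top, h_bot = -0.1, -10.0     # Dirichlet heads: wetting front at the top
h = h_old.copy()

for picard in range(50):       # Picard: freeze K and C at the last iterate
    k = K(h)
    kf = 0.5 * (k[:-1] + k[1:])            # interface conductivities
    c = np.maximum(C(h), 1e-12)
    main = c / dt
    main[1:-1] += (kf[:-1] + kf[1:]) / dz ** 2
    off = -kf / dz ** 2
    A = sp.diags([off, main, off], [-1, 0, 1]).tolil()
    rhs = c / dt * h_old
    rhs[1:-1] += (kf[1:] - kf[:-1]) / dz   # gravity contribution
    A[0, :], A[0, 0], rhs[0] = 0.0, 1.0, h_bot      # Dirichlet rows
    A[-1, :], A[-1, -1], rhs[-1] = 0.0, 1.0, h_top
    A = A.tocsc()
    M = LinearOperator(A.shape, spilu(A).solve)     # ILU preconditioner
    h_new, info = bicgstab(A, rhs, M=M)             # Krylov solve
    converged = np.max(np.abs(h_new - h)) < 1e-6
    h = h_new
    if converged:
        break
print("Picard iterations used:", picard + 1)
```

In PETSc itself, the pairings the abstract reports would correspond to runtime options along the lines of -ksp_type bcgs -pc_type sor, with -pc_type bjacobi for the Newton-Krylov variant.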
Abstract:
Recent research into the acquisition of spoken language has stressed the importance of learning through embodied linguistic interaction with caregivers rather than through passive observation. However, the necessity of interaction makes experimental work on the simulation of infant speech acquisition difficult because of the technical complexity of building real-time embodied systems. In this paper we present KLAIR: a software toolkit for building simulations of spoken language acquisition through interactions with a virtual infant. The main part of KLAIR is a sensori-motor server that supplies a client machine learning application with a virtual infant on screen that can see, hear and speak. By encapsulating the real-time complexities of audio and video processing within a server that runs on a modern PC, we hope that KLAIR will encourage and facilitate more experimental research into spoken language acquisition through interaction. Copyright © 2009 ISCA.
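The abstract does not expose KLAIR's client API, so the following is a purely hypothetical skeleton of the described split: a learning client that exchanges percepts and motor commands with the sensori-motor server over a socket. All names and the message format are invented, and a server listening at the given address is assumed.

```python
# Hypothetical client skeleton illustrating the client-server split the
# abstract describes; none of these message names come from KLAIR itself.
import json
import socket

def choose_articulation(percept):
    # Placeholder policy: babble a fixed vowel gesture regardless of input.
    return {"type": "articulate", "gesture": "vowel_a", "duration_ms": 300}

def run_client(host="localhost", port=5005):
    """Loop forever: read a percept line, reply with a motor command."""
    with socket.create_connection((host, port)) as sock:
        f = sock.makefile("rw")
        while True:
            percept = json.loads(f.readline())       # audio/video features
            f.write(json.dumps(choose_articulation(percept)) + "\n")
            f.flush()
```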
Abstract:
When considering the potential uptake and utilization of technology management tools by industry, it must be recognized that companies face the difficult challenges of selecting, adopting and integrating individual tools into a toolkit that must be implemented within their current organizational processes and systems. This situation is compounded by the lack of sound advice on integrating well-founded individual tools into a robust toolkit that has the necessary degree of flexibility such that they can be tailored for application to specific problems faced by individual organizations. As an initial stepping stone to offering a toolkit with empirically proven utility, this paper provides a conceptual foundation to the development of toolkits by outlining an underlying philosophical position based on observations from multiple research and commercial collaborations with industry. This stance is underpinned by a set of operationalized principles that can offer guidance to organizations when deciding upon the appropriate form, functions and features that should be embodied by any potential tool/toolkit. For example, a key objective of any tool is to aid decision-making, and a core set of powerful, flexible, scalable and modular tools should be sufficient to allow users to generate, explore, shape and implement possible solutions across a wide array of strategic issues. From our philosophical stance, the preferred mode of engagement is facilitated workshops with a participatory process that enables multiple perspectives and structures the conversation through visual representations in order to manage the cognitive load in the collaborative environment. The generic form of the tools should be configurable for the given context and utilized in a lightweight manner based on the premise of 'start small and iterate fast'. © 2011 IEEE.
Abstract:
Advances in genome technology have facilitated a new understanding of the historical and genetic processes crucial to rapid phenotypic evolution under domestication(1,2). To understand the process of dog diversification better, we conducted an extensive genome-wide survey of more than 48,000 single nucleotide polymorphisms in dogs and their wild progenitor, the grey wolf. Here we show that dog breeds share a higher proportion of multi-locus haplotypes unique to grey wolves from the Middle East, indicating that Middle Eastern wolves, rather than wolves from east Asia as suggested by mitochondrial DNA sequence data(3), were a dominant source of genetic diversity for dogs. Furthermore, we find a surprising correspondence between genetic and phenotypic/functional breed groupings, but there are exceptions that suggest phenotypic diversification depended in part on the repeated crossing of individuals with novel phenotypes. Our results show that Middle Eastern wolves were a critical source of genome diversity, although interbreeding with local wolf populations clearly occurred elsewhere in the early history of specific lineages. More recently, the evolution of modern dog breeds seems to have been an iterative process that drew on a limited genetic toolkit to create remarkable phenotypic diversity.
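A toy illustration of the haplotype-sharing statistic described above: the fraction of a breed's multi-locus haplotypes that also occur in each wolf population panel. The arrays are randomly invented stand-ins, not the study's 48,000-SNP data, and real analyses use windowed haplotypes and more careful sharing measures.

```python
# Toy computation of haplotype sharing between a dog breed sample and
# two wolf population panels; all haplotypes here are random stand-ins.
import numpy as np

rng = np.random.default_rng(0)
dog = {tuple(rng.integers(0, 2, 10)) for _ in range(200)}
wolves = {
    "Middle East": {tuple(rng.integers(0, 2, 10)) for _ in range(500)},
    "East Asia": {tuple(rng.integers(0, 2, 10)) for _ in range(500)},
}
for pop, panel in wolves.items():
    shared = len(dog & panel) / len(dog)   # fraction of shared haplotypes
    print(f"{pop}: {shared:.2%} of dog haplotypes shared")
```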
Abstract:
Super-resolution imaging techniques such as Fluorescent Photo-Activation Localisation Microscopy (FPALM) have created a powerful new toolkit for investigating living cells; however, a simple platform for growing, trapping, holding and controlling the cells is needed before the approach can become truly widespread. We present a microfluidic device formed in polydimethylsiloxane (PDMS) whose fluidic design traps cells in a high-density array of wells and holds them very still throughout the life cycle, using hydrodynamic forces only. The device meets or exceeds all the necessary criteria for FPALM imaging of Schizosaccharomyces pombe and is designed to remain flexible, robust and easy to use. © 2011 IEEE.
Abstract:
Time and budget constraints frequently prevent designers from consulting with end-users while assessing the ease of use of the products they create. This has resulted in solutions that are difficult to use for a wide range of users, especially the growing older adult population and people with different types of impairments. To help designers with this problem, capability-loss simulators have been developed with the aim of temporarily representing users who are otherwise difficult to access. This paper questions the reliability of existing tools in providing designers with meaningful information about users' capabilities. Consequently, a new capability-loss simulation toolkit is presented, followed by its empirical evaluation. The new toolkit proved significantly helpful to a group of designers in identifying real usability problems with everyday devices. © 2012 Copyright Taylor and Francis Group, LLC.
Abstract:
TRIZ (the theory of inventive problem solving) has been promoted by several enthusiasts as a systematic methodology or toolkit that provides a logical approach to developing creativity for innovation and inventive problem solving. The methodology, which emerged from Russia in the 1960s, has spread to over 35 countries across the world. It is now being taught in several universities and has been applied by a number of global organisations that have found it particularly useful for spurring new product development. However, while its popularity and attractiveness appear to be on a steady increase, there are practical issues that make the use of TRIZ in practice particularly challenging. These practical difficulties have largely been neglected in the TRIZ literature. This paper takes a step away from the conventional TRIZ literature by exploring not just the benefits associated with TRIZ knowledge, but also the challenges associated with its acquisition and application, based on practical experience. Through a survey, first-hand information is collected from people who have tried (successfully and unsuccessfully) to understand and apply the methodology. The challenges recorded cut across a number of issues, ranging from the complex nature of the methodology to underlying organisational and cultural issues that hinder its understanding and application. Another contribution of this paper, potentially useful for TRIZ beginners, is an indication of which of the several tools in the TRIZ toolkit are most useful to learn first, based on their observed degree of usage by the survey respondents. © 2012 Elsevier Ltd. All rights reserved.
Abstract:
When considering the potential uptake and utilization of technology management tools by industry, it must be recognized that companies face the difficult challenges of selecting, adopting and integrating individual tools into a toolkit that must be implemented within their current organizational processes and systems. This situation is compounded by the lack of sound advice on integrating well-founded individual tools into a robust toolkit that has the necessary degree of flexibility such that they can be tailored for application to specific problems faced by individual organizations. As an initial stepping stone to offering a toolkit with empirically proven utility, this paper provides a conceptual foundation to the development of toolkits by outlining an underlying philosophical position based on observations from multiple research and commercial collaborations with industry. This stance is underpinned by a set of operationalized principles that can offer guidance to organizations when deciding upon the appropriate form, functions and features that should be embodied by any potential tool/toolkit. For example, a key objective of any tool is to aid decision-making, and a core set of powerful, flexible, scalable and modular tools should be sufficient to allow users to generate, explore, shape and implement possible solutions across a wide array of strategic issues. From our philosophical stance, the preferred mode of engagement is facilitated workshops with a participatory process that enables multiple perspectives and structures the conversation through visual representations in order to manage the cognitive load in the collaborative environment. The generic form of the tools should be configurable for the given context and utilized in a lightweight manner based on the premise of 'start small and iterate fast'. © 2012 Elsevier Inc.
Abstract:
Managers in technology-intensive businesses need to make decisions in complex and dynamic environments. Many tools, frameworks and processes have been developed to support managers in these situations, leading to a proliferation of such approaches, with little consistency in terminology or theoretical foundation, and a lack of understanding of how such tools can be linked together to tackle management challenges in an integrated way. As a step towards addressing these issues, this paper proposes the concept of an integrated 'toolkit', incorporating generalized forms of three core technology management tools that support strategic planning (roadmapping, portfolio analysis and linked analysis grids). © 2006 World Scientific Publishing Company.
Abstract:
With the rapid development of network and information technology, security problems in the Internet environment are receiving ever more attention from government, military, and commercial sectors. Cryptography is the core technology of information security, and the design and implementation of cryptographic algorithms have long been important research topics in the information security community. In recent years, with the rapid development of computing technology and networks, computing tools and paradigms have changed, giving rise to new technologies such as distributed computing, grid computing, and cloud computing. At home and abroad, mainframes, parallel computing, and distributed computing have all been used to speed up cryptographic computation. Applying these new technologies to the design and implementation of new cryptographic computing platforms is of great significance for the design, analysis, and application of cryptographic algorithms. The application of distributed and grid computing in information security is the main subject of this thesis. First, by analyzing the characteristics of cryptographic computation and grid computing, the research goal is defined: to build a general-purpose, efficient, scalable, and portable distributed computing environment. Second, the characteristics of this goal and the problems encountered are analyzed, and the features and advantages of the adopted solutions are discussed in detail. On the basis of this analysis and design, the Gnomon distributed cryptographic computing environment is implemented on top of the Globus Toolkit and the SWT/JFace toolkits, and the function of each module is described in detail. Finally, two problem instances are designed and implemented: large-integer factorization and the Rainbow attack. Several groups of experiments were carried out on these instances, and the corresponding results and analyses are given. The Gnomon distributed computing environment supports distributed cryptographic computation, and its ease of use and extensibility bring convenience to cryptanalysts and algorithm designers. The results of this work advance research on, and the development of, distributed cryptographic computation.
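The abstract does not document Gnomon's task interface. As a hedged illustration of the kind of work unit a large-integer factorization job decomposes into, here is a standard Pollard's rho routine; independent runs with different random seeds are natural units to farm out, and the distribution layer itself is omitted.

```python
# Standard Pollard's rho factorization; each randomized run is an
# independent work unit suitable for distribution across machines.
import math
import random

def pollard_rho(n):
    """Return a nontrivial factor of composite n."""
    if n % 2 == 0:
        return 2
    while True:
        c = random.randrange(1, n)
        f = lambda x: (x * x + c) % n
        x = y = random.randrange(2, n)
        d = 1
        while d == 1:
            x = f(x)              # tortoise
            y = f(f(y))           # hare
            d = math.gcd(abs(x - y), n)
        if d != n:                # d == n means this run failed; retry
            return d

print(pollard_rho(8051))          # 8051 = 83 * 97
```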
Abstract:
The development of parallel computing models has introduced more and more model parameters. This paper studies the overall framework of DEMPAT, a software package for the dynamic collection and analysis of parallel computing model parameters, and implements a memory-hierarchy parameter collection module based on hardware performance counters. Experiments show that the module obtains memory-hierarchy parameters quickly and accurately and has good portability.
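The abstract does not name the counter interface DEMPAT uses. As a sketch of counter-based collection on Linux, the snippet below shells out to perf stat (a real tool, with real event names) and parses cache statistics for a child workload; DEMPAT's own module presumably reads counters directly rather than through a subprocess.

```python
# Minimal sketch of hardware-counter collection on Linux via `perf stat`;
# this only shows the general approach of reading cache events.
import subprocess

def cache_counts(cmd):
    """Run cmd under `perf stat` and return cache reference/miss counts."""
    events = "cache-references,cache-misses"
    res = subprocess.run(["perf", "stat", "-x", ",", "-e", events] + cmd,
                         capture_output=True, text=True)
    counts = {}
    for line in res.stderr.splitlines():       # perf writes stats to stderr
        fields = line.split(",")
        if (len(fields) > 2 and fields[2] in events.split(",")
                and fields[0].isdigit()):
            counts[fields[2]] = int(fields[0])
    return counts

print(cache_counts(["ls", "-R", "/usr/lib"]))
```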
Abstract:
Side-channel attacks are an important branch of cryptanalysis. Research and practice show that even when a cryptographic algorithm is secure in the mathematical sense, side-channel information leaked by an improper implementation can still lead to serious security risks. Power analysis is one such powerful and typical side-channel attack. Because it is highly effective in practice, it has attracted wide attention and quickly become a hot research topic in the side-channel field. This thesis studies the basic methods and key techniques for evaluating the effectiveness of power analysis attacks, aiming to understand the serious threat they pose by assessing their effectiveness, and thereby to provide foundational methods and supporting tools for the design and analysis of cryptographic systems. The main contributions are as follows.
First, a formal definition of the Gaussian distinguisher is given, the statistical properties of two typical distinguishers are characterized, and a quantitative metric, the degree of distinguishability, is proposed for evaluating the effectiveness of a class of typical differential power analysis (DPA) attacks. The distinguisher is an indispensable component of DPA and largely determines an attack's effectiveness. To characterize the statistical properties of distinguishers, the concept of the Gaussian distinguisher is proposed with a corresponding formal definition, and the practical characteristics of two typical distinguishers are studied; based on the Gaussian distinguisher, the distinguishability metric allows convenient quantitative measurement of the effectiveness of typical DPA attacks and partly resolves the difficulty of applying existing metrics of this kind. Extensive simulation experiments verify the soundness and feasibility of this characterization and measurement method.
Second, driven by the practical need to simulate, analyze, and evaluate typical power analysis attacks, a general DPA attack framework is designed and an extensible DPA simulation tool set, DPA Toolkit, is developed. The tool set supports simulated DPA attacks based on typical distinguishers such as the difference-of-means test, the Pearson correlation coefficient, and Bayesian decision, and can perform a preliminary evaluation of the simulated attack results. It offers a basic technical support tool for assessing a cryptographic module's resistance to power analysis, as well as a reference for building a comprehensive side-channel attack and evaluation experimental platform.
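A minimal sketch of the Pearson-correlation distinguisher the abstract lists, run on simulated traces: leakage is modeled as the Hamming weight of an S-box output plus Gaussian noise, and the correct key guess maximizes the correlation. The PRESENT cipher's 4-bit S-box stands in for AES to keep the table short; the trace count and noise level are invented.

```python
# Simulated correlation-based DPA: recover a 4-bit key nibble using the
# Hamming-weight leakage model and the Pearson correlation distinguisher.
import numpy as np

rng = np.random.default_rng(1)
SBOX = np.array([0xC, 5, 6, 0xB, 9, 0, 0xA, 0xD, 3, 0xE, 0xF, 8, 4, 7, 1, 2])

def hw(x):
    """Hamming weight of each value in an integer array."""
    return np.array([bin(v).count("1") for v in x])

# Simulate traces: leakage = HW(SBOX[pt ^ key]) + Gaussian noise.
true_key = 0xA
pt = rng.integers(0, 16, 2000)
traces = hw(SBOX[pt ^ true_key]) + rng.normal(0.0, 1.0, 2000)

# Distinguisher: correlate the model's prediction under each key guess
# with the measured traces; the right key correlates best.
scores = [abs(np.corrcoef(hw(SBOX[pt ^ g]), traces)[0, 1])
          for g in range(16)]
print("recovered key:", int(np.argmax(scores)), "true key:", true_key)
```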
Abstract:
Many problems in cryptography and information security ultimately reduce to time-consuming computations, many of which can only be completed efficiently through the cooperation of multiple heterogeneous, geographically distributed computers. The design, analysis, and application of cryptographic algorithms are sensitive to, and strongly dependent on, the computing environment: different classes of algorithms and different implementation modes place very different demands on it, and so far no general model of distributed cryptographic computation exists. Starting from the requirements of cryptographic computation itself, this paper first analyzes the goals and characteristics of cryptographic algorithm design, analysis, and application, proposes corresponding computation modes, and presents a general cryptographic computation model for grid environments. It then discusses strategies for partitioning cryptographic computing tasks, resource allocation, and load balancing. Finally, the model's architecture, implementation, and experimental results under the Globus Toolkit grid environment are presented.
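A minimal sketch of the task-partitioning idea: an exhaustive search whose keyspace is split into contiguous ranges and farmed out to worker processes. The toy hash-preimage target and range sizes are invented; a grid deployment would replace the local process pool with Globus-managed jobs.

```python
# Toy keyspace search partitioned into contiguous ranges per worker.
from hashlib import sha256
from multiprocessing import Pool

TARGET = sha256(b"4242").hexdigest()   # digest whose preimage we "search" for

def search(key_range):
    """Scan one contiguous key range; return the hit, if any."""
    lo, hi = key_range
    for k in range(lo, hi):
        if sha256(str(k).encode()).hexdigest() == TARGET:
            return k
    return None

if __name__ == "__main__":
    n_workers, keyspace = 4, 10_000
    step = keyspace // n_workers
    chunks = [(i, i + step) for i in range(0, keyspace, step)]
    with Pool(n_workers) as pool:
        hits = [k for k in pool.map(search, chunks) if k is not None]
    print("found key:", hits)
```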
Abstract:
Pen-based user interfaces are an important class of Post-WIMP (window, icon, menu, pointer) interfaces that offer users a natural mode of interaction. However, current pen-based UI toolkits are mostly oriented toward single-user tasks and do not support collaborative application scenarios well. Based on an analysis of the characteristics of pen interaction and the functional requirements of collaborative environments, this paper designs and implements CoPen Toolkit, a toolkit supporting the development of collaborative pen-based user interfaces. It provides a flexible architecture and extensible components, supporting ink description, event handling, network collaboration, and other functions. Several prototype systems have been built on top of CoPen Toolkit; practice shows that it supports the development of collaborative pen-based user interfaces well.
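A hypothetical sketch of the ink-description and event-handling split the abstract mentions: a stroke model plus an observer-style dispatcher whose listeners could be local renderers or network forwarders alike. The names are invented for illustration, not CoPen Toolkit's API.

```python
# Invented stroke model and observer-style event dispatcher, illustrating
# how ink events could fan out to local views and remote collaborators.
from dataclasses import dataclass, field

@dataclass
class Stroke:
    user: str
    points: list = field(default_factory=list)   # (x, y, pressure) samples

class StrokeDispatcher:
    def __init__(self):
        self.listeners = []          # local renderers, network forwarders, ...

    def subscribe(self, callback):
        self.listeners.append(callback)

    def emit(self, stroke):
        for cb in self.listeners:    # fan the stroke out to every listener
            cb(stroke)

dispatcher = StrokeDispatcher()
dispatcher.subscribe(lambda s: print(f"render {len(s.points)} points from {s.user}"))
dispatcher.emit(Stroke("alice", [(0, 0, 0.5), (1, 1, 0.6)]))
```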