930 results for Molecules - Models - Computer simulation
Abstract:
In this paper, we formulate a flexible density function from the selection mechanism viewpoint (see, for example, Bayarri and DeGroot (1992) and Arellano-Valle et al. (2006)) which possesses nice biological and physical interpretations. The new density function contains as special cases many models that have been proposed recently in the literature. In constructing this model, we assume that the number of competing causes of the event of interest has a general discrete distribution characterized by its probability generating function. This function has an important role in the selection procedure as well as in computing the conditional personal cure rate. Finally, we illustrate how various models can be deduced as special cases of the proposed model. (C) 2011 Elsevier B.V. All rights reserved.
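As background for the competing-causes construction alluded to in the abstract above, the following is a minimal sketch of the standard selection/latent-activation argument; the symbols N, A_N, S and S_pop are assumed here and are not quoted from the paper.

```latex
% Competing-causes sketch (notation assumed here):
% N   = number of competing causes, with pgf  A_N(s) = E[s^N]
% Z_1, Z_2, ... i.i.d. latent event times with common survival function S(t)
% Observed time T = min(Z_1, ..., Z_N), with T = +infinity when N = 0.
\[
  S_{\mathrm{pop}}(t) \;=\; P(T > t)
  \;=\; \sum_{n=0}^{\infty} P(N = n)\, S(t)^{\,n}
  \;=\; A_N\!\bigl(S(t)\bigr),
\qquad
  p_0 \;=\; \lim_{t\to\infty} S_{\mathrm{pop}}(t) \;=\; A_N(0) \;=\; P(N = 0).
\]
```

The cure fraction p_0 is thus read directly off the probability generating function, which is what makes the pgf central to both the selection argument and the conditional cure rate.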
Abstract:
The authors present here a summary of their investigations of ultrathin films formed by gold nanoclusters embedded in polymethylmethacrylate polymer. The clusters are formed from the self-organization of subplantated gold ions in the polymer. The source of the low-energy ion stream used for the subplantation is a unidirectionally drifting gold plasma created by a magnetically filtered vacuum arc plasma gun. The material properties change with subplantation dose, including nanocluster size and agglomeration state and, consequently, the material's electrical behavior and optical activity. They have investigated the composite experimentally and by computer simulation in order to better understand the self-organization and the properties of the material. They present here the results of conductivity measurements and percolation behavior, dynamic TRIM simulations, surface plasmon resonance activity, transmission electron microscopy, small-angle x-ray scattering, atomic force microscopy, and scanning tunneling microscopy. (C) 2010 American Vacuum Society [DOI: 10.1116/1.3357287]
Abstract:
The electronic properties of liquid hydrogen fluoride (HF) were investigated by carrying out sequential quantum mechanics/Born-Oppenheimer molecular dynamics calculations. The structure of the liquid is in good agreement with recent experimental information. Emphasis was placed on the analysis of polarisation effects, dynamic polarisability and electronic excitations in liquid HF. Our results indicate an increase, in the liquid phase, of the dipole moment (~0.5 D) and of the isotropic polarisability (5%) relative to their gas-phase values. Our best estimate for the first vertical excitation energy in liquid HF indicates a blue-shift of 0.4 ± 0.2 eV relative to that of the gas-phase monomer (10.4 eV). (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Impurity-interstitial dipoles in calcium fluoride solutions with Al3+, Yb3+ and La3+ fluorides were studied using the thermally stimulated depolarization current (TSDC) technique. The dipolar complexes are formed by substitutional trivalent ions in Ca2+ sites and interstitial fluorine ions in nearest-neighbor sites. The relaxations observed at 150 K are assigned to nearest-neighbor R_S(3+)-F_i(-) dipoles (R_S = La or Yb). The purpose of this work is to study the processes of energy storage in the fluorides following X-ray and gamma irradiation. Computer modelling techniques are used to obtain the formation energy of the dipole defects. (C) 2007 Elsevier Ltd. All rights reserved.
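For context on how a dipolar TSDC peak such as the one at 150 K is usually analysed, the first-order (Bucci-Fieschi) depolarization-current expression is sketched below. It is standard background, not quoted from the paper; the symbols P_0 (frozen-in polarization), tau_0 (pre-exponential relaxation time), E (activation energy) and b (heating rate) are assumed here.

```latex
% First-order depolarization current commonly fitted to a dipolar TSDC peak:
\[
  J(T) \;=\; \frac{P_0}{\tau_0}\,
  \exp\!\left(-\frac{E}{k_B T}\right)
  \exp\!\left[-\frac{1}{b\,\tau_0}\int_{T_0}^{T}
  \exp\!\left(-\frac{E}{k_B T'}\right)\mathrm{d}T'\right].
\]
```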
Abstract:
Relevant results for (sub-)distribution functions related to parallel systems are discussed. The reverse hazard rate is defined using the product integral. Consequently, the restriction of absolute continuity for the distributions involved can be relaxed. The only restriction is that the sets of discontinuity points of the parallel distributions must be disjoint. Nonparametric Bayesian estimators of all survival (sub-)distribution functions are derived. Dual to series systems, which use minimum lifetimes as observations, parallel systems record the maximum lifetimes. Dirichlet multivariate processes forming a class of prior distributions are considered for the nonparametric Bayesian estimation of the component distribution functions and the system reliability. For illustration, two striking numerical examples are presented.
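A brief sketch of the product-integral definition of the reverse hazard referred to above; the symbols F, R and r are assumed here, and the display shows only the standard construction, not the paper's own derivation.

```latex
% Reverse hazard measure of a distribution function F:
%   dR(t) = dF(t) / F(t).
% Product-integral representation (here \prod denotes the product integral),
% which does not require F to be absolutely continuous:
\[
  F(s) \;=\; F(t)\prod_{u \in (s,\,t]}\bigl(1 - \mathrm{d}R(u)\bigr), \qquad s \le t,
\]
% reducing, when F is absolutely continuous with reverse hazard rate r = f/F, to
\[
  F(s) \;=\; F(t)\,\exp\!\Bigl(-\int_{s}^{t} r(u)\,\mathrm{d}u\Bigr).
\]
```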
Abstract:
We consider the issue of assessing the influence of observations in the class of beta regression models, which is useful for modelling random variables that assume values in the standard unit interval and are affected by independent variables. We propose a Cook-like distance and also measures of local influence under different perturbation schemes. Applications using real data are presented. (C) 2008 Elsevier B.V. All rights reserved.
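For reference, a generic Cook-type case-deletion distance is sketched below; the paper's measure for beta regression may be scaled differently, so this is only the standard form with assumed notation.

```latex
% Generic Cook-type distance (sketch; notation assumed here):
\[
  D_i \;=\; \frac{\bigl(\hat\beta - \hat\beta_{(i)}\bigr)^{\!\top}
        \widehat{\operatorname{Cov}}(\hat\beta)^{-1}
        \bigl(\hat\beta - \hat\beta_{(i)}\bigr)}{k},
\]
% where \hat\beta_{(i)} is the estimate obtained without observation i and
% k = \dim(\beta); large D_i flags observations with a strong effect on the fit.
```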
Abstract:
The viscosity of ionic liquids based on quaternary ammonium cations is reduced when one of the alkyl chains is replaced by an alkoxy chain (Zhou et al. Chem. Eur. J. 2005, 11, 752). A microscopic picture of the role played by the ether function in decreasing the viscosity of quaternary ammonium ionic liquids is provided here by molecular dynamics (MD) simulations. A model for the ionic liquid N-ethyl-N,N-dimethyl-N-(2-methoxyethyl)ammonium bis(trifluoromethanesulfonyl)imide, MOENM(2)E TFSI, is compared to its tetraalkylammonium counterpart. The alkoxy derivative has lower viscosity, higher ionic diffusion coefficients, and higher conductivity than the tetraalkyl system at the same density and temperature. A clear signature of the ether function on the liquid structure is observed in cation-cation correlations, but not in anion-anion or anion-cation correlations. In both the alkyl and the alkoxy ionic liquids, there is aggregation of long chains of neighboring cations within micelle-like structures. The MD simulations indicate that the less effective assembly between the more flexible alkoxy chains, in comparison to alkyl chains, is the structural reason for the higher ionic mobility in MOENM(2)E TFSI.
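A minimal sketch of how ionic self-diffusion coefficients such as those discussed above are typically extracted from an MD trajectory via the Einstein relation; the array shapes, timestep and the synthetic random-walk input are assumptions for illustration, not data from the paper.

```python
# Estimate a self-diffusion coefficient from unwrapped MD positions using
# the Einstein relation D = lim_{t->inf} MSD(t) / (6 t).
import numpy as np

def msd(positions):
    """Mean-squared displacement relative to the first frame, per frame."""
    disp = positions - positions[0]              # (n_frames, n_atoms, 3)
    return (disp ** 2).sum(axis=2).mean(axis=1)  # average over atoms

def diffusion_coefficient(positions, dt_ps, fit_start=0.2, fit_stop=0.8):
    """Least-squares slope of MSD(t) over a middle window, divided by 6."""
    m = msd(positions)
    t = np.arange(len(m)) * dt_ps
    i0, i1 = int(fit_start * len(m)), int(fit_stop * len(m))
    slope, _ = np.polyfit(t[i0:i1], m[i0:i1], 1)
    return slope / 6.0                           # e.g. Angstrom^2 / ps

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic random walk standing in for an unwrapped trajectory.
    steps = rng.normal(scale=0.1, size=(5000, 64, 3))
    traj = np.cumsum(steps, axis=0)
    print("D ~", diffusion_coefficient(traj, dt_ps=0.002))
```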
Abstract:
The effect of adding SO2 on the structure and dynamics of 1-butyl-3-methylimidazolium bromide (BMIBr) was investigated by low-frequency Raman spectroscopy and molecular dynamics (MD) simulations. The MD simulations indicate that the long-range structure of neat BMIBr is disrupted, resulting in a liquid with relatively low viscosity and high conductivity, but a strong correlation of ionic motion persists in the BMIBr-SO2 mixture due to ionic pairing. Raman spectra within the 5 < ω < 200 cm⁻¹ range at low temperature reveal the short-time dynamics, which is consistent with the vibrational density of states calculated by MD simulations. Several time correlation functions calculated by MD simulations give further insight into the structural relaxation of BMIBr-SO2.
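A short sketch of the standard route from MD velocities to the vibrational density of states that is compared with low-frequency Raman data: Fourier-transform the velocity autocorrelation function. The array shapes, timestep and the white-noise stand-in trajectory are assumptions for illustration only.

```python
# Vibrational density of states (VDOS) as the power spectrum of the
# velocity autocorrelation function (VACF) computed from MD velocities.
import numpy as np

def vacf(velocities, max_lag):
    """Normalized velocity autocorrelation <v(0).v(t)> / <v(0).v(0)>."""
    n_frames = velocities.shape[0]
    c = np.empty(max_lag)
    for lag in range(max_lag):
        prod = (velocities[: n_frames - lag] * velocities[lag:]).sum(axis=2)
        c[lag] = prod.mean()
    return c / c[0]

def vdos(velocities, dt_ps, max_lag=2000):
    """One-sided spectrum of the VACF; returns (frequency in THz, intensity)."""
    c = vacf(velocities, max_lag)
    window = np.hanning(len(c))                  # reduce truncation ripples
    spectrum = np.abs(np.fft.rfft(c * window))
    freq = np.fft.rfftfreq(len(c), d=dt_ps)      # 1/ps = THz
    return freq, spectrum

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    v = rng.normal(size=(10000, 32, 3))          # white-noise stand-in for MD velocities
    f, s = vdos(v, dt_ps=0.001)
    print(f[:5], s[:5])
```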
Abstract:
This dissertation investigates the effect of generating intermediate inventories on the main indicators used in the Theory of Constraints (Throughput, Operating Expense and Inventory) in an industrial plant with a continuous production process, which relies on packaging, raw materials procured at large scale and long-haul logistics chains. This type of industry produces fast-moving consumer goods, with little variability, in a "push" mode. The main consequence is the loss of synchronization along the logistics chain, resulting in large intermediate inventories and rising costs, related mainly to the cost of holding these inventories. Using the five focusing steps and the logical tools of the Theory of Constraints, a management alternative is proposed that includes the Drum-Buffer-Rope algorithm and places the organization in a process of continuous improvement, whose impacts are evaluated by computer simulation. Using appropriate statistical techniques and software, a computer simulation model is built from real data of a cement plant. From this model, different scenarios are tested and the optimal condition is identified. A conclusion is reached regarding the change in the policy for generating intermediate inventories and its impact on reducing costs and risks.
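A toy discrete-event sketch of a Drum-Buffer-Rope release policy of the kind evaluated in the dissertation; the use of the simpy library and the processing rates, buffer size and horizon chosen here are illustrative assumptions, not the thesis model.

```python
# Toy Drum-Buffer-Rope line: a finite buffer in front of the bottleneck (drum)
# acts as the "rope" that throttles upstream material release.
import random
import simpy

BUFFER_SIZE = 5       # rope: upstream may not release beyond this WIP level
SIM_TIME = 1000.0

def upstream(env, buffer):
    """Non-bottleneck stage: releases material only while the buffer has room."""
    i = 0
    while True:
        yield env.timeout(random.expovariate(1 / 4.0))   # fast stage, mean 4 time units
        yield buffer.put(f"unit-{i}")                     # blocks when the buffer is full
        i += 1

def drum(env, buffer, done):
    """Bottleneck stage (the drum) paces the whole line."""
    while True:
        unit = yield buffer.get()
        yield env.timeout(random.expovariate(1 / 6.0))   # slow stage, mean 6 time units
        done.append((env.now, unit))

random.seed(42)
env = simpy.Environment()
buffer = simpy.Store(env, capacity=BUFFER_SIZE)
finished = []
env.process(upstream(env, buffer))
env.process(drum(env, buffer, finished))
env.run(until=SIM_TIME)
print(f"throughput: {len(finished)} units, final WIP in buffer: {len(buffer.items)}")
```

Because the buffer is capped, work-in-process cannot grow without bound even though the upstream stage is faster than the bottleneck, which is the behaviour the inventory-policy comparison is about.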
Abstract:
In innovation networks based on information exchange, the orchestrator agent appropriates information from the peripheral actors, generates innovation and distributes it back as added value. Its role is to promote network stability by keeping the network's growth rate non-negative. In the credit and fraud analysis markets, for example, bureaus act as orchestrator agents, concentrating the historical information about the population supplied by their clients and providing products that support decision making. Assuming that all companies in the ecosystem are rational agents, game theory becomes an appropriate tool for studying product pricing as a mechanism for promoting network stability. This work seeks to identify how different pricing structures set by the orchestrator agent relate to the stability and efficiency of the innovation network. Since the power of the network comes from the joint strength of its members, the innovation it generates varies with each peripheral agent's individual decision to hire the orchestrator agent at the price it stipulates. By defining a simplified theoretical game in which different agents decide whether or not to join the network under the different price structures set by the orchestrator agent, the study analyses the equilibrium conditions and concludes that the Nash equilibrium implies a scenario of network stability. One conclusion is that, to maximize the network's innovation power, the price each agent pays to use the network should be directly proportional to the financial benefit obtained from the innovation it generates. The study also presents a computer simulation of a fictitious market as a numerical demonstration of the observed effects. With these conclusions, the work fills a gap in the literature on innovation networks with monopolist orchestrator agents regarding the pricing of network use, and provides support for decision makers on either the supply or the demand side of the network's services.
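A toy numerical sketch of the connection game described above, contrasting a flat access price with a price proportional to each agent's benefit; all functional forms and parameters are illustrative assumptions, not the dissertation's model.

```python
# Iterated best responses in a toy network-connection game: an agent stays
# connected while its benefit from the network value exceeds the price it pays.
import numpy as np

rng = np.random.default_rng(7)
benefit = rng.uniform(0.5, 2.0, size=20)   # per-agent gain per unit of network value

def stable_membership(price_fn, max_iter=100):
    """Iterate best responses until no agent wants to change its decision."""
    connected = np.ones(benefit.size, dtype=bool)    # start with everyone connected
    for _ in range(max_iter):
        n = connected.sum()                          # network value grows with membership
        payoff = benefit * n - price_fn(benefit, n)  # payoff from being connected
        new = payoff >= 0                            # each agent's best response
        if np.array_equal(new, connected):
            break
        connected = new
    return connected

flat = stable_membership(lambda b, n: np.full(b.size, 15.0))   # same price for everyone
prop = stable_membership(lambda b, n: 0.5 * b * n)             # price proportional to benefit
print("flat price keeps", int(flat.sum()), "agents;",
      "proportional price keeps", int(prop.sum()), "agents")
```

In this toy setting the flat price drives low-benefit agents out and can unravel the network, while the benefit-proportional price keeps every agent connected, in the spirit of the dissertation's conclusion.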
Abstract:
In this work we present a new adaptive-step numerical method, based on the local linearization approach, for integrating stochastic differential equations with additive noise. We also propose a computational scheme that allows an efficient implementation of this method, suitably adapting the Padé algorithm with the scaling-and-squaring strategy to compute the matrix exponentials involved. Before introducing the construction of this method, we briefly present what stochastic differential equations are, the mathematics underlying them, their relevance for modelling a wide range of phenomena, and the importance of using numerical methods to evaluate such equations. A brief study of numerical stability is also carried out. With this, we intend to lay the groundwork needed for the construction of the new method/scheme. Finally, several numerical experiments are carried out to show, in a practical way, the effectiveness of the proposed method and to compare it with other commonly used methods.
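A minimal numerical sketch of one local-linearization step for an additive-noise SDE, using scipy.linalg.expm (a Padé plus scaling-and-squaring implementation of the matrix exponential); the test problem, the fixed step size and the simplified Euler-style treatment of the noise increment are assumptions for illustration and differ from the adaptive scheme proposed in the dissertation.

```python
# One local-linearization step for dX = f(X) dt + G dW: take the exact flow of
# the drift linearized at the current state, then add an additive-noise increment.
import numpy as np
from scipy.linalg import expm

def ll_step(x, f, jac, G, h, rng):
    """Advance the state by one step of size h."""
    d = x.size
    J, fx = jac(x), f(x)
    # Augmented-matrix trick: the top-right block of expm(h*M) equals
    # int_0^h exp(J s) ds @ f(x), i.e. h * phi_1(hJ) f(x).
    M = np.zeros((d + 1, d + 1))
    M[:d, :d] = J
    M[:d, d] = fx
    drift_increment = expm(h * M)[:d, d]
    noise = G @ rng.normal(size=G.shape[1]) * np.sqrt(h)   # simplified additive noise
    return x + drift_increment + noise

# Example: a damped linear oscillator driven by additive noise (illustrative only).
A = np.array([[0.0, 1.0], [-1.0, -0.5]])
f = lambda x: A @ x
jac = lambda x: A
G = 0.1 * np.eye(2)

rng = np.random.default_rng(3)
x, h = np.array([1.0, 0.0]), 0.01
for _ in range(1000):
    x = ll_step(x, f, jac, G, h, rng)
print("state after 10 time units:", x)
```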
Abstract:
Work presented at the XXXV CNMAC, Natal-RN, 2014.
Abstract:
PEDRINI, Aldomar; WESTPHAL, F. S.; LAMBERT, R. A methodology for building energy modelling and calibration in warm climates. Building and Environment, Australia, n. 37, p. 903-912, 2002. Available at:
Abstract:
This thesis aims to describe and demonstrate the concept developed to facilitate the use of thermal simulation tools during the building design process. Despite the impact of architectural elements on the performance of buildings, some influential decisions are frequently based solely on qualitative information. Even though such design support is adequate for most decisions, the designer will eventually have doubts concerning the performance of some design decisions. These situations require some kind of additional knowledge to be properly approached. The concept of designerly ways of simulating focuses on the formulation and solution of design dilemmas, which are doubts about the design that cannot be fully understood or solved without using quantitative information. The concept intends to combine the power of analysis of computer simulation tools with the capacity for synthesis of architects. Three types of simulation tools are considered: solar analysis, thermal/energy simulation and CFD. Design dilemmas are formulated and framed according to the architect's reflection process about performance aspects. Throughout the thesis, the problem is investigated in three fields: professional, technical and theoretical. This approach to distinct parts of the problem aimed to i) characterize different professional categories with regard to their design practice and use of tools, ii) review previous research on the use of simulation tools and iii) draw analogies between the proposed concept and concepts developed or described in previous works on design theory. The proposed concept was tested on eight design dilemmas extracted from three case studies in the Netherlands. The three investigated processes are houses designed by Dutch architectural firms. Relevant information and criteria for each case study were obtained through interviews and conversations with the architects involved. The practical application, despite its success in the research context, allowed the identification of some limitations to the applicability of the concept, concerning the architects' need for technical knowledge and the current stage of evolution of simulation tools.