993 results for Fused deposition modeling
Abstract:
The spatial limits of the active site in the benzylic hydroxylase enzyme of the fungus Mortierella isabellina were investigated. Several molecular probes were used in incubation experiments to determine the acceptability of each compound by this enzyme. The yields of benzylic alcohols provided information on the acceptability of the particular compound into the active site, and the enantiomeric excess values provided information on the "fit" of acceptable substrates. Measurements of the molecular models were made using the Cambridge Scientific Computing Inc. CSC Chem 3D Plus modeling program. The dimensional limits of the aromatic binding pocket of the benzylic hydroxylase were tested using suitably substituted ethylbenzenes. Both the depth (para-substituted substrates) and width (ortho- and meta-substituted substrates) of this region were investigated, with results demonstrating absolute spatial limits in both directions in the plane of the aromatic ring: 7.3 Angstroms for the depth and 7.1 Angstroms for the width. A minimum requirement for the height of this region has also been established at 6.2 Angstroms. The region containing the active oxygen species was also investigated, using a series of alkylphenylmethanes and fused-ring systems in indan, 1,2,3,4-tetrahydronaphthalene, and benzocycloheptene substrates. A maximum distance of 6.9 Angstroms (including the 1.5 Angstroms from the phenyl substituent to the active center of the heme prosthetic group of the enzyme) has been established extending directly in front of the aromatic binding pocket. The other dimensions in this region of the benzylic hydroxylase active site will require further investigation to establish maximum allowable values. An explanation of the stereochemical distributions in the obtained products has also been put forth that correlates well with the experimental observations.
Abstract:
Raman scattering in the region 20 to 100 cm⁻¹ for fused quartz, "Pyrex" borosilicate glass, and soft soda-lime silicate glass was investigated. The Raman spectra for the fused quartz and the Pyrex glass were obtained at room temperature using the 488 nm exciting line of a Coherent Radiation argon-ion laser at powers up to 550 mW. For the soft soda-lime glass, the 514.5 nm exciting line at powers up to 660 mW was used because of a weak fluorescence which masked the Stokes Raman spectrum. In addition, it is demonstrated that the low-frequency Raman coupling constant can be described by a model proposed by Martin and Brenig (MB). By fitting the spectra predicted by the model with Gaussian, Poisson, and Lorentzian forms of the correlation function, the structural correlation radius (SCR) was determined for each glass. It was found that, to achieve the best possible fit from each of the three correlation functions, a value of the SCR between 0.80 and 0.90 nm was required for both quartz and Pyrex glass, but for the soft soda-lime silicate glass the required value of the SCR was between 0.50 and 0.60 nm. Our results support the claim of Malinovsky and Sokolov (1986) that the MB model based on a Poisson correlation function provides a universal fit to the experimental VH (vertical and horizontal polarizations) spectrum for any glass regardless of its chemical composition. The only deficiency of the MB model is its failure to fit the experimental depolarization spectra.
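As a schematic illustration of how an SCR can be extracted from a low-frequency coupling constant, the sketch below fits a simplified Gaussian-correlation form by grid search. The functional form, the amplitude, the velocity parameter, and the synthetic data are all hypothetical stand-ins, not the actual Martin-Brenig expression or the measured spectra.

```python
import numpy as np

# Simplified stand-in for the MB coupling constant with a Gaussian
# correlation function; A, v, and the frequency axis are hypothetical.
def coupling(omega, A, R, v=4.0):
    return A * omega**2 * np.exp(-(omega * R / v) ** 2)

omega = np.linspace(0.5, 10.0, 60)        # reduced frequency axis
data = coupling(omega, A=1.0, R=0.85)     # synthetic "measurement", SCR = 0.85

# One-dimensional grid search over the SCR, with the amplitude fitted in
# closed form (linear least squares once R is fixed).
best_R, best_err = None, np.inf
for R in np.linspace(0.4, 1.2, 161):
    basis = coupling(omega, A=1.0, R=R)
    A_hat = float(basis @ data / (basis @ basis))   # optimal amplitude
    err = float(np.sum((A_hat * basis - data) ** 2))
    if err < best_err:
        best_R, best_err = R, err
```

The closed-form amplitude step keeps the search one-dimensional, which is enough for a single-parameter SCR determination of this kind.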
Abstract:
SrMgxRu1-xO3 thin films were made by pulsed laser deposition on SrTiO3 (100) substrates in either O2 or Ar atmosphere. The thin films were characterized by x-ray diffraction, energy-dispersive x-ray microanalysis, dc resistivity measurement, and dc magnetization measurement. The effect of Mg doping was observed: as the amount of Mg in the SrMgxRu1-xO3 thin films increased, the magnetization decreased and the resistivity increased, while the Curie temperature (transition temperature) was little affected. The magnetization states of SrMgxRu1-xO3 thin films, for x < 0.15, are similar to those of SrRuO3 films. X-ray diffraction results for SrMgxRu1-xO3 thin films made in oxygen showed that the films are epitaxial. The thin films could not be well made in Ar atmosphere during laser ablation, as there was no clear SrMgxRu1-xO3 peak in the x-ray diffraction results. Substrate temperature had an effect on the resistivity of the films: the residual resistivity ratios increased with increasing substrate temperature. Film thickness was observed to be another factor in film quality: thinner films were epitaxial, but thicker films were not.
Abstract:
On Mars, interior layered deposits (ILD) provide evidence that water was once stable at the surface of the planet and present in large quantities. In West Candor Chasma, the ILD and their associated landforms record the depositional history of the chasma, and the deformation of those deposits provides insight into the stresses acting on them and on the chasma as a whole. The post-ILD structural history of West Candor is interpreted by analyzing the spatial relationships and orientation trends of structural features within the ILD. The recording of stresses through brittle deformation of ILDs implies that the ILD had been lithified before the stress was imposed. Based on the prominent orientation trends of deformation features, the orientation of the stress regime acting upon the ILD appears to be linked to the regime that initially created the chasma-forming faults. An additional minor stress orientation was also revealed and may be related to large structures outside West Candor Chasma. The late depositional history of Ceti Mensa is herein investigated by examining the attributes and spatial relationships of unique corrugated linear formations (CLF). The CLFs appear to be aeolian in origin but display clear indications of brittle deformation, indicating they have been lithified. Evidence of lithification and the mineral composition of the surrounding material support the interpretation of circulating water in the area.
Abstract:
Experimental Extended X-ray Absorption Fine Structure (EXAFS) spectra carry information about the chemical structure of metal protein complexes. However, predicting the structure of such complexes from EXAFS spectra is not a simple task. Currently, methods such as Monte Carlo optimization or simulated annealing are used in structure refinement of EXAFS. These methods have proven somewhat successful in structure refinement but have not been successful in finding the global minimum. Multiple population-based algorithms, including a genetic algorithm, a restarting genetic algorithm, differential evolution, and particle swarm optimization, are studied for their effectiveness in structure refinement of EXAFS. The oxygen-evolving complex in S1 is used as a benchmark for comparing the algorithms. These algorithms were successful in finding new atomic structures that produced improved calculated EXAFS spectra over atomic structures previously found.
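Differential evolution, one of the population-based algorithms studied, can be sketched as follows. The misfit function here is a hypothetical stand-in for a real EXAFS residual (which would compare a calculated spectrum against experiment); the bounds, target vector, and control parameters are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def misfit(x):
    # Hypothetical stand-in for an EXAFS residual: distance of a trial
    # structure vector (e.g. bond lengths) from a known "true" geometry.
    target = np.array([1.8, 2.1, 2.7, 3.3])
    return float(np.sum((x - target) ** 2))

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9, iters=200):
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    fitness = np.array([f(ind) for ind in pop])
    for _ in range(iters):
        for i in range(pop_size):
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)   # DE/rand/1 mutation
            cross = rng.random(dim) < CR                 # binomial crossover
            trial = np.where(cross, mutant, pop[i])
            f_trial = f(trial)
            if f_trial < fitness[i]:                     # greedy selection
                pop[i], fitness[i] = trial, f_trial
    best = int(np.argmin(fitness))
    return pop[best], fitness[best]

best_x, best_f = differential_evolution(misfit, [(1.0, 4.0)] * 4)
```

The greedy selection step is what distinguishes differential evolution from a plain genetic algorithm: a trial vector only replaces its parent if it strictly improves the objective.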
Abstract:
This paper develops a model of short-range ballistic missile defense and uses it to study the performance of Israel’s Iron Dome system. The deterministic base model allows for inaccurate missiles, unsuccessful interceptions, and civil defense. Model enhancements consider the trade-offs in attacking the interception system, the difficulties faced by militants in assembling large salvos, and the effects of imperfect missile classification by the defender. A stochastic model is also developed. Analysis shows that system performance can be highly sensitive to the missile salvo size, and that systems with higher interception rates are more “fragile” when overloaded. The model is calibrated using publicly available data about Iron Dome’s use during Operation Pillar of Defense in November 2012. If the systems performed as claimed, they saved Israel an estimated 1778 casualties and $80 million in property damage, and thereby made preemptive strikes on Gaza about 8 times less valuable to Israel. Gaza militants could have inflicted far more damage by grouping their rockets into large salvos, but this may have been difficult given Israel’s suppression efforts. Counter-battery fire by the militants is unlikely to be worthwhile unless they can obtain much more accurate missiles.
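The overload sensitivity described above can be illustrated with a minimal capacity-limited sketch: missiles beyond the defense's engagement capacity leak through unopposed, so expected damage jumps sharply once the salvo size exceeds capacity. All parameter values are hypothetical, not Iron Dome data, and the model below is a toy, not the paper's calibrated model.

```python
def expected_hits(salvo, capacity, p_intercept, p_accurate=0.3):
    """Expected number of accurate missiles that land, for a single salvo.

    salvo       : missiles fired simultaneously
    capacity    : maximum missiles the defense can engage per salvo
    p_intercept : per-engagement interception probability
    p_accurate  : fraction of missiles accurate enough to threaten a target
    All values supplied here are illustrative assumptions.
    """
    engaged = min(salvo, capacity)
    leakers = (salvo - engaged) + engaged * (1 - p_intercept)
    return leakers * p_accurate

# Below capacity, a 90%-effective defense stops almost everything...
small = expected_hits(salvo=10, capacity=20, p_intercept=0.9)   # ~0.3 hits
# ...but a salvo twice the capacity multiplies expected hits far more
# than fourfold, the "fragility" of high-interception-rate systems.
large = expected_hits(salvo=40, capacity=20, p_intercept=0.9)   # ~6.6 hits
```

Quadrupling the salvo here raises expected hits by a factor of about 22, precisely because the high interception rate made the below-capacity baseline so low.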
Abstract:
In this paper, we introduce a new approach for volatility modeling in discrete and continuous time. We follow the stochastic volatility literature by assuming that the variance is a function of a state variable. However, instead of assuming that the loading function is ad hoc (e.g., exponential or affine), we assume that it is a linear combination of the eigenfunctions of the conditional expectation (resp. infinitesimal generator) operator associated to the state variable in discrete (resp. continuous) time. Special examples are the popular log-normal and square-root models where the eigenfunctions are the Hermite and Laguerre polynomials respectively. The eigenfunction approach has at least six advantages: i) it is general since any square integrable function may be written as a linear combination of the eigenfunctions; ii) the orthogonality of the eigenfunctions leads to the traditional interpretations of the linear principal components analysis; iii) the implied dynamics of the variance and squared return processes are ARMA and, hence, simple for forecasting and inference purposes; iv) more importantly, this generates fat tails for the variance and returns processes; v) in contrast to popular models, the variance of the variance is a flexible function of the variance; vi) these models are closed under temporal aggregation.
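The log-normal example can be illustrated numerically: for a stationary Gaussian AR(1) state, the probabilists' Hermite polynomials He_n are eigenfunctions of the one-step conditional expectation operator with eigenvalues rho^n. The Monte Carlo check below verifies this for He_2; the persistence parameter, conditioning value, and sample size are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def he2(x):
    # Probabilists' Hermite polynomial He_2(x) = x^2 - 1
    return x**2 - 1.0

rho = 0.7                     # AR(1) persistence (illustrative)
x0 = 1.3                      # arbitrary conditioning value of the state
n_sims = 200_000
eps = rng.standard_normal(n_sims)
x1 = rho * x0 + np.sqrt(1 - rho**2) * eps   # one step of the stationary AR(1)

mc = float(he2(x1).mean())    # Monte Carlo estimate of E[He_2(x_1) | x_0]
theory = rho**2 * he2(x0)     # eigenfunction property: eigenvalue rho^2
```

The same pattern holds for every order n, which is what makes a variance loading built from these eigenfunctions produce ARMA dynamics for the variance process.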
Abstract:
Affiliation: Institut de recherche en immunologie et en cancérologie, Université de Montréal
Abstract:
Heterotachy, the variation of substitution rates over time and across sites, has been shown to be a common phenomenon in real data. Failing to model heterotachy can potentially cause phylogenetic artifacts. Several models currently handle heterotachy: the mixture of branch lengths model (MLB) and various forms of the covarion model. In this project, our goal is to find a model that efficiently accounts for the heterotachous signals present in the data and thereby improves phylogenetic inference. To this end, two studies were carried out. In the first, we compare the MLB model with the covarion model and the homogeneous model using the AIC and BIC criteria as well as cross-validation. From our results, we conclude that the MLB model is not necessary for sites whose branch lengths differ across the whole tree, because in real data the heterotachous signals that interfere with phylogenetic inference are generally concentrated in a limited region of the tree. In the second study, we relax the assumption that the covarion model is homogeneous across sites and develop a mixture model based on a Dirichlet process. To evaluate different heterogeneous models, we define several posterior predictive discrepancy tests to study various aspects of molecular evolution through stochastic mappings. These tests show that the covarion mixture model combined with a gamma distribution adequately captures substitution variation both within and between sites. Our research provides a detailed description of heterotachy in real data and suggests directions for future heterotachous models.
The posterior predictive discrepancy tests provide diagnostic tools for evaluating models in detail. Moreover, our two studies reveal the non-specificity of heterogeneous models and, consequently, the presence of interactions between different heterogeneous models. Our studies strongly suggest that the data contain different heterogeneous features that should be accounted for simultaneously in phylogenetic analyses.
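The AIC/BIC model comparison used in the first study can be sketched as follows. The log-likelihoods, parameter counts, and site count below are hypothetical, chosen only to show how the criteria trade fit against complexity (lower scores are better).

```python
import math

def aic(log_likelihood, k):
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    """Bayesian information criterion: k ln n - 2 ln L."""
    return k * math.log(n) - 2 * log_likelihood

# Hypothetical fits: a homogeneous model vs. a richer covarion model.
# The covarion model gains log-likelihood but pays for extra parameters.
homogeneous = {"lnL": -10250.0, "k": 20}
covarion    = {"lnL": -10180.0, "k": 24}
n_sites = 1500

delta_aic = (aic(covarion["lnL"], covarion["k"])
             - aic(homogeneous["lnL"], homogeneous["k"]))
delta_bic = (bic(covarion["lnL"], covarion["k"], n_sites)
             - bic(homogeneous["lnL"], homogeneous["k"], n_sites))
# Negative deltas favor the covarion model despite its extra parameters.
```

BIC penalizes the four extra parameters more heavily than AIC (by ln n versus 2 per parameter), so with many sites the two criteria can disagree when the likelihood gain is small; here the gain of 70 log-likelihood units dominates both penalties.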
Abstract:
A role for variant histone H2A.Z in gene expression is now well established but little is known about the mechanisms by which it operates. Using a combination of ChIP-chip, knockdown and expression profiling experiments, we show that upon gene induction, human H2A.Z associates with gene promoters and helps in recruiting the transcriptional machinery. Surprisingly, we also found that H2A.Z is randomly incorporated in the genome at low levels and that active transcription antagonizes this incorporation in transcribed regions. After cessation of transcription, random H2A.Z quickly reappears on genes, demonstrating that this incorporation utilizes an active mechanism. Within facultative heterochromatin, we observe a hyperaccumulation of the variant histone, which might be due to the lack of transcription in these regions. These results show how chromatin structure and transcription can antagonize each other, thereby shaping chromatin and controlling gene expression.
Abstract:
Thesis (Doctor of Science with a concentration in Sustainable Processes), UANL, 2013.
Abstract:
Hardware/software systems are becoming indispensable in every aspect of daily life. The growing presence of these systems in various products and services calls for methods to develop them efficiently. Efficient design of such systems is, however, limited by several factors, among them: the growing complexity of applications, increasing integration density, the heterogeneous nature of products and services, and shrinking time to market. Transaction-level modeling (TLM) is considered a promising paradigm for managing design complexity and for providing means to explore and validate design alternatives at high levels of abstraction. This research proposes a methodology for expressing time in TLM based on an analysis of timing constraints. We propose using a combination of two development paradigms to accelerate design: TLM on the one hand, and a methodology for expressing time between different transactions on the other. This synergy allows us to combine, in a single environment, efficient simulation methods and formal analytical methods. We propose a new timing-verification algorithm based on a linearization procedure for min/max constraints, together with an optimization technique that improves the algorithm's efficiency. We have completed the mathematical description of all the constraint types presented in the literature. We have also developed methods for exploring and refining the communication system, which allowed us to apply the timing-verification algorithms at different TLM levels.
Since several definitions of TLM exist, within the scope of our research we defined a specification and simulation methodology for hardware/software systems based on the TLM paradigm. In this methodology, several modeling concepts can be considered separately. Built on modern software-engineering technologies such as XML, XSLT, XSD, object-oriented programming, and several others provided by the .Net environment, the proposed methodology presents an approach that makes intermediate models reusable, helping to meet time-to-market constraints. It provides a general approach to system modeling that separates different design aspects, such as the models of computation used to describe the system at multiple levels of abstraction. As a result, within the system model we can clearly identify the system's functionality without the details tied to development platforms, which improves the "portability" of the application model.
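One way to see why linearized timing constraints become mechanically checkable: once min/max constraints are linearized into difference constraints of the form t_j − t_i ≤ c, consistency reduces to detecting negative cycles in a constraint graph, which the Bellman-Ford algorithm does. The sketch below covers only this linear special case; the min/max linearization procedure itself is not shown, and the event names and bounds are illustrative, not taken from the thesis.

```python
def consistent(events, constraints):
    """Check satisfiability of difference constraints.

    constraints: list of (i, j, c) meaning t_j - t_i <= c.
    Returns True iff some assignment of times satisfies them all,
    i.e. the constraint graph has no negative-weight cycle.
    """
    dist = {e: 0.0 for e in events}        # virtual source at distance 0
    for _ in range(len(events) - 1):       # standard Bellman-Ford relaxation
        for i, j, c in constraints:
            if dist[i] + c < dist[j]:
                dist[j] = dist[i] + c
    # One extra relaxation round: any further improvement implies a
    # negative cycle, i.e. an unsatisfiable set of timing constraints.
    return not any(dist[i] + c < dist[j] for i, j, c in constraints)

events = ["req", "grant", "data"]
# Satisfiable: grant within 5 of req, data within 3 of grant,
# and req at most 2 before data (cycle weight 5 + 3 - 2 = 6 >= 0).
ok = consistent(events, [("req", "grant", 5), ("grant", "data", 3),
                         ("data", "req", -2)])
# Unsatisfiable: grant - req <= 2 but also grant - req >= 3.
bad = consistent(events, [("req", "grant", 2), ("grant", "req", -3)])
```

When the constraints are consistent, the final `dist` values themselves form one feasible schedule, which is convenient for driving a TLM simulation from a verified constraint set.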