902 results for Hemerythrin Model Complex
Abstract:
In various attempts to relate the behaviour of highly-elastic liquids in complex flows to their rheometrical behaviour, obvious candidates for study have been the variation of shear viscosity with shear rate, the two normal stress differences N₁ and N₂ (especially N₁), the extensional viscosity, and the dynamic moduli G′ and G″. In this paper, we shall confine attention to 'constant-viscosity' Boger fluids and, accordingly, we shall limit attention to N₁, ηE, G′ and G″. We shall concentrate on the "splashing" problem (particularly that which arises when a liquid drop falls onto the free surface of the same liquid). Modern numerical techniques are employed to provide the theoretical predictions. We show that high ηE can certainly reduce the height of the so-called Worthington jet, thus confirming earlier suggestions, but other rheometrical influences (steady and transient) can also have a role to play and the overall picture may not be as clear as it was once envisaged. We argue that this is due in the main to the fact that splashing is a manifestly unsteady flow. To confirm this proposition, we obtain numerical simulations for the linear Jeffreys model. (C) 2010 Elsevier B.V. All rights reserved.
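As a minimal illustration of the linear Jeffreys model mentioned above, the sketch below evaluates the analytic transient stress growth after start-up of steady shear; the parameter values are arbitrary assumptions for illustration and are not taken from the paper.

```python
import numpy as np

# Transient stress growth of the linear Jeffreys model after start-up of
# steady shear: sigma + lam1*dsigma/dt = eta0*(gdot + lam2*dgdot/dt).
# For gdot(t) = gdot0 (t >= 0) the analytic solution is
#   sigma(t) = eta0*gdot0*(1 - (1 - lam2/lam1)*exp(-t/lam1)).
# Illustrative parameters only (not from the paper).

eta0, lam1, lam2 = 1.0, 0.1, 0.02   # viscosity, relaxation and retardation times
gdot0 = 1.0                          # imposed shear rate

t = np.linspace(0.0, 0.5, 200)
sigma = eta0 * gdot0 * (1.0 - (1.0 - lam2 / lam1) * np.exp(-t / lam1))

# Steady-state check: sigma -> eta0*gdot0 as t -> infinity.
print(f"sigma(t_end) = {sigma[-1]:.4f}  (steady value {eta0 * gdot0:.4f})")
```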
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The first part of the work presented here pursues the synthesis of potential model compounds for oligonuclear metalloproteins based on salen ligands. For this purpose, two ligand systems were modified with alkyl substituents of differing steric bulk and examined with respect to their coordination properties. Five derivatives (H3L1, H3L2A, H3L2B, H3L3, H3L4) could be prepared for the ligand system based on bis(salicylideneamino)propan-2-ol, and two further derivatives (H3L5A, H3L5B) for the second ligand system, based on 1H-3,5-bis(salicylideneaminomethyl)pyrazole; these were converted into coordination compounds.
For the bis(salicylideneamino)propan-2-ol ligand H3L1 used here, which imposes the smallest steric demands, mono-, tri- and tetranuclear coordination compounds could be synthesized. The ligand is able to arrange itself around one or more metal centres in planar as well as variously bent conformations, using an N2O2 subset of its N2O3 donor set for coordination. The compound {[Ni7(HL1)2(L1)2(OBz)4(OMe)(H2O)]}n shows that the trinuclear units formed in this way can be linked into chains via the free propanol oxygen atom of the ligand backbone. With increasing steric demand of the attached alkyl substituents, the geometric flexibility of the ligand, and hence its potential to form structures of higher nuclearity, decreases. For ligands of intermediate steric demand, dinuclear systems can still be formed in addition to mononuclear complexes; if the steric demand is increased further, only mononuclear compounds are obtained.
With the pyrazole-based ligand systems H3L5A and H3L5B, dinuclear copper and nickel compounds were synthesized.
The second part of this work deals with the design of spin-crossover (SCO) systems. The aim is to couple a spin transition within the designed switchable system to the presence of a signal substance, so that the SCO compound can serve as a sensor for that substance. Two different approaches were developed and investigated.
The first method relies on combining a metal centre capable of spin crossover, a capping ligand, a co-ligand functionalized for recognition of the signal substance, and the corresponding signal substance itself. Tetra- and pentadentate ligand systems were used as capping ligands and combined with various picolyl-substituted monoaza-[12]-crown-4 derivatives, the monoaza crown being available to complex the signal substance, here an alkali metal ion. No satisfactory results could be obtained with this first method within the time frame of this work.
A promising second approach relies on a well-established polydentate spin-crossover ligand that is modified in its periphery with a binding pocket to accommodate the signal substance. With the ligand designed in this way, 4'-(4'''-benzo-[15]-crown-5)-methyloxy-2,2':6',2''-terpyridine ([b15c5]-tpy), the corresponding iron(II) and cobalt(II) complexes of composition [M([b15c5]-tpy)2]2+ were obtained. Owing to the high ligand field strength of the terpyridine, all synthesized iron(II) complexes remain in their diamagnetic low-spin form over the temperature range 300–400 K.
The corresponding cobalt(II) complexes show gradual but incomplete spin-crossover behaviour over the temperature range 2–350 K. The influence of signal substances on the spin-crossover behaviour of the cobalt(II) systems was examined in a first experiment using sodium ions as the signal substance. It turned out that although sodium ions cannot trigger an SCO in this system, they nevertheless have a strong effect on the course of the spin crossover.
Abstract:
Many multifactorial biologic effects, particularly in the context of complex human diseases, are still poorly understood. At the same time, the systematic acquisition of multivariate data has become increasingly easy. The use of such data to analyze and model complex phenotypes, however, remains a challenge. Here, a new analytic approach is described, termed coreferentiality, together with an appropriate statistical test. Coreferentiality is the indirect relation of two variables of functional interest with respect to whether they parallel each other in their respective relatedness to multivariate reference data, which can be informative for a complex effect or phenotype. It is shown that the power of coreferentiality testing is comparable to that of multiple regression analysis, sufficient even when the reference data are informative only to a relatively small extent of 2.5%, and clearly exceeds the power of simple bivariate correlation testing. Coreferentiality testing thus harnesses the increased power of multivariate analysis in order to address a more straightforwardly interpretable bivariate relatedness. Systematic application of this approach could substantially improve the analysis and modeling of complex phenotypes, particularly in the context of human studies, where addressing functional hypotheses by direct experimentation is often difficult.
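As a rough sketch of the idea only (not the authors' exact statistic), the snippet below scores "coreferentiality" by correlating the two variables' correlation profiles across the columns of a multivariate reference matrix and attaches a simple permutation p-value; the function names and toy data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def coref_score(x, y, R):
    """Correlate the correlation profiles of x and y across the reference
    columns of R. One plausible reading of 'coreferentiality', not
    necessarily the statistic defined in the article."""
    rx = np.array([np.corrcoef(x, R[:, j])[0, 1] for j in range(R.shape[1])])
    ry = np.array([np.corrcoef(y, R[:, j])[0, 1] for j in range(R.shape[1])])
    return np.corrcoef(rx, ry)[0, 1]

def permutation_pvalue(x, y, R, n_perm=199):
    """Permutation test: shuffle y to break its link to the reference data."""
    observed = coref_score(x, y, R)
    null = [coref_score(x, rng.permutation(y), R) for _ in range(n_perm)]
    p = (1 + sum(abs(s) >= abs(observed) for s in null)) / (n_perm + 1)
    return observed, p

# Toy data: x and y both load weakly on the same latent factor that also
# drives part of the multivariate reference matrix R.
n, k = 200, 20
latent = rng.normal(size=n)
x = 0.3 * latent + rng.normal(size=n)
y = 0.3 * latent + rng.normal(size=n)
R = 0.2 * latent[:, None] + rng.normal(size=(n, k))

score, p = permutation_pvalue(x, y, R)
print(f"coreferentiality score = {score:.3f}, permutation p = {p:.3f}")
```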
Abstract:
The ability to grow microscopic spherical birefringent crystals of vaterite, a calcium carbonate mineral, has allowed the development of an optical microrheometer based on optical tweezers. However, since these crystals are birefringent and, worse, are expected to have non-uniform birefringence, computational modelling of the microrheometer is a highly challenging task. Modelling the microrheometer, and optical tweezers in general, typically requires large numbers of repeated calculations for the same trapped particle, which places strong demands on the efficiency of the computational methods used. While our usual method of choice for computational modelling of optical tweezers, the T-matrix method, meets this requirement of efficiency, it is restricted to homogeneous isotropic particles. General methods that can model complex structures such as the vaterite particles, for example finite-difference time-domain (FDTD) or finite-difference frequency-domain (FDFD) methods, are inefficient. Therefore, we have developed a hybrid FDFD/T-matrix method that combines the generality of volume-discretisation methods such as FDFD with the efficiency of the T-matrix method. We have used this hybrid method to calculate optical forces and torques on model vaterite spheres in optical traps. We present and compare the results of computational modelling and experimental measurements.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Unstructured mesh based codes for the modelling of continuum physics phenomena have evolved to provide the facility to model complex interacting systems. Such codes have the potential to provide high performance on parallel platforms for a small investment in programming. The critical parameters for success are to minimise changes to the code, so as to ease maintenance, while providing high parallel efficiency, scalability to large numbers of processors and portability to a wide range of platforms. The paradigm of domain decomposition with message passing has for some time been demonstrated to provide a high level of efficiency, scalability and portability across shared and distributed memory systems without the need to re-author the code in a new language. This paper addresses these issues in the parallelisation of a complex three-dimensional unstructured mesh finite volume multiphysics code and discusses the implications of automating the parallelisation process.
Abstract:
Unstructured mesh codes for modelling continuum physics phenomena have evolved to provide the facility to model complex interacting systems. Parallelisation of such codes using Single Program Multiple Data (SPMD) domain decomposition techniques implemented with message passing has been demonstrated to provide high parallel efficiency, scalability to large numbers of processors P and portability across a wide range of parallel platforms. High efficiency, especially for large P, requires that load balance be achieved in each parallel loop. For a code in which loops span a variety of mesh entity types, for example elements, faces and vertices, some compromise is required between the load balance for each entity type and the quantity of inter-processor communication required to satisfy data dependences between processors.
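To make the trade-off concrete, here is a toy sketch (not drawn from the paper) of the two competing quantities for a single entity type: per-processor load imbalance and a cut-edge count as a crude proxy for inter-processor communication. The partition and adjacency data are made up for illustration.

```python
from collections import Counter

def imbalance(entity_part, n_parts):
    """Load imbalance for one entity type: max per-processor count divided
    by the ideal (mean) count. 1.0 means perfectly balanced."""
    counts = Counter(entity_part)
    loads = [counts.get(p, 0) for p in range(n_parts)]
    return max(loads) / (sum(loads) / n_parts)

def cut_edges(adjacency, vertex_part):
    """Proxy for inter-processor communication: edges whose endpoints
    lie on different processors."""
    return sum(1 for u, v in adjacency if vertex_part[u] != vertex_part[v])

# Toy partition of a 1D chain of 8 vertices over 2 processors.
vertex_part = [0, 0, 0, 0, 1, 1, 1, 1]
adjacency = [(i, i + 1) for i in range(7)]
print("vertex imbalance:", imbalance(vertex_part, 2))
print("cut edges:", cut_edges(adjacency, vertex_part))
```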
Abstract:
This article presents AcquaNetXL, a tool for allocation analysis in complex water resource systems, developed as a spreadsheet in which a linear optimization model and a nonlinear one were incorporated. AcquaNetXL keeps the concepts and attributes of a decision support system: it streamlines communication between the user and the computer, facilitates the understanding and formulation of the problem and the interpretation of the results, and supports the decision-making process, turning it into a clear and organized one. The performance of the algorithms used for solving the water allocation problems was satisfactory, especially for the linear model.
Abstract:
In this study, twenty hydroxylated and acetoxylated 3-phenylcoumarin derivatives were evaluated as inhibitors of immune complex-stimulated neutrophil oxidative metabolism and possible modulators of the inflammatory tissue damage found in type III hypersensitivity reactions. By using lucigenin- and luminol-enhanced chemiluminescence assays (CL-luc and CL-lum, respectively), we found that the 6,7-dihydroxylated and 6,7-diacetoxylated 3-phenylcoumarin derivatives were the most effective inhibitors. Different structural features of the other compounds determined CL-luc and/or CL-lum inhibition. The 2D-QSAR analysis suggested the importance of hydrophobic contributions to explain these effects. In addition, a statistically significant 3D-QSAR model built applying GRIND descriptors allowed us to propose a virtual receptor site considering pharmacophoric regions and mutual distances. Furthermore, the 3-phenylcoumarins studied were not toxic to neutrophils under the assessed conditions. (C) 2007 Elsevier Masson SAS. All rights reserved.
Abstract:
This paper addresses robust model-order reduction of a high-dimensional nonlinear partial differential equation (PDE) model of a complex biological process. Based on a nonlinear, distributed parameter model of the same process, which was validated against experimental data from an existing pilot-scale BNR activated sludge plant, we developed a state-space model with 154 state variables in this work. A general algorithm for robustly reducing the nonlinear PDE model is presented, and based on an investigation of five state-of-the-art model-order reduction techniques, we are able to reduce the original model to one with only 30 states without incurring pronounced modelling errors. The singular perturbation approximation balanced truncation technique is found to give the lowest modelling errors in low frequency ranges and hence is deemed most suitable for controller design and other real-time applications. (C) 2002 Elsevier Science Ltd. All rights reserved.
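For readers unfamiliar with balanced model-order reduction, the sketch below implements plain square-root balanced truncation on a small random stable system. Note that the paper's preferred variant, singular perturbation approximation, residualises rather than discards the weak states so as to preserve the DC gain, which this sketch does not do; the toy system is an assumption for illustration.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, svd

def balanced_truncation(A, B, C, r):
    """Square-root balanced truncation of a stable LTI system (A, B, C),
    keeping the r states with the largest Hankel singular values."""
    Wc = solve_continuous_lyapunov(A, -B @ B.T)      # controllability Gramian
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)    # observability Gramian
    Lc, Lo = np.linalg.cholesky(Wc), np.linalg.cholesky(Wo)
    U, hsv, Vt = svd(Lo.T @ Lc)                      # Hankel singular values
    S = np.diag(hsv[:r] ** -0.5)
    T = Lc @ Vt[:r].T @ S                            # projection matrices
    Tinv = S @ U[:, :r].T @ Lo.T
    return Tinv @ A @ T, Tinv @ B, C @ T, hsv

# Toy stable system with 6 states reduced to 2 (illustrative only; the
# paper's model has 154 states).
rng = np.random.default_rng(1)
A = rng.normal(size=(6, 6))
A -= (np.abs(np.linalg.eigvals(A).real).max() + 1.0) * np.eye(6)  # force stability
B = rng.normal(size=(6, 1))
C = rng.normal(size=(1, 6))
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
print("Hankel singular values:", np.round(hsv, 4))
```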
Abstract:
In order to understand the earthquake nucleation process, we need to understand the effective frictional behavior of faults with complex geometry and fault gouge zones. One important aspect of this is the interaction between the friction law governing the behavior of the fault at the microscopic level and the resulting macroscopic behavior of the fault zone. Numerical simulations offer a possibility to investigate the behavior of faults on many different scales and thus provide a means to gain insight into fault zone dynamics on scales which are not accessible to laboratory experiments. Numerical experiments have been performed to investigate the influence of the geometric configuration of faults, with rate- and state-dependent friction at the particle contacts, on the effective frictional behavior of these faults. The numerical experiments are designed to be similar to laboratory experiments by Dieterich and Kilgore (1994), in which a slide-hold-slide cycle was performed between two blocks of material and the resulting peak friction was plotted versus holding time. Simulations with a flat fault without fault gouge have been performed to verify the implementation; these have shown close agreement with comparable laboratory experiments. The simulations performed with a fault containing fault gouge have demonstrated a strong dependence of the critical slip distance Dc on the roughness of the fault surfaces and are in qualitative agreement with laboratory experiments.
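The healing trend referred to above can be illustrated with the Dieterich (ageing) form of rate- and state-dependent friction. The sketch below estimates peak friction after holds of increasing duration using generic laboratory-scale parameters; it ignores the elastic loading system and is not a reproduction of the paper's particle-based simulations.

```python
import numpy as np

# Rate-and-state friction with the Dieterich (ageing) state evolution law:
#   mu = mu0 + a*ln(V/V0) + b*ln(V0*theta/Dc),   d(theta)/dt = 1 - V*theta/Dc.
# During a hold (V ~ 0) the state variable grows as theta ~ theta_ss + t_hold,
# so peak friction on re-slide grows roughly with log(hold time), the trend
# reported by Dieterich and Kilgore (1994). Parameters are generic, not the
# values used in the simulations.

mu0, a, b = 0.6, 0.010, 0.015
V0 = 1e-6        # reference slip rate, m/s
Dc = 1e-5        # critical slip distance, m
theta_ss = Dc / V0                                 # steady-state theta at V0

for t_hold in [1.0, 10.0, 100.0, 1000.0]:          # hold times in seconds
    theta = theta_ss + t_hold                      # ageing during the hold
    mu_peak = mu0 + a * np.log(V0 / V0) + b * np.log(V0 * theta / Dc)
    print(f"hold {t_hold:7.0f} s  ->  peak friction {mu_peak:.4f}")
```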
Abstract:
This article is an introduction to the theory of the deconstructive paradigm of cooperative learning. Hundreds of studies provide evidence that cooperative learning structures and processes increase academic performance, strengthen lifelong learning competences, and develop each student's social and personal competences more effectively and more fairly than traditional learning structures in schools. Facing the challenges of our educational systems, it is worth elaborating the theoretical framework of the cooperative learning discourse of the last 40 years from a practical standpoint within its theoretical and methodological context. In recent decades, the cooperative discourse has elaborated the practical and theoretical elements of cooperative learning structures and processes. We summarise these elements in order to understand what kinds of structural changes can make real differences in teaching and learning practice. The basic principles of cooperative structures, cooperative roles and cooperative attitudes are the main elements briefly described here, so as to create a framework for the theoretical and practical understanding of how the elements of cooperative learning can be brought into our classroom practice. From my perspective, this complex theory of cooperative learning can be understood as a deconstructive paradigm that provides pragmatic answers to the questions of our everyday educational practice, from the classroom level to the level of the educational system, focused on dismantling hierarchical and antidemocratic learning structures while creating cooperative ones.
Abstract:
Theory building is one of the most crucial challenges faced by basic, clinical and population research, which form the scientific foundations of health practices in contemporary societies. The objective of this study is to propose a Unified Theory of Health-Disease as a conceptual tool for modeling health-disease-care in the light of complexity approaches. With this aim, the epistemological basis of theoretical work in the health field and concepts of complexity theory as they relate to health problems are discussed. Secondly, the concepts of model-object, multi-planes of occurrence, modes of health and the disease-illness-sickness complex are introduced and integrated into a unified theoretical framework. Finally, in the light of recent epistemological developments, the concept of Health-Disease-Care Integrals is updated as a complex reference object fit for modeling health-related processes and phenomena.