936 results for Target Field Method
Abstract:
In this study, the stiffness of an array of mineralized collagen fibrils modeled with a mean field method was validated experimentally at two site-matched levels of tissue hierarchy using mineralized turkey leg tendons (MTLT). The applied modeling approaches made it possible to model the properties of this unidirectional tissue from the nanoscale (mineralized collagen fibrils) to the macroscale (mineralized tendon). At the microlevel, the indentation moduli obtained with a mean field homogenization scheme were compared to experimental values obtained with microindentation. At the macrolevel, the macroscopic stiffness predicted with micro finite element (μFE) models was compared to the experimental stiffness measured with uniaxial tensile tests. Elastic properties of the elements in the μFE models were assigned either from the mean field model or from two-directional microindentations. Quantitatively, the indentation moduli are predicted well by the mean-field models. Local stiffness trends within specific tissue morphologies are very weak, suggesting that additional factors are responsible for the stiffness variations. At the macrolevel, the μFE models underestimate the macroscopic stiffness compared to the tensile tests, but the correlations are strong.
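The mean field homogenization referenced here estimates tissue-level stiffness from the properties and volume fractions of the constituent phases. As a minimal sketch only (an isotropic Mori-Tanaka estimate with spherical inclusions and placeholder moduli; the study itself treats anisotropic mineralized collagen fibrils, and none of the numbers below are from the paper):

```python
# Minimal sketch: Mori-Tanaka estimate for an isotropic two-phase composite
# with spherical inclusions. Illustrative only; the cited study uses an
# anisotropic scheme for mineralized collagen fibrils, and the numbers
# below are placeholders, not values from the paper.

def mori_tanaka(K_m, G_m, K_i, G_i, f_i):
    """Effective bulk/shear moduli of a matrix (m) with inclusions (i).

    K_*: bulk moduli [GPa], G_*: shear moduli [GPa], f_i: inclusion fraction.
    """
    # Auxiliary factors from Eshelby's solution for spherical inclusions
    alpha = 3.0 * K_m / (3.0 * K_m + 4.0 * G_m)
    beta = 6.0 * (K_m + 2.0 * G_m) / (5.0 * (3.0 * K_m + 4.0 * G_m))
    K_eff = K_m + f_i * (K_i - K_m) / (1.0 + (1.0 - f_i) * alpha * (K_i - K_m) / K_m)
    G_eff = G_m + f_i * (G_i - G_m) / (1.0 + (1.0 - f_i) * beta * (G_i - G_m) / G_m)
    return K_eff, G_eff

if __name__ == "__main__":
    # Placeholder phases: soft collagen-like matrix, stiff mineral inclusions
    K, G = mori_tanaka(K_m=3.4, G_m=1.2, K_i=82.6, G_i=44.9, f_i=0.42)
    E = 9.0 * K * G / (3.0 * K + G)  # effective Young's modulus [GPa]
    print(f"K_eff = {K:.2f} GPa, G_eff = {G:.2f} GPa, E_eff = {E:.2f} GPa")
```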
Abstract:
A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics. The method is widely used, particularly in the growing field of graphical models. Researchers from disciplines such as statistical physics, computer science, and mathematical statistics are studying ways to improve this and related methods and are exploring novel application areas. Leading approaches include the variational approach, which goes beyond factorizable distributions to achieve systematic improvements; the TAP (Thouless-Anderson-Palmer) approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models. Bringing together ideas and techniques from these diverse disciplines, this book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.
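As a concrete illustration of the naive mean field idea (a generic sketch, not an example from the book): for a pairwise binary model p(s) proportional to exp(s^T J s / 2 + h^T s) with s_i in {-1, +1}, restricting to factorized distributions yields the self-consistent equations m_i = tanh(h_i + sum_j J_ij m_j), which can be solved by damped fixed-point iteration:

```python
# Naive mean field for a pairwise binary (Ising-like) model:
#   p(s) proportional to exp(s^T J s / 2 + h^T s),  s_i in {-1, +1}.
# The factorized approximation q(s) = prod_i q_i(s_i) yields the
# self-consistent equations  m_i = tanh(h_i + sum_j J_ij m_j).
# Illustrative sketch only; J and h below are arbitrary.
import numpy as np

def mean_field_magnetizations(J, h, iters=500, damping=0.5, tol=1e-8):
    """Damped fixed-point iteration for the naive mean field equations."""
    m = np.zeros_like(h)
    for _ in range(iters):
        m_new = np.tanh(h + J @ m)
        if np.max(np.abs(m_new - m)) < tol:
            return m_new
        m = damping * m + (1.0 - damping) * m_new  # damped update for stability
    return m

rng = np.random.default_rng(0)
n = 10
J = rng.normal(scale=0.1, size=(n, n))
J = (J + J.T) / 2.0          # symmetric couplings
np.fill_diagonal(J, 0.0)     # no self-coupling
h = rng.normal(scale=0.5, size=n)
print(mean_field_magnetizations(J, h))
```

The TAP approach mentioned above refines these same equations for this model by subtracting an Onsager reaction term, m_i * sum_j J_ij^2 * (1 - m_j^2), inside the tanh.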
Abstract:
A catastrophic red tide event occurred in the Strait of Hormuz, the Persian Gulf, and the Gulf of Oman from late summer 2008 to spring 2009. With its devastating effects, the phenomenon shocked all the countries bordering the Persian Gulf and the Gulf of Oman and caused considerable losses to the fishery industries, tourism, and trade economy of the region. During a maritime cruise carried out by the Persian Gulf and Gulf of Oman Ecological Research Institute, field data, including temperature, salinity, chlorophyll-a, dissolved oxygen, and algal density, were obtained for this research. Satellite information was received from the MODIS, MERIS, and SeaWiFS sensors. Temperature and surface chlorophyll images were obtained and compared with the field data and with data from the PROBE model. The results indicated that with the occurrence of harmful algal blooms (HAB), the chlorophyll-a and dissolved oxygen contents increased in the surface water. Maximum algal density was seen along the northern coasts of the Strait of Hormuz; lower algal densities were detected in deep and surface offshore waters. Our results show that the algal bloom resulted from a drop in seawater temperature, water circulation, and environmental pollution caused by industrial and urban sewage entering the coastal waters in this region of the Persian Gulf. The red tide started in the Strait of Hormuz and eventually covered about 140,000 km2 of the Persian Gulf and the entire Strait of Hormuz, persisting for 10 months, a record among documented algal blooms worldwide. The temperature and chlorophyll satellite images were consistent with the values measured by the field method, indicating that satellite measurements have acceptable precision and can be used in sea monitoring and modeling.
Abstract:
The solvent effects on the low-lying absorption spectrum and on the ¹⁵N chemical shielding of pyrimidine in water are calculated using combined and sequential Monte Carlo simulations and quantum mechanical calculations. Special attention is devoted to the solute polarization. This is included by a previously developed iterative procedure in which the solute is electrostatically equilibrated with the solvent. In addition, we verify the simple yet unexplored alternative of combining the polarizable continuum model (PCM) with the hybrid QM/MM method: PCM is used to obtain the average solute polarization, which is then included in the MM part of the sequential QM/MM methodology (PCM-MM/QM). These procedures are compared and further used in the discrete and explicit solvent models. The PCM polarization implemented in the MM part seems to generate a very good description of the average solute polarization, leading to very good results for the n-π* excitation energy and the ¹⁵N nuclear chemical shielding of pyrimidine in an aqueous environment. The best results obtained here, using the solute pyrimidine surrounded by 28 explicit water molecules embedded in the electrostatic field of the remaining 472 molecules, give statistically converged values for the low-lying n-π* absorption transition in water of 36 900 ± 100 cm⁻¹ (PCM polarization) and 36 950 ± 100 cm⁻¹ (iterative polarization), in excellent agreement with one another and with the experimentally observed band maximum at 36 900 cm⁻¹. For the ¹⁵N nuclear shielding, the corresponding gas-to-water chemical shifts, obtained using the solute pyrimidine surrounded by 9 explicit water molecules embedded in the electrostatic field of the remaining 491 molecules, give statistically converged values of 24.4 ± 0.8 and 28.5 ± 0.8 ppm, compared with the inferred experimental value of 19 ± 2 ppm. Considering the simplicity of the PCM relative to the iterative polarization, this is an important result, and the computational savings point to the possibility of dealing with larger solute molecules. The PCM-MM/QM approach reconciles the simplicity of the PCM model with the reliability of combined QM/MM approaches.
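At its core, the iterative polarization procedure described above is a self-consistency loop: the solute's charge distribution polarizes the solvent description, which in turn repolarizes the solute until the two are in equilibrium. A runnable caricature of that loop (an Onsager-style point dipole in a spherical cavity; all parameters are illustrative placeholders, not the paper's Monte Carlo/QM machinery):

```python
# Toy version of iterative solute polarization: an Onsager point dipole in a
# spherical cavity. The solute dipole mu induces a reaction field, which
# polarizes the solute (linear response via polarizability alpha) until
# self-consistency. Parameters are illustrative, not from the paper.

def iterate_polarization(mu0, alpha, eps=78.4, a=6.9, iters=100, tol=1e-10):
    """mu0: gas-phase dipole, alpha: polarizability, a: cavity radius (a.u.)."""
    # Onsager reaction-field factor for a sphere of radius a
    f = 2.0 * (eps - 1.0) / ((2.0 * eps + 1.0) * a**3)
    mu = mu0
    for _ in range(iters):
        mu_new = mu0 + alpha * f * mu  # solute responds to its own reaction field
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

mu_solvated = iterate_polarization(mu0=0.91, alpha=59.0)
print(f"converged in-solvent dipole: {mu_solvated:.4f} a.u.")
```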
Abstract:
Considering that the importance of cancer/testis (CT) antigens in multiple myeloma (MM) biology is still under investigation, the present study aimed to: (1) identify genes differentially expressed in MM using microarray analysis of plasma cell samples, separated according to the number of expressed CTs; (2) examine possible pathways related to MM pathogenesis; (3) validate the expression of candidate genes by quantitative real-time PCR (RQ-PCR). Three predominantly positive samples (>6 expressed CTs), including the U266 cell line, and three predominantly negative samples (0 or 1 of the 13 analyzed CT antigens expressed) were submitted for microarray analysis. Validation by RQ-PCR in 24 MM samples showed that the ITGA5 gene was downregulated in predominantly positive samples (>6 expressed CTs, p = 0.0030) and in tumor versus normal plasma cells (p = 0.0182). The RhoD gene was overexpressed in tumor plasma cells compared to normal plasma cells (p = 0.0339). The results of the microarray analysis corroborate the hypothesis that MM can be separated into predominantly positive and predominantly negative expression groups. The differential expression of ITGA5 and RhoD suggests disruption of the focal adhesion pathway in MM and offers a new target field to be explored in this disease.
Abstract:
The usual high cost of commercial codes, together with some technical limitations, clearly limits the employment of numerical modelling tools in both industry and academia. Consequently, the number of companies that use numerical codes is limited, and a lot of effort is put into the development and maintenance of in-house academic codes. Having in mind the potential of numerical modelling tools as a design aid for both products and processes, different research teams have been contributing to the development of open source codes/libraries. In this framework, any individual can take advantage of the available code capabilities and/or implement additional features based on specific needs. These types of codes are usually developed by large communities, which provide improvements and new features in their specific fields of research, thus significantly accelerating the code development process. Among others, the OpenFOAM® multi-physics computational library, developed by a very large and dynamic community, nowadays comprises several features usually only available in its commercial counterparts, e.g. dynamic meshes, a large diversity of complex physical models, parallelization, and multiphase models, to name just a few. This computational library is developed in C++ and makes use of most of the language's capabilities to facilitate the implementation of new functionalities. In the field of computational rheology, OpenFOAM® solvers were recently developed to deal with the most relevant differential viscoelastic rheological models, and stabilization techniques are currently being verified. This work describes the implementation of a new solver in the OpenFOAM® library able to cope with integral viscoelastic models based on the deformation field method. The implemented solver is verified through comparison of the predicted results with analytical solutions, with results published in the literature, and by using the Method of Manufactured Solutions.
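In the deformation field method that the new solver implements, the memory integral of the constitutive model is discretized by tracking a set of deformation fields, each created at a past time and evolved with the flow. A minimal sketch of that idea for the Lodge rubber-like liquid in homogeneous simple shear (illustrative Python, not the authors' OpenFOAM® C++ implementation; model parameters are arbitrary):

```python
# Minimal sketch of the deformation field method for an integral viscoelastic
# model (here the Lodge rubber-like liquid) in homogeneous simple shear.
# A new deformation field (Finger tensor B, initialized to I) is created at
# each time step; all fields evolve by dB/dt = L.B + B.L^T, and the stress is
# the memory-weighted quadrature over the fields' ages. Illustrative only.
import numpy as np

G, lam = 1.0, 1.0            # modulus and relaxation time (arbitrary units)
gamma_dot = 1.0              # constant shear rate, startup from rest at t = 0
L = np.array([[0.0, gamma_dot], [0.0, 0.0]])   # velocity gradient
dt, t_end = 0.01, 6.0

I = np.eye(2)
fields = []                  # list of [creation_time, B] deformation fields
t = 0.0
while t < t_end:
    fields.append([t, I.copy()])             # field created "now": B = I
    for f in fields:
        B = f[1]
        f[1] = B + dt * (L @ B + B @ L.T)    # explicit Euler update of B(t, t')
    t += dt

# Stress: tau = sum over fields of M(age) * (B - I) * dt,
# with memory function M(s) = (G / lam) * exp(-s / lam)
tau = sum((G / lam) * np.exp(-(t - tc) / lam) * (B - I) * dt for tc, B in fields)
print(f"tau_xy = {tau[0, 1]:.4f}  (steady-state Lodge value: {G * lam * gamma_dot:.4f})")
```

In a full flow solver each deformation field is a tensor field transported with the velocity field, and fields older than several relaxation times contribute negligibly to the memory integral, so they can be discarded and their storage recycled.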
Abstract:
A new practical method to generate a subspace of active coordinates for quantum dynamics calculations is presented. These reduced coordinates are obtained as the normal modes of an analytical quadratic representation of the energy difference between excited and ground states within the complete active space self-consistent field method. At the Franck-Condon point, the largest negative eigenvalues of this Hessian correspond to the photoactive modes: those that reduce the energy difference and lead to the conical intersection. Eigenvalues close to zero correspond to bath modes, while modes with large positive eigenvalues are photoinactive vibrations, which increase the energy difference. The efficacy of running quantum dynamics in the subspace of the photoactive modes is illustrated with the photochemistry of benzene, where theoretical simulations are designed to assist optimal control experiments.
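The classification described above amounts to diagonalizing the Hessian of the excited-ground energy difference and binning its eigenvalues by sign and magnitude. A schematic sketch (the Hessian and cutoff below are placeholders; a real application would construct the Hessian from CASSCF energy-difference data at the Franck-Condon point):

```python
# Sketch: classify normal modes of the Hessian of the excited-ground state
# energy difference by eigenvalue, as described in the abstract. The Hessian
# here is a random symmetric placeholder, and the threshold is arbitrary.
import numpy as np

rng = np.random.default_rng(1)
n_modes = 12
H = rng.normal(size=(n_modes, n_modes))
H = (H + H.T) / 2.0                      # symmetrize the placeholder Hessian

eigvals, eigvecs = np.linalg.eigh(H)     # normal modes of the difference Hessian
thresh = 0.5                             # arbitrary magnitude cutoff for "bath"

photoactive = eigvecs[:, eigvals < -thresh]   # drive toward the conical intersection
bath = eigvecs[:, np.abs(eigvals) <= thresh]  # weakly coupled spectator modes
photoinactive = eigvecs[:, eigvals > thresh]  # increase the energy difference

print(f"{photoactive.shape[1]} photoactive, {bath.shape[1]} bath, "
      f"{photoinactive.shape[1]} photoinactive modes")
```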
Abstract:
This Master's thesis develops the cable database of the probabilistic fire risk analysis of the Loviisa nuclear power plant to meet future challenges. To support the database development, probabilistic risk analysis is reviewed, particularly with regard to fire risk analysis. For more practical development work, the cable databases currently in use at the power plant are examined: PSA-ELTIE, compiled for fire risk research; the maintenance information management system LOMAX; the archives of the electrical and automation design units; and the database of the automation renewal project. To determine the more practical characteristics of the database, a field inspection method, which is the primary cable mapping method, was tested at the power plant. Based on the review of the databases, LOMAX, PSA-ELTIE, and a new database were considered as alternative future databases. LOMAX, which requires fewer changes than the other alternatives, is proposed as the future database option. Such a widely used shared database makes the data more easily and reliably available to everyone who needs it and editable by experts, which also helps ensure that the data remain correct and up to date. For the upcoming LOMAX update, the necessary additions of data fields and improvements to the cable hierarchy have been proposed so that it can be adopted as the cable database.
Abstract:
In this study, we analyze the main problems encountered when sociolinguistic interviews are used as a method for obtaining authentic samples of linguistic performance. This issue stems from the very nature of the methodology employed, given the observer's paradox (Labov 1972), and it imposes the need to reflect on the advantages and disadvantages inherent in the research instrument used. With this in mind, our main objective is to answer a question that arises whenever the sociolinguistic interview is discussed: how can samples of spontaneous speech be obtained within the systematic observation of an interview? To address this problem, we reviewed and analyzed a sample of twenty semi-structured interviews (25 hours of recordings) from the Corpus Oral et Sonore de l'Espagnol Rural (COSER). The study of sociolinguistic interviews as a scientific method shows, as its main result, that the strategies used to reduce the effects of the observer's paradox should include the tactic of having the informant talk about an object of particular affective value to them, thereby breaking down the formal structure of the interview and creating an emotional situation in which emotion neutralizes the speaker's linguistic self-consciousness and monitoring of their own speech. In this way, the speaker's attention focuses on the object itself rather than on formal speech, and samples of a more spontaneous and colloquial style can be obtained.
Abstract:
The rovibrational partition function of CH4 was calculated in the temperature range 100-1000 K using well-converged energy levels that were calculated by vibrational-rotational configuration interaction using the Watson Hamiltonian for total angular momenta J = 0-50 and the MULTIMODE computer program. The configuration state functions are products of ground-state occupied and virtual modals obtained using the vibrational self-consistent field method. The Gilbert and Jordan potential energy surface was used for the calculations. The resulting partition function was used to test the harmonic oscillator approximation and the separable-rotation approximation. The harmonic oscillator, rigid-rotator approximation is in error by a factor of 2.3 at 300 K, but we also propose a separable-rotation approximation that is accurate to within 2% from 100 to 1000 K.
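For reference, the harmonic-oscillator, rigid-rotator partition function tested above is a product of standard closed-form factors. A sketch for CH4 (frequencies and the rotational constant are approximate literature values, not data from the paper):

```python
# Sketch: harmonic-oscillator / rigid-rotor (HO-RR) rovibrational partition
# function for CH4, the approximation the abstract reports to be in error by
# a factor of ~2.3 at 300 K against converged levels. Frequencies and the
# rotational constant are approximate literature values, not the paper's data.
import math

KB_CM = 0.695034800          # Boltzmann constant in cm^-1 / K
freqs = [2917.0] + [1534.0] * 2 + [3019.0] * 3 + [1306.0] * 3  # 9 modes, cm^-1
B = 5.24                     # rotational constant of CH4, cm^-1 (spherical top)
sigma = 12                   # symmetry number of tetrahedral CH4

def q_vib(T, zero_point_referenced=True):
    """Harmonic vibrational partition function (product over modes)."""
    q = 1.0
    for nu in freqs:
        x = nu / (KB_CM * T)
        q *= 1.0 / (1.0 - math.exp(-x))
        if not zero_point_referenced:
            q *= math.exp(-x / 2.0)   # measure energy from the potential minimum
    return q

def q_rot(T):
    """Classical rigid-rotor partition function for a spherical top."""
    return math.sqrt(math.pi) / sigma * (KB_CM * T / B) ** 1.5

for T in (100.0, 300.0, 1000.0):
    print(f"T = {T:6.0f} K   q_vib = {q_vib(T):10.4f}   q_rot = {q_rot(T):12.2f}")
```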
Abstract:
We studied superclusters of galaxies in a volume-limited sample extracted from the Sloan Digital Sky Survey Data Release 7 and from mock catalogues based on a semi-analytical model of galaxy evolution in the Millennium Simulation. A density field method was applied to a sample of galaxies brighter than M_r = -21 + 5 log h_100 to identify superclusters, taking into account selection and boundary effects. In order to evaluate the influence of the threshold density, we chose two thresholds: the first maximizes the number of objects (D1) and the second constrains the maximum supercluster size to ~120 h⁻¹ Mpc (D2). We performed a morphological analysis using Minkowski Functionals, based on a parameter that increases monotonically from filaments to pancakes. An anticorrelation was found between supercluster richness (and total luminosity or size) and the morphological parameter, indicating that filamentary structures tend to be richer, larger, and more luminous than pancakes in both the observed and mock catalogues. We also used the mock samples to compare supercluster morphologies identified in position and velocity spaces, concluding that our morphological classification is not biased by peculiar velocities. Monte Carlo simulations designed to investigate the reliability of our results with respect to random fluctuations show that these results are robust. Our analysis indicates that filaments and pancakes present different luminosity and size distributions.
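A density field method of the kind used here smooths the galaxy distribution onto a grid, thresholds the resulting density field, and identifies superclusters as connected overdense regions. A minimal sketch (grid size, smoothing length, and threshold are illustrative placeholders, not the paper's D1/D2 choices):

```python
# Sketch of a density field supercluster finder: smooth galaxy positions onto
# a grid, threshold the density field, and label connected overdense regions.
# Grid size, smoothing length, and threshold are illustrative placeholders.
import numpy as np
from scipy.ndimage import gaussian_filter, label

rng = np.random.default_rng(42)
galaxies = rng.uniform(0.0, 100.0, size=(5000, 3))   # mock positions, h^-1 Mpc

# Deposit galaxies onto a 3D grid (nearest grid point), then smooth
n_cells, box = 64, 100.0
grid, _ = np.histogramdd(galaxies, bins=(n_cells,) * 3,
                         range=[(0.0, box)] * 3)
density = gaussian_filter(grid, sigma=2.0)           # smoothing in cell units

# Threshold relative to the mean density and label connected regions
overdense = density > 2.0 * density.mean()
labels, n_superclusters = label(overdense)
print(f"identified {n_superclusters} candidate superclusters")
```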
Abstract:
The general statement that birds are recorded more often on morning than on afternoon counts is quite common and widespread among ornithologists. Although many investigators have reported temporal variation in bird detections using point counts in temperate regions, few studies with the same objectives have been conducted in Neotropical habitats or have used transect counts as the field method. We used transect counts to test the hypothesis that birds are evenly recorded across times of day in a predominantly open Cerrado landscape in southeastern Brazil. Although the differences were not always significant, the numbers of species and individuals were consistently greater during the morning counts, which corroborates the view that birds are more detectable during this time of day. However, a few families, as well as a small percentage of species, were more likely to be recorded during one of the two periods we analyzed. Our results suggest that morning counts should detect higher numbers of both species and individuals in our study area, but specific taxa show distinct patterns of detection that should be acknowledged prior to sampling.
Abstract:
Crystallographic screening has been used to identify new inhibitors of potential targets for drug development. Here, we describe the application of crystallographic screening to assess the structural basis of ligand specificity against a protein target. The method is efficient and yields detailed crystallographic information. Its utility is demonstrated in a study of the structural basis of ligand specificity for human purine nucleoside phosphorylase (PNP). Purine nucleoside phosphorylase catalyzes the phosphorolysis of the N-ribosidic bonds of purine nucleosides and deoxynucleosides. This enzyme is a target for inhibitor development aimed at modulating the T-cell immune response and has been the subject of extensive structure-based drug design. This methodology may help in the future development of a new generation of PNP inhibitors.