954 results for Ligand-steered Modeling Method


Relevance:

100.00%

Publisher:

Abstract:

Dissertation presented in partial fulfilment of the requirements for the degree of Master in Geographic Information Science and Systems

Relevance:

100.00%

Publisher:

Abstract:

Résumé: The construction of a second metro line (M2) from 2004, passing through the city centre of Lausanne, provided the opportunity to develop a methodology for microgravity surveys in a disturbed urban environment. Terrain corrections take on particular importance in such a setting, because numerous non-geological objects of anthropogenic origin, such as all kinds of empty basements, perturb the gravity measurements. The preliminary civil-engineering studies for this metro supplied a large amount of cadastral information, notably on building outlines, the planned position of the M2 tube, basement depths in the vicinity of the tube, and also on the geology encountered along the M2 corridor (derived from the lithological data of geotechnical boreholes). The basement plans were derived from the building outlines in a GIS (Geographic Information System), while a door-to-door survey was needed to measure basement heights. Starting from an existing DTM (Digital Terrain Model) on a one-metre grid, it was then possible to update it with the voids that these basements represent. The gravity measurement cycles were processed in Access databases, allowing greater control of the data, faster processing, and easier retroactive terrain correction, in particular when the topography was updated during construction. The Caroline district (between the Bessières bridge and Place de l'Ours) was chosen as the study area, because the pre- and post-excavation phases of the M2 tunnel both fell chronologically within this thesis. This allowed us to carry out two gravity surveys (before excavation in summer 2005 and after excavation in summer 2007).
These repeat surveys allowed us to test our modeling of the tunnel: by comparing the measurements of the two campaigns with the gravity response of the tube model, discretized into rectangular prisms, we were able to validate our modeling method. The modeling we developed makes it possible to build the shape of the object in detail, with the possibility of crossing geological interfaces and the topographic surface several times. This type of modeling can be applied to any linear anthropogenic structure.
Abstract: The construction of a second underground line (M2) in 2004 in downtown Lausanne was the opportunity to develop a methodology for microgravity surveying in an urban environment. Terrain corrections take on special meaning in such an environment: many non-geological anthropogenic objects, such as basements, perturb the gravity measurements. Civil-engineering studies provided a large amount of cadastral information, including building outlines, the M2 tube position, the depths of some basements in the vicinity of the M2 corridor, and the geology encountered along the corridor (from borehole lithological data). The geometry of the basements was deduced from the building outlines in a GIS (Geographic Information System), and a field investigation was carried out to measure or estimate basement heights. A DEM (Digital Elevation Model) of the city of Lausanne was updated with the basement voids. Gravity cycles were processed in an Access database to enable greater control of the data, faster processing, and easier retroactive terrain correction when updates of the topographic surface become available. The Caroline area (between the Bessières bridge and Place de l'Ours) was chosen as the study area; it was of particular interest because both the pre- and post-excavation phases fell within this thesis. This allowed us to conduct two gravity surveys (before excavation in summer 2005 and after excavation in summer 2007). These re-occupations enabled us to test our modeling of the tube: by comparing the difference between the two surveys with the gravity response of our model (rectangular prisms), we were able to validate our modeling. The modeling method we developed allows us to construct the detailed shape of an object, with the possibility of crossing geological interfaces and the surface topography several times. This type of modeling can be applied to any linear anthropogenic structure.
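The validation above compares the measured survey differences with the gravity response of the tube discretized into rectangular prisms. The vertical attraction of a single prism has a closed-form corner sum (after Nagy, 1966, with z positive downward); the sketch below is a minimal illustration, in which the tunnel dimensions and the density contrast are hypothetical round numbers, not values from the thesis:

```python
import math

G = 6.674e-11  # gravitational constant (m^3 kg^-1 s^-2)

def prism_gz(x1, x2, y1, y2, z1, z2, rho):
    """Vertical gravity effect (m/s^2) at the origin of the rectangular
    prism [x1,x2] x [y1,y2] x [z1,z2], z positive downward, with density
    contrast rho (kg/m^3). Closed-form corner sum after Nagy (1966)."""
    g = 0.0
    for i, x in enumerate((x1, x2)):
        for j, y in enumerate((y1, y2)):
            for k, z in enumerate((z1, z2)):
                r = math.sqrt(x * x + y * y + z * z)
                term = (x * math.log(y + r)
                        + y * math.log(x + r)
                        - z * math.atan2(x * y, z * r))
                g += (-1) ** (i + j + k) * term
    return G * rho * g

# A 6 m x 6 m tunnel section, 100 m long, roof at 5 m depth, modeled as
# a mass deficit (density contrast -2000 kg/m^3 against the surrounding
# ground). 1 microGal = 1e-8 m/s^2.
dg = prism_gz(-3.0, 3.0, -50.0, 50.0, 5.0, 11.0, -2000.0)
print(f"{dg * 1e8:.1f} microGal")
```

Summing this response over all prisms of the discretized tube gives the model signal that was compared with the repeated surveys.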

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this work was to determine the suitability of a commercial dynamics simulation software package (Adams) for modeling an overhead bridge crane. The subject of the work was a double-girder bridge crane located at KCI's premises in Hyvinkää; the crane had a span of about 19.5 metres and a lifting capacity of 16 tonnes. The modeling focused on the dynamics of the crane and on the guide forces at the crane's travel wheels, and the simulation results were verified by measurements. Because the model was to be as simple as possible, only the main girders and the ropes were modeled as flexible; the other parts were modeled as rigid. Simplicity was sought because the model was intended to serve as the basis of a component library for later use. The results showed that the model is well suited to modeling the dynamics of the crane: the results obtained from the model corresponded well to the measured motions. The guide forces, however, could not be verified, as the measurement method used proved unsuitable.

Relevance:

100.00%

Publisher:

Abstract:

Modeling methods to derive the 3D structure of proteins have recently been developed. Protein homology modeling, also known as comparative protein modeling, is currently the most accurate protein modeling method: in the same amount of time, it can produce useful models for about an order of magnitude more protein sequences than experimental structure determination. All current protein homology-modeling methods consist of four sequential steps: fold assignment and template selection, template-target alignment, model building, and model evaluation. In this paper we discuss in some detail the protein homology-modeling paradigm, its predictive power and its limitations.
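Of the four steps, the template-target alignment largely determines model quality. The classical global-alignment algorithm behind this step is Needleman-Wunsch dynamic programming; the toy sketch below uses a flat match/mismatch score (real pipelines use substitution matrices such as BLOSUM and affine gap penalties):

```python
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
    """Global alignment (Needleman-Wunsch) with a linear gap penalty.
    Returns (score, aligned_a, aligned_b)."""
    n, m = len(a), len(b)
    # F[i][j] = best score aligning a[:i] with b[:j]
    F = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        F[i][0] = i * gap
    for j in range(1, m + 1):
        F[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            F[i][j] = max(F[i - 1][j - 1] + s,
                          F[i - 1][j] + gap,
                          F[i][j - 1] + gap)
    # Traceback from the bottom-right corner.
    out_a, out_b, i, j = [], [], n, m
    while i > 0 or j > 0:
        s = match if i > 0 and j > 0 and a[i - 1] == b[j - 1] else mismatch
        if i > 0 and j > 0 and F[i][j] == F[i - 1][j - 1] + s:
            out_a.append(a[i - 1]); out_b.append(b[j - 1]); i -= 1; j -= 1
        elif i > 0 and F[i][j] == F[i - 1][j] + gap:
            out_a.append(a[i - 1]); out_b.append('-'); i -= 1
        else:
            out_a.append('-'); out_b.append(b[j - 1]); j -= 1
    return F[n][m], ''.join(reversed(out_a)), ''.join(reversed(out_b))

score, qa, qb = needleman_wunsch("HEAGAWGHEE", "PAWHEAE")
print(score, qa, qb, sep="\n")
```

In a homology-modeling pipeline the aligned residue pairs define which template coordinates are copied onto the target sequence.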

Relevance:

100.00%

Publisher:

Abstract:

The drug discovery process is facing new challenges in the evaluation of lead compounds as the number of newly synthesized compounds increases. The potency of test compounds is most frequently assayed through the binding of the test compound to the target molecule or receptor, or by measuring functional secondary effects caused by the test compound in target model cells, tissues or organisms. Modern homogeneous high-throughput screening (HTS) assays for purified estrogen receptors (ER) utilize various luminescence-based detection methods. Fluorescence polarization (FP) is a standard method for ER ligand-binding assays. It was used here to compare the performance of two-photon excitation of fluorescence (TPFE) with the conventional one-photon excitation method. The TPFE method showed improved dynamics, was found to be comparable with the conventional method, and also held potential for efficient miniaturization. Other luminescence-based ER assays utilize energy transfer from a long-lifetime luminescent label, e.g. lanthanide chelates (Eu, Tb), to a prompt luminescent label, the signal being read in time-resolved mode. As an alternative to this method, a new single-label (Eu) time-resolved detection method was developed, based on quenching of the label by a soluble quencher molecule when the labeled ligand is displaced from the receptor into the solution phase by an unlabeled competing ligand. The new method was benchmarked against the standard FP method: it yielded comparable results and showed a significantly higher signal-to-background ratio than FP. Cell-based functional assays for determining the extent of cell-surface adhesion molecule (CAM) expression, combined with microscopy analysis of the target molecules, would provide improved information content compared to an expression-level assay alone.
In this work, an immune response was simulated by exposing endothelial cells to cytokine stimulation, and the resulting increase in adhesion-molecule expression was analyzed on fixed cells by immunocytochemistry, utilizing specific long-lifetime-luminophore-labeled antibodies against the chosen adhesion molecules. The results showed that the method can be used in a multi-parametric assay of the protein expression levels of several CAMs simultaneously, combined with analysis of the cellular localization of the chosen adhesion molecules by time-resolved luminescence microscopy.
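The FP readout behind the binding assay above is the polarization P = (I_par - I_perp) / (I_par + I_perp); once the free- and fully-bound limits are known, the bound tracer fraction follows by linear interpolation. A minimal sketch with hypothetical intensity readings, assuming equal quantum yields for free and bound tracer:

```python
def polarization(i_par, i_perp):
    """Fluorescence polarization from parallel/perpendicular intensities."""
    return (i_par - i_perp) / (i_par + i_perp)

def fraction_bound(p, p_free, p_bound):
    """Linear estimate of the bound fraction of labeled ligand from a
    measured polarization p, given the free- and fully-bound limits.
    (Assumes equal quantum yield of free and bound tracer.)"""
    return (p - p_free) / (p_bound - p_free)

# Hypothetical readings (arbitrary units): a small, fast-tumbling free
# tracer depolarizes the emission; receptor binding slows the tumbling
# and raises P.
p_free = polarization(120.0, 80.0)   # free tracer limit
p_bound = polarization(150.0, 50.0)  # fully bound limit
p_obs = polarization(140.0, 60.0)    # measured in competition
print(round(fraction_bound(p_obs, p_free, p_bound), 3))
```

In the competitive displacement format described above, a falling bound fraction signals an unlabeled competitor occupying the receptor.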

Relevance:

100.00%

Publisher:

Abstract:

The goal of this Master's thesis was to develop a productized service for the deployment of an activity-based costing system. Deployment here means taking the costing into use, i.e. building the costing model into the activity-based costing system. The work was commissioned by Solenovo Oy, and action research was used as the research method. The work consists of a theoretical framework and an empirical part. The theoretical framework comprises three chapters, each covering a different topic. The first deals with activity-based costing, in particular its strengths, weaknesses and deployment, and its differences from and similarities to traditional cost accounting. The second deals with services and their development; of the available service development methods, the one chosen for this work, the service blueprint, is discussed together with its structure and construction. The third part of the theoretical framework is the productization of services, focused in this work specifically on productizing expert services. As the main result, the most important goal of the work was achieved: a productized service for the deployment of an activity-based costing system was developed. The service was developed using the service blueprint method and productized where applicable. Because this was a completely new service, productization concentrated on productization planning and internal productization. During the work, the requirements and content of the service were defined using modularization, instructions for delivering the service were prepared, the risks and challenges of the different phases of the service were assessed, and the needs for further development of the service were defined.

Relevance:

100.00%

Publisher:

Abstract:

After skin cancer, breast cancer accounts for the second greatest number of cancer diagnoses in women. The etiologies of breast cancer are currently unknown, and there is no generally accepted therapy for preventing it; the best way to improve the prognosis is therefore early detection and treatment. Computer-aided detection (CAD) systems for detecting masses or microcalcifications in mammograms have already been used and proven to be a potentially powerful tool, so radiologists are attracted by the effectiveness of clinical application of CAD systems. Fractal geometry is well suited to describing the complex physiological structures that defy traditional Euclidean geometry, which is based on smooth shapes. The major contributions of this research include the development of:
• a new fractal feature to accurately classify mammograms as normal or abnormal, the latter either (i) with masses (benign or malignant) or (ii) with microcalcifications (benign or malignant);
• a novel fast fractal modeling method to identify the presence of microcalcifications by fractal modeling of mammograms and then subtracting the modeled image from the original mammogram.
The performance of these methods was evaluated using standard statistical analysis methods. The results obtained indicate that the developed methods are highly beneficial for assisting radiologists in making diagnostic decisions. The mammograms for the study were obtained from two online databases, MIAS (Mammographic Image Analysis Society) and DDSM (Digital Database for Screening Mammography).
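A typical fractal feature of the kind referred to above is the box-counting dimension: cover the foreground pixels with boxes of decreasing size s and fit log N(s) against log(1/s). The specific fractal feature developed in the work is not detailed in the abstract, so the sketch below is a generic illustration on synthetic pixel sets:

```python
import math

def box_count(points, size):
    """Number of size x size boxes containing at least one foreground pixel."""
    return len({(x // size, y // size) for x, y in points})

def box_counting_dimension(points, span):
    """Estimate the box-counting dimension of a set of pixel coordinates
    inside a span x span image by least-squares fitting log N(s) vs log(1/s)."""
    sizes = []
    s = span // 2
    while s >= 1:          # box sizes span/2, span/4, ..., 1
        sizes.append(s)
        s //= 2
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(box_count(points, s)) for s in sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Sanity check on a filled 64 x 64 square: the dimension should be 2.
square = [(x, y) for x in range(64) for y in range(64)]
print(round(box_counting_dimension(square, 64), 2))  # prints 2.0
```

Normal and abnormal tissue textures tend to yield different dimension estimates, which is what makes such features usable for classification.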

Relevance:

100.00%

Publisher:

Abstract:

Scene classification based on latent Dirichlet allocation (LDA), a generalization of the bag-of-visual-words modeling method, depends on the construction of a visual vocabulary: the quantization step is crucial to the success of the classification. A framework is developed with the following new aspects: Gaussian-mixture clustering for the quantization process; the use of an integrated visual vocabulary (IVV), built as the union of all centroids obtained from the separate quantization process of each class; and the use of several features, including the edge orientation histogram, CIELab color moments, and the gray-level co-occurrence matrix (GLCM). The experiments are conducted on IKONOS images with six semantic classes (tree, grassland, residential, commercial/industrial, road, and water). The results show that the use of an IVV increases the overall accuracy (OA) by 11 to 12% when implemented on the selected features and by 6% on all features. The selected features of CIELab color moments and GLCM together provide a better OA than either feature individually, each of which increases the OA by only ∼2 to 3%. Moreover, the OA of LDA outperforms that of C4.5 and naive Bayes tree by ∼20%. © 2014 Society of Photo-Optical Instrumentation Engineers (SPIE) [DOI: 10.1117/1.JRS.8.083690]
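The IVV construction amounts to clustering each class's descriptors separately and taking the union of the resulting centroids as the vocabulary. As a simplified sketch, plain k-means stands in here for the Gaussian-mixture clustering used in the paper, on hypothetical 2-D toy descriptors:

```python
import random

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(points):
    n = len(points)
    return tuple(sum(c) / n for c in zip(*points))

def kmeans(points, k, iters=20, seed=0):
    """Lloyd's k-means; a simpler stand-in for the Gaussian-mixture
    clustering used in the paper's quantization step."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[nearest].append(p)
        centroids = [mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

def quantize(descriptor, vocabulary):
    """Map a descriptor to the index of its nearest visual word."""
    return min(range(len(vocabulary)), key=lambda i: dist2(descriptor, vocabulary[i]))

# Toy descriptors for two classes; the IVV is the union of the
# per-class centroids.
class_a = [(0.0, 0.1), (0.2, 0.0), (1.0, 1.1), (0.9, 1.0)]
class_b = [(4.0, 4.1), (4.2, 4.0), (5.0, 5.1), (5.1, 4.9)]
ivv = kmeans(class_a, 2) + kmeans(class_b, 2)
print(len(ivv), quantize((0.1, 0.0), ivv))
```

Each image is then represented by the histogram of its quantized descriptors, which is what LDA consumes as a bag of visual words.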

Relevance:

100.00%

Publisher:

Abstract:

As a new modeling method, support vector regression (SVR) has been regarded as a state-of-the-art technique for regression and approximation. In this study, SVR models were introduced and developed to predict body- and carcass-related characteristics of 2 strains of broiler chicken. To evaluate the prediction ability of the SVR models, we compared their performance with that of neural network (NN) models. Evaluation of prediction accuracy was based on R², MS error, and bias. The variables of interest as model outputs were BW, empty BW, and carcass, breast, drumstick, thigh, and wing weight in the Ross and Cobb strains, predicted from dietary nutrient intake, including ME (kcal/bird per week) and CP, TSAA, and Lys (all in grams per bird per week). A data set composed of 64 measurements from each strain was used for this analysis: 44 data lines were used for model training, and the remaining 20 were used to test the created models. The results of this study revealed that it is possible to satisfactorily estimate the BW and carcass parts of broiler chickens from their dietary nutrient intake. Through the statistical criteria used to evaluate the performance of the SVR and NN models, the overall results demonstrate that the discussed models can be effective for accurate prediction of the body- and carcass-related characteristics investigated here. However, the SVR method achieved better accuracy and generalization than the NN method, which indicates that this data mining technique (the SVR model) can be used as an alternative modeling tool to NN models. Further re-evaluation of the algorithm is nevertheless suggested.
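The three evaluation criteria named above (R², MS error and bias) are simple to compute from observed and predicted values; minimal implementations follow, with hypothetical observation/prediction pairs:

```python
def mse(y_true, y_pred):
    """Mean squared error."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def bias(y_true, y_pred):
    """Mean signed error: positive values indicate over-prediction."""
    return sum(p - t for t, p in zip(y_true, y_pred)) / len(y_true)

def r_squared(y_true, y_pred):
    """Coefficient of determination."""
    m = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - m) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Hypothetical body-weight observations (g) vs. model predictions:
obs = [510.0, 945.0, 1480.0, 2010.0]
pred = [500.0, 960.0, 1450.0, 2050.0]
print(r_squared(obs, pred), mse(obs, pred), bias(obs, pred))
```

Comparing two models on the held-out 20 test lines would simply mean evaluating these three functions on each model's predictions.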

Relevance:

100.00%

Publisher:

Abstract:

Indigo carmine forms stable complexes with different ions; the stability constants of the complexes were evaluated as log K equal to 5.75, 5.00, 4.89 and 3.89 for the complexes with Cu(II), Ni(II), Co(II) and Zn(II) ions, respectively, in 0.1 mol L-1 carbonate buffer solution at pH 10. The interaction between Cu(II) ions and indigo carmine (IC) in alkaline medium resulted in the formation of the Cu2(IC) complex, measured by the spectrophotometric method, with a 2:1 metal-to-ligand stoichiometric ratio. The reported method has also been successfully tested for the determination of copper in pharmaceutical compounds based on copper gluconate, without pre-treatment.
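A stability constant of this magnitude translates directly into a degree of complexation at given analytical concentrations. For simplicity the sketch below solves the 1:1 case M + L ⇌ ML with K = [ML]/([M][L]); the 2:1 Cu2(IC) stoichiometry reported above leads to a cubic instead, and the concentrations used here are hypothetical:

```python
import math

def fraction_complexed(log_k, c_m, c_l):
    """Fraction of total metal bound in a 1:1 complex M + L = ML with
    stability constant K = 10**log_k, for total (analytical)
    concentrations c_m and c_l in mol/L. Mass balance gives the
    quadratic K*x**2 - (K*(c_m + c_l) + 1)*x + K*c_m*c_l = 0 in the
    equilibrium complex concentration x; the physical root is the
    smaller one."""
    k = 10.0 ** log_k
    b = k * (c_m + c_l) + 1.0
    x = (b - math.sqrt(b * b - 4.0 * k * k * c_m * c_l)) / (2.0 * k)
    return x / c_m

# With log K = 5.75 (the value reported above for Cu(II), reused here
# for a 1:1 illustration) and 1e-4 mol/L of each component:
print(round(fraction_complexed(5.75, 1e-4, 1e-4), 3))
```

Such estimates indicate how far toward completion the complexation runs under the buffer conditions of the spectrophotometric measurement.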

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

The continuous advance of the Brazilian economy and increased competition in the heavy-equipment market point to the need for accurate sales forecasting processes, which allow optimized strategic planning and therefore better overall results. The sales forecasting process thus deserves to be studied and understood, since it plays a key role in corporate strategic planning. Accurate forecasting methods enable companies to circumvent management difficulties and variations in finished-goods inventory, making them more competitive. By analyzing the stages of sales forecasting, it was possible to observe that the process is methodical, bureaucratic, and demands substantial training of its managers and professionals. In this paper we applied the modeling method and the selection process proposed by Armstrong to choose the most appropriate technique for two products of a heavy-equipment manufacturer; through this method, the triple exponential smoothing technique was chosen for both products. The forecasts obtained with the triple exponential smoothing technique were better than those prepared by the industry experts.
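Triple exponential smoothing maintains a level, a trend and a seasonal component, each updated with its own smoothing constant. A minimal additive (Holt-Winters) sketch follows; the series, season length and smoothing constants are illustrative, not the values used in the paper:

```python
def holt_winters(series, season_len, alpha=0.5, beta=0.3, gamma=0.2, horizon=4):
    """Additive triple (Holt-Winters) exponential smoothing.
    Requires len(series) >= 2 * season_len; returns a horizon-step-ahead
    forecast list."""
    # Initialize level, trend and seasonal indices from the first seasons.
    level = sum(series[:season_len]) / season_len
    trend = (sum(series[season_len:2 * season_len])
             - sum(series[:season_len])) / season_len ** 2
    season = [series[i] - level for i in range(season_len)]
    for i in range(season_len, len(series)):
        s = season[i % season_len]
        last_level = level
        level = alpha * (series[i] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[i % season_len] = gamma * (series[i] - level) + (1 - gamma) * s
    n = len(series)
    return [level + (h + 1) * trend + season[(n + h) % season_len]
            for h in range(horizon)]

# Hypothetical quarterly demand over three years (trend + seasonality):
demand = [10.0, 14.0, 8.0, 12.0, 14.0, 18.0, 12.0, 16.0, 18.0, 22.0, 16.0, 20.0]
print(holt_winters(demand, 4))
```

The initialization shown (first-season mean for the level, season-over-season difference for the trend) is one common simple choice; production forecasting tools often fit the smoothing constants by minimizing in-sample error.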

Relevance:

100.00%

Publisher:

Abstract:

The transmission system is responsible for connecting power generators to consumers safely and reliably, and its constant expansion is necessary to transport increasing amounts of electricity. To help power-systems engineers, an optimization tool for the expansion of the transmission system was developed, combining linearized (DC) load-flow modeling with a genetic algorithm. The tool was designed to simulate the impact of different scenarios on the cost of transmission expansion. It was used to simulate the effects of distributed generation on the expansion of a fictitious transmission system, revealing a clear downward trend in the investment required for expansion as the level of distributed generation increases.
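In such tools each chromosome encodes which candidate lines to build, and the fitness combines construction cost with a penalty for infeasible plans. The sketch below is a deliberately tiny stand-in: a real planner would evaluate each plan with a linearized DC load flow, whereas here a crude total-transfer-capacity check replaces it, and all costs, capacities and the demand figure are hypothetical:

```python
import random

# Candidate corridors: (construction cost, transfer capacity in MW).
CANDIDATES = [(50, 100), (45, 80), (30, 60), (28, 40)]
DEMAND = 150  # MW to be delivered to the load area

def cost(plan):
    """Build cost plus a heavy penalty for unserved demand.
    (A real tool would run a linearized DC load flow per plan; summed
    transfer capacity is a crude surrogate here.)"""
    build = sum(c for bit, (c, _) in zip(plan, CANDIDATES) if bit)
    cap = sum(k for bit, (_, k) in zip(plan, CANDIDATES) if bit)
    return build + 10 * max(0, DEMAND - cap)  # penalty per unserved MW

def evolve(pop_size=8, generations=30, seed=1):
    rng = random.Random(seed)
    n = len(CANDIDATES)
    # Seed the population with the build-everything plan plus random plans.
    pop = [[1] * n] + [[rng.randint(0, 1) for _ in range(n)]
                       for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[:pop_size // 2]          # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)          # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n)] ^= 1       # single-bit mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

best = evolve()
print(best, cost(best))
```

Because the best individual always survives into the next generation, the returned plan can never be worse than the build-everything baseline.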

Relevance:

100.00%

Publisher:

Abstract:

ABSTRACT (Italian) With growing attention to the safety of existing bridges and viaducts in the Netherlands, the aim of this thesis is to study, through Finite Element modeling and continuous comparison with experimental results, the in-service response of elements that make up such infrastructure, namely reinforced concrete slabs loaded by concentrated loads. These elements are characterized by shear-dominated behavior and failure, whose modeling is, from a computational point of view, a rather hard challenge because of their brittle behavior combined with various three-dimensional effects. The thesis focuses on the use of Sequentially Linear Analysis (SLA), a Finite Element solution method that is an alternative to the classical incremental and iterative approaches. The advantage of SLA is that it avoids the well-known convergence problems typical of non-linear analyses by directly specifying the damage increment on the finite element, through the reduction of stiffness and strength in that particular element, instead of a load or displacement increment. The comparison between the results of two laboratory tests on reinforced concrete slabs and those of SLA demonstrated in both cases the robustness of the method, in terms of the accuracy of the load-displacement diagrams, the distribution of stresses and strains, and the representation of the crack pattern and of the shear failure mechanisms. Several variations of the most important model parameters were carried out, highlighting the strong influence on the solutions of the fracture energy and of the model chosen for the reduction of the shear modulus.
Finally, a comparison was made between SLA and the non-linear Newton-Raphson method, which shows the greater reliability of SLA in the evaluation of ultimate loads and displacements, together with a significant reduction in computational time.
ABSTRACT (English) With increasing attention to the assessment of safety in existing Dutch bridges and viaducts, the aim of the present thesis is to study, through Finite Element modeling and continuous comparison with experimental results, the real response of elements that compose these infrastructures, i.e. reinforced concrete slabs subjected to concentrated loads. These elements are characterized by shear behavior and failure, whose modeling is, from a computational point of view, a hard challenge due to their brittle behavior combined with various 3D effects. The thesis is focused on the use of Sequentially Linear Analysis (SLA), an alternative solution technique to classical non-linear Finite Element analyses, which are based on incremental and iterative approaches. The advantage of SLA is to avoid the well-known convergence problems of non-linear analyses by directly specifying a damage increment, in terms of a reduction of stiffness and strength in a particular finite element, instead of a load or displacement increment. The comparison between the results of two laboratory tests on reinforced concrete slabs and those obtained by SLA has shown in both cases the robustness of the method, in terms of the accuracy of the load-displacement diagrams, the distribution of stress and strain, and the representation of the cracking pattern and of the shear failure mechanisms. Variations of the most important parameters have been performed, pointing out the strong influence on the solutions of the fracture energy and of the chosen shear retention model.
Finally, a comparison between SLA and the non-linear Newton-Raphson method has been carried out, showing the better reliability of SLA in the evaluation of ultimate loads and displacements, together with a significant reduction of computational times.
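The saw-tooth idea behind SLA can be shown on a toy structure: parallel springs under a common displacement, where each cycle runs one linear analysis, scales it so the critical spring just reaches its current strength, and then reduces that spring's stiffness and strength instead of iterating a non-linear solution. This is a minimal illustration of the scheme, not the slab model from the thesis:

```python
def sequentially_linear_analysis(stiffness, strength,
                                 reduction=3.0, drop_limit=1e-3):
    """Saw-tooth SLA on a toy 'fiber bundle' of parallel linear springs
    sharing one displacement. Each cycle: linear analysis, scaling to
    the critical spring's current strength, then a saw-tooth reduction
    of that spring's stiffness and strength. Returns the piecewise-linear
    (displacement, total load) history."""
    k = list(stiffness)
    f = list(strength)
    history = []
    while any(ki > drop_limit for ki in k):
        # Critical spring: smallest displacement at which strength is hit.
        j = min((i for i in range(len(k)) if k[i] > drop_limit),
                key=lambda i: f[i] / k[i])
        u = f[j] / k[j]                   # scaled 'load step'
        history.append((u, u * sum(k)))   # total reaction at that step
        k[j] /= reduction                 # saw-tooth law
        f[j] /= reduction
        if k[j] <= drop_limit:
            k[j] = 0.0                    # spring fully damaged
    return history

# Two springs: the peak load is reached, then the response softens
# without any non-linear iteration.
hist = sequentially_linear_analysis([10.0, 5.0], [2.0, 3.0])
print(hist[:4])
```

Because every step is a scaled linear solve, the procedure cannot fail to converge: the brittle post-peak branch emerges as a sequence of secant states, which is exactly the property exploited for the shear-critical slabs above.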

Relevance:

100.00%

Publisher:

Abstract:

Nature leads, we follow; and nanotechnologists are in hot pursuit, designing controllable structures that can mimic naturally occurring and artificially synthesized materials on a common platform. Supramolecular chemistry concerns the investigation of nature's principles to produce fascinating complex and functional molecular assemblies, as well as the utilization of these principles to generate novel devices and materials, potentially useful for sensing, catalysis, transport and other applications in medical or engineering science. The work presented in this thesis is a compilation of different synthetic methods to achieve inorganic-organic hybrid nanomaterials. Silicatein, a protein enzyme that acts both as a catalyst and as a template for the formation of silica needles in marine sponges, has been used for the biosynthesis of semiconductor metal oxides on surfaces. Silicatein was immobilized on gold (111) surfaces using an alkane thiol, and a novel self-assembly of NTA on top of a "cushion" of reactive ester polymer was also successfully employed to make functionalized surfaces. The immobilization of silicatein on surfaces was monitored by surface plasmon spectroscopy, atomic force microscopy and confocal laser scanning microscopy. Surface-bound silicatein retains its biocatalytic activity, which was demonstrated by monitoring its hydrolytic activity in catalyzing the synthesis of biosilica, biotitania and biozirconia; the synthesized semiconductor metal oxides were characterized by scanning electron microscopy. The same hydrolytic biocatalyst was used to synthesize gold nanoparticles: these form by the reduction of tetrachloroaurate, AuCl4-, by sulfhydryl groups hidden below the surface groups of the protein. The resulting gold nanoparticles, stabilized by surface-bound silicatein, further aggregate to form Au nanocrystals.
The shape of the nanocrystals obtained with recombinant silicatein is controlled through chiral induction by the protein during nucleation of the nanocrystals. As an extension of this work, TiO2 nanowires were functionalized using a polymeric ligand that incorporates a nitrilotriacetic acid (NTA) linker in its backbone to immobilize His-tagged silicatein onto the TiO2 nanowires. The surface-bound protein not only retains its original hydrolytic properties, but also acts as a reductant for AuCl4- in the synthesis of hybrid TiO2/silicatein/Au nanocomposites. Functionalized, monocrystalline rutile TiO2 nanorods were prepared from TiCl4 in aqueous solution in the presence of dopamine. The surface-bound organic ligand controls the morphology as well as the crystallinity and phase selection of TiO2. The surface amine groups can be further tailored with functional molecules such as dyes; as an example, this surface functionality was used for the covalent binding of a fluorescent dye, 4-chloro-7-nitrobenzofurazan (NBD), to the TiO2 nanorods. The polymeric ligands have been used successfully for the in-situ and post-functionalization of TiO2 nanoparticles. In addition to the chelating dopamine anchor group, the multifunctional ligand system presented here incorporates a modifier molecule that allows the binding of functional molecules (here the dyes pyrene, NBD and Texas Red) as well as additional entities that allow the solubility of the inorganic nanocrystals in different solvents to be tailored. A novel method for the surface functionalization of fullerene-type MoS2 nanoparticles, and for subsequently binding these nanoparticles onto TiO2 nanowires, is reported using polymeric ligands: the procedure involves the complexation of IF-MoS2 with Ni2+ via an umbrella-type nitrilotriacetic acid (NTA) group, and anchoring them to the sidewalls of TiO2 nanowires via the hydroxyl groups of the dopamine present in the backbone of the polymeric ligand.
A convenient method for the synthesis of Au/CdS nanocomposites is also presented, achieved through a novel thiol functionalization of gold colloids. Finally, the thermodynamically most stable cubic phase of ZrO2 was obtained at a much lower temperature (180 °C); these nanoparticles are highly blue-fluorescent, with a high surface area.