981 results for "Computational algorithm"


Relevance: 20.00%

Abstract:

Although approximately 50% of Down syndrome (DS) patients have heart abnormalities, they appear protected against cardiac abnormalities related to the connective tissue, showing, for example, a lower risk of coronary artery disease. A recent study reported the case of a person affected by DS who carried mutations in FBN1, the gene that causes the connective tissue disorder Marfan syndrome (MFS). The fact that the person did not have any cardiac alterations suggested a compensatory effect of DS. This observation is supported by a previous DS meta-analysis at the molecular level in which we found an overall upregulation of FBN1 (which is usually downregulated in MFS). Additionally, that result was cross-validated with independent expression data from DS heart tissue. The aim of this work is to elucidate the role of FBN1 in DS and to establish a molecular link to MFS and MFS-related syndromes using a computational approach. To that end, we applied different analytical approaches to two DS datasets (our previous meta-analysis and independent expression data from DS heart tissue) and revealed expression alterations in the FBN1 interaction network, in FBN1 co-expressed genes, and in FBN1-related pathways. After merging the significant results from the different datasets with a Bayesian approach, we prioritized 85 genes that were able to distinguish control from DS cases. We further found evidence that several of these genes (47%), such as FBN1, DCN, and COL1A2, are dysregulated in MFS and MFS-related diseases. Consequently, we encourage the scientific community to take FBN1 and its related network into account when studying DS cardiovascular characteristics.
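
The abstract leaves the Bayesian merging step at a high level; as a sketch of how per-dataset evidence for a gene can be pooled, the snippet below uses Fisher's method instead, a simpler frequentist alternative that combines one p-value per dataset (the p-values here are invented for illustration).

```python
import math

def fisher_combine(pvalues):
    """Combine independent p-values with Fisher's method:
    X = -2 * sum(log p_i) follows a chi-squared law with 2k dof."""
    k = len(pvalues)
    y = -sum(math.log(p) for p in pvalues)   # = X / 2
    # Survival function of chi2 with 2k dof is the regularized upper
    # incomplete gamma Q(k, y); for integer k: Q = exp(-y) * sum y^i / i!
    term, q = math.exp(-y), 0.0
    for i in range(k):
        q += term
        term *= y / (i + 1)
    return q

# E.g. a gene moderately significant in two independent DS datasets:
print(fisher_combine([0.05, 0.05]))   # stronger evidence than either alone
```

A single p-value is returned unchanged, which is a convenient sanity check on the incomplete-gamma recursion.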


STUDY DESIGN: Retrospective database query to identify all anterior spinal approaches. OBJECTIVES: To assess all patients with pharyngo-cutaneous fistulas (PCF) after anterior cervical spine surgery (ACSS). SUMMARY OF BACKGROUND DATA: Patients treated at the University of Heidelberg Spine Medical Center, Spinal Cord Injury Unit and Department of Otolaryngology (Germany), between 2005 and 2011 with the diagnosis of PCF. METHODS: We conducted a retrospective study of 5 patients treated between 2005 and 2011 for PCF after ACSS, reviewing their management and outcome according to radiologic data and patient charts. RESULTS: Upon presentation, 4 patients were paraplegic. Two had PCF arising from one piriform sinus, two from the posterior pharyngeal wall and piriform sinus combined, and one from the posterior pharyngeal wall only. Two had undergone unsuccessful surgical repair elsewhere, and one had prior radiation therapy. In 3 patients, speech and swallowing were completely restored; 2 patients, both paraplegic, died. Patients needed an average of 2-3 procedures for complete functional recovery, consisting of primary closure with various vascularised regional flaps and refining laser procedures, supplemented with negative pressure wound therapy where needed. CONCLUSION: Based on our experience, we provide a treatment algorithm indicating that chronic, as opposed to acute, fistulas require primary surgical closure combined with a vascularised flap, accompanied by the immediate application of negative pressure wound therapy. We also conclude that, particularly in paraplegic patients suffering this complication, the risk of a fatal outcome is substantial.


Introduction: New evidence from randomized controlled trials and etiology-of-fever studies, the availability of reliable rapid diagnostic tests (RDT) for malaria, and novel technologies call for a revision of the IMCI strategy. We developed a new algorithm based on (i) a systematic review of published studies assessing the safety and appropriateness of RDT and antibiotic prescription, (ii) results from a clinical and microbiological investigation of febrile children aged <5 years, and (iii) international IMCI expert opinions. The aim of this study was to assess the safety of the new algorithm among patients in urban and rural areas of Tanzania. Materials and Methods: The design was a controlled noninferiority study. Enrolled children aged 2-59 months with any illness were managed either by a study clinician using the new Almanach algorithm (two intervention health facilities) or by clinicians using standard practice, including RDT (two control health facilities). At day 7 and day 14, all patients were reassessed. Patients who were ill in between or not cured at day 14 were followed until recovery or death. The primary outcome was the rate of complications; the secondary outcome was the rate of antibiotic prescriptions. Results: 1062 children were recruited. The main diagnoses were URTI (26%), pneumonia (19%) and gastroenteritis (9.4%). 98% (531/541) were cured at day 14 in the Almanach arm and 99.6% (519/521) in controls. The rate of secondary hospitalization was 0.2% in each arm. One death occurred in controls. None of the complications was due to withdrawal of antibiotics or antimalarials at day 0. The rate of antibiotic use was 19% in the Almanach arm and 84% in controls. Conclusion: Evidence suggests that the new algorithm, primarily aimed at the rational use of drugs, is as safe as standard practice and leads to a drastic reduction in antibiotic use. The Almanach is currently being tested for clinician adherence to the proposed procedures when used on paper or on a mobile phone.


Genetic variants influence the risk of developing certain diseases or give rise to differences in drug response. Recent progress in cost-effective, high-throughput genome-wide techniques, such as microarrays measuring Single Nucleotide Polymorphisms (SNPs), has facilitated genotyping of large clinical and population cohorts. Combining the massive genotypic data with measurements of phenotypic traits allows for the determination of genetic differences that explain, at least in part, the phenotypic variation within a population. So far, models combining the most significant variants can only explain a small fraction of the variance, indicating the limitations of current models. In particular, researchers have only begun to address the possibility of interactions between genotypes and the environment. Elucidating the contributions of such interactions is a difficult task because of the large number of genetic as well as possible environmental factors. In this thesis, I worked on several projects within this context. My first and main project was the identification of possible SNP-environment interactions, where the phenotypes were serum lipid levels of patients from the Swiss HIV Cohort Study (SHCS) treated with antiretroviral therapy. Here, the genotypes consisted of a limited set of SNPs in candidate genes relevant for lipid transport and metabolism, and the environmental variables were the specific combinations of drugs given to each patient over the treatment period. My work explored bioinformatic and statistical approaches to relate patients' lipid responses to these SNPs, drugs and, importantly, their interactions. The goal of this project was to improve our understanding of dyslipidemia, a well-known adverse drug reaction of antiretroviral therapy, and to explore the possibility of predicting it.
Specifically, I quantified how much of the variance in lipid profiles could be explained by the host genetic variants, the administered drugs and SNP-drug interactions, and assessed the predictive power of these features on lipid responses. Using cross-validation stratified by patients, we could not validate our hypothesis that models selecting a subset of SNP-drug interactions in a principled way have better predictive power than control models using "random" subsets. Nevertheless, all tested models containing SNP and/or drug terms exhibited significant predictive power (compared to a random predictor) and explained a sizable proportion of the variance in the patient-stratified cross-validation setting. Importantly, the model containing stepwise-selected SNP terms showed a higher capacity to predict triglyceride levels than a model containing randomly selected SNPs. Dyslipidemia is a complex trait for which many factors remain to be discovered; these factors are thus missing from the data, possibly explaining the limitations of our analysis. In particular, the interactions of drugs with SNPs selected from the set of candidate genes likely have small effect sizes, which we were unable to detect in a sample of the present size (<800 patients). In the second part of my thesis, I performed genome-wide association studies within the Cohorte Lausannoise (CoLaus). I was involved in several international projects to identify SNPs associated with various traits, such as serum calcium, body mass index, two-hour glucose levels, as well as metabolic syndrome and its components. These phenotypes are all related to major human health issues, such as cardiovascular disease. I applied statistical methods to detect new variants associated with these phenotypes, contributing to the identification of new genetic loci that may provide new insights into the genetic basis of these traits.
This kind of research will deepen our understanding of the mechanisms underlying these pathologies, improve the evaluation of disease risk, identify new therapeutic leads, and may ultimately enable the realization of "personalized" medicine.
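
A minimal numpy sketch of the patient-stratified cross-validation idea, on synthetic data with one hypothetical SNP, one drug and their interaction (none of the thesis's actual cohort, genes or models):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy cohort: 100 hypothetical patients, three lipid measurements each.
n = 300
patient = np.repeat(np.arange(100), 3)
snp = np.repeat(rng.integers(0, 3, size=100), 3)   # genotype 0/1/2, fixed per patient
drug = rng.integers(0, 2, size=n)                  # drug exposure per measurement
y = 0.5 * snp + 1.0 * drug + 1.5 * snp * drug + rng.normal(0.0, 0.5, size=n)

# Design matrix with an explicit SNP-drug interaction term.
X = np.column_stack([np.ones(n), snp, drug, snp * drug])

# Patient-stratified 5-fold CV: all samples of a patient share a fold,
# so predictive power is measured on unseen patients.
folds = patient % 5
preds = np.empty(n)
for f in range(5):
    test = folds == f
    beta, *_ = np.linalg.lstsq(X[~test], y[~test], rcond=None)
    preds[test] = X[test] @ beta

r2 = 1.0 - np.sum((y - preds) ** 2) / np.sum((y - y.mean()) ** 2)
print(round(float(r2), 2))
```

Because the toy data are generated by the fitted model family, the out-of-fold R² is high here; on real cohort data the explained fraction is far smaller, as the abstract notes.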


Networks are evolving toward a ubiquitous model in which heterogeneous devices are interconnected. Cryptographic algorithms are required to develop security solutions that protect network activity. However, the computational and energy limitations of network devices jeopardize the actual implementation of such mechanisms. In this paper, we perform a wide analysis of the costs of executing symmetric and asymmetric cryptographic algorithms, hash chain functions, elliptic curve cryptography and pairing-based cryptography on personal agendas, and compare them with the costs of basic operating system functions. Results show that although the power cost of cryptography is high, so that such operations should be used sparingly, it is not the main limiting factor of a device's autonomy.
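
The paper benchmarks personal agendas (PDAs) with their native libraries; as a generic stand-in, the sketch below microbenchmarks a cryptographic hash against a trivial operation in Python (message size and repetition count are arbitrary choices):

```python
import hashlib
import os
import time

def avg_cost(fn, reps=2000):
    """Average wall-clock time of fn() over reps runs, in microseconds."""
    t0 = time.perf_counter()
    for _ in range(reps):
        fn()
    return (time.perf_counter() - t0) / reps * 1e6

msg = os.urandom(1024)
baseline = avg_cost(lambda: len(msg))                      # trivial call
sha_cost = avg_cost(lambda: hashlib.sha256(msg).digest())  # crypto hash
print(f"sha256(1 KiB) costs about {sha_cost / baseline:.0f}x a trivial call")
```

Asymmetric primitives and pairings, as analyzed in the paper, are orders of magnitude more expensive again than a hash, which is why their invocation frequency matters most.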


Plants are essential for human society, because our daily food, construction materials and sustainable energy are derived from plant biomass. Yet, despite this importance, the multiple developmental aspects of plants are still poorly understood and represent a major challenge for science. With the emergence of high-throughput devices for genome sequencing and high-resolution imaging, data have never been so easy to collect, generating huge amounts of information. Computational analysis is one way to integrate those data and to reduce the apparent complexity towards an appropriate scale of abstraction, with the aim of eventually providing new answers and directing further research. This is the motivation behind this thesis work, i.e. the application of descriptive and predictive analytics combined with computational modeling to problems that revolve around morphogenesis at the subcellular and organ scales. One of the goals of this thesis is to elucidate how the auxin-brassinosteroid phytohormone interaction determines cell growth in the root apical meristem of Arabidopsis thaliana (Arabidopsis), the plant model of reference for molecular studies. The pertinent information about signaling protein relationships was gathered from the literature to reconstruct the entire hormonal crosstalk. Owing to a lack of quantitative information, we employed a qualitative modeling formalism. This work allowed us to confirm the synergistic effect of the hormonal crosstalk on cell elongation, to explain some of our paradoxical mutant phenotypes, and to predict a novel interaction between the BREVIS RADIX (BRX) protein and the transcription factor MONOPTEROS (MP), which turned out to be critical for the maintenance of the root meristem. On the same subcellular scale, another study in the monocot model Brachypodium distachyon (Brachypodium) revealed an alternative wiring of the auxin-ethylene crosstalk as compared to Arabidopsis.
In Arabidopsis, increasing interference with auxin biosynthesis results in progressively shorter roots. By contrast, a hypomorphic Brachypodium mutant isolated in this study, affected in an enzyme of the auxin biosynthesis pathway, displayed a dramatically longer seminal root. Our morphometric analysis confirmed that more anisotropic cells (thinner and longer) are principally responsible for the mutant root phenotype. Further characterization pointed towards an inverted regulatory logic in the relation between ethylene signaling and auxin biosynthesis in Brachypodium as compared to Arabidopsis, which explains the phenotypic discrepancy. Finally, the morphometric analysis of hypocotyl secondary growth applied in this study was performed with the image-processing pipeline of our quantitative histology method. During its secondary growth, the hypocotyl reorganizes its primary bilateral symmetry into a radial symmetry of highly specialized tissues, starting with a few dozen cells and eventually comprising several thousand. Such a scale only permits observations in thin cross-sections, severely hampering a comprehensive analysis of the morphodynamics involved. Our quantitative histology strategy overcomes this limitation. We acquired hypocotyl cross-sections from tiled high-resolution images and extracted their information content using custom high-throughput image processing and segmentation. Coupled with an automated cell-type recognition algorithm, it allows precise quantitative characterization of vascular development and reveals developmental patterns that were not evident from visual inspection, for example the steady interspacing of the phloem poles. Further analyses indicated a change in the growth anisotropy of cambial and phloem cells, which appeared in phase with the expansion of the xylem.
Combining genetic tools and computational modeling, we showed that the reorientation of the growth anisotropy axis of peripheral tissue layers only occurs when the growth rate of the central tissue is higher than that of the periphery. This was confirmed by the calculation of the ratio of xylem to phloem growth rates throughout secondary growth: high ratios are indeed observed, concomitant with the homogenization of cambium anisotropy. These results suggest a self-organization mechanism, promoted by a gradient of division in the cambium, that generates a pattern of mechanical stresses. This, in turn, reorients the growth anisotropy of peripheral tissues to sustain secondary growth.
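
The qualitative (logical) formalism mentioned above can be illustrated with a minimal synchronous Boolean network; the update rules below are invented placeholders loosely named after the abstract's players, not the thesis's actual auxin/BR wiring:

```python
# Hypothetical Boolean rules; inputs are held fixed, internal nodes update.
rules = {
    "auxin":  lambda s: s["auxin"],                # input, held fixed
    "BR":     lambda s: s["BR"],                   # input, held fixed
    "BRX":    lambda s: s["auxin"] and not s["BR"],
    "MP":     lambda s: s["auxin"] and s["BRX"],
    "growth": lambda s: s["MP"] or s["BR"],
}

def step(state):
    """One synchronous update of every node."""
    return {node: bool(rule(state)) for node, rule in rules.items()}

def attractor(state, max_steps=20):
    """Iterate until a fixed point is reached (None if we hit a cycle)."""
    for _ in range(max_steps):
        nxt = step(state)
        if nxt == state:
            return state
        state = nxt
    return None

start = dict.fromkeys(rules, False) | {"auxin": True}
print(attractor(start))
```

Even this toy version shows the appeal of the formalism: steady states (attractors) can be read off and compared with observed phenotypes without any kinetic parameters.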


The paper deals with the development and application of a methodology for automatic mapping of pollution/contamination data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve this problem. The automatic tuning of isotropic and anisotropic GRNN models using a cross-validation procedure is presented. Results are compared with a k-nearest-neighbours interpolation algorithm using an independent validation data set. The quality of the mapping is controlled through analysis of the raw data and of the residuals using variography. Maps of the probability of exceeding a given decision level and "thick" isoline visualizations of the uncertainties are presented as examples of decision-oriented mapping. A real case study is based on the mapping of radioactively contaminated territories.
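
A GRNN is essentially Nadaraya-Watson kernel regression whose single free parameter, the kernel width sigma, can be tuned automatically by cross-validation; below is a small numpy sketch with leave-one-out tuning on invented 1-D data (the paper works on 2-D spatial coordinates and also tunes anisotropic widths):

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    """GRNN prediction: Gaussian-kernel-weighted average of training
    targets (the Nadaraya-Watson estimator)."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

def tune_sigma(X, y, sigmas):
    """'Automatic tuning': pick the width minimizing leave-one-out error."""
    best, best_err = None, np.inf
    for s in sigmas:
        err = 0.0
        for i in range(len(X)):
            keep = np.arange(len(X)) != i
            pred = grnn_predict(X[keep], y[keep], X[i:i + 1], s)[0]
            err += (pred - y[i]) ** 2
        if err < best_err:
            best, best_err = s, err
    return best

X = np.linspace(0.0, 1.0, 40)[:, None]      # toy 1-D "coordinates"
y = np.sin(2.0 * np.pi * X[:, 0])           # toy "contamination" field
sigma = tune_sigma(X, y, [0.01, 0.05, 0.2, 1.0])
print(sigma)
```

An anisotropic variant simply replaces the scalar sigma with one width per coordinate inside `d2`.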


This paper presents a Genetic Algorithm (GA) for the problem of sequencing units in a mixed-model non-permutation flowshop. Resequencing is permitted where stations have access to intermittent or centralized resequencing buffers. Access to a buffer is restricted by the number of available buffer places and the physical size of the products.
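
As an illustration of the GA machinery on a toy version of the problem — distinct jobs of a few model types, order crossover, swap mutation, and a made-up objective that penalizes consecutive jobs of the same model (not the paper's buffer-constrained objective) — consider:

```python
import random

random.seed(7)

models = list("AAAABBBCC")      # nine jobs, by model type (toy instance)

def cost(perm):
    """Invented objective: count adjacent jobs of the same model."""
    return sum(models[i] == models[j] for i, j in zip(perm, perm[1:]))

def order_crossover(p1, p2):
    """OX: keep a slice of p1, fill the remaining genes in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    rest = [g for g in p2 if g not in p1[a:b]]
    return rest[:a] + p1[a:b] + rest[a:]

def mutate(perm, rate=0.2):
    """Swap two positions with probability `rate`."""
    if random.random() < rate:
        i, j = random.sample(range(len(perm)), 2)
        perm[i], perm[j] = perm[j], perm[i]
    return perm

pop = [random.sample(range(len(models)), len(models)) for _ in range(40)]
for _ in range(60):             # elitist generational loop
    pop.sort(key=cost)
    elite = pop[:10]
    pop = elite + [mutate(order_crossover(*random.sample(elite, 2)))
                   for _ in range(30)]

best = min(pop, key=cost)
print(cost(best), "".join(models[i] for i in best))
```

Order crossover is the standard choice here because, unlike one-point crossover, it always yields a valid permutation of the jobs.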


In this paper, the theory of hidden Markov models (HMM) is applied to the problem of blind (without training sequences) channel estimation and data detection. Within an HMM framework, the Baum-Welch (BW) identification algorithm is frequently used to find maximum-likelihood (ML) estimates of the corresponding model. However, such a procedure assumes the model (i.e., the channel response) to be static throughout the observation sequence. By introducing a parametric model for time-varying channel responses, a version of the algorithm that is more appropriate for mobile channels [time-dependent Baum-Welch (TDBW)] is derived. To compare algorithm behavior, a set of computer simulations for a GSM scenario is provided. Results indicate that, in comparison to other BW versions of the algorithm, the TDBW approach attains a remarkable enhancement in performance, at the cost of only a moderate increase in computational complexity.
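
The core of any BW variant is the forward recursion that scores a model against an observation sequence; a minimal scaled implementation for a discrete-output HMM (with made-up two-state parameters, not a GSM channel) looks like:

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    via the scaled forward algorithm (the E-step workhorse of Baum-Welch)."""
    alpha = pi * B[:, obs[0]]
    s = alpha.sum()
    loglik = np.log(s)
    alpha /= s
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate, then weight by emission
        s = alpha.sum()
        loglik += np.log(s)
        alpha /= s
    return loglik

pi = np.array([0.6, 0.4])               # initial state probabilities
A = np.array([[0.7, 0.3],               # state transition matrix
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],               # emission probabilities
              [0.2, 0.8]])
ll = forward_loglik([0, 1, 0], pi, A, B)
print(ll)                               # ≈ -2.217
```

The TDBW idea replaces the fixed `A`/`B` with parametric, time-indexed versions inside the same recursion.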


Abstract: One of the most important issues in molecular biology is to understand the regulatory mechanisms that control gene expression. Gene expression is often regulated by proteins called transcription factors, which bind to short (5 to 20 base pairs), degenerate segments of DNA. Experimental efforts towards understanding the sequence specificity of transcription factors are laborious and expensive, but can be substantially accelerated with the use of computational predictions. This thesis describes the use of algorithms and resources for transcription factor binding site analysis in quantitative modelling, where probabilistic models are built to represent the binding properties of a transcription factor and can be used to find new functional binding sites in genomes. Initially, an open-access database (HTPSELEX) was created, holding high-quality binding sequences for two eukaryotic families of transcription factors, namely CTF/NF1 and LEF1/TCF. The binding sequences were elucidated using a recently described experimental procedure called HTP-SELEX, which allows the generation of a large number (>1000) of binding sites using mass sequencing technology. For each HTP-SELEX experiment we also provide accurate primary experimental information about the protein material used, details of the wet-lab protocol, an archive of sequencing trace files, and assembled clone sequences of the binding sequences. The database also offers reasonably large SELEX libraries obtained with conventional low-throughput protocols. The database is available at http://wwwisrec.isb-sib.ch/htpselex/ and ftp://ftp.isrec.isb-sib.ch/pub/databases/htpselex. The Expectation-Maximisation (EM) algorithm is one of the most frequently used methods to estimate probabilistic models representing the sequence specificity of transcription factors.
We present computer simulations to estimate the precision of EM-estimated models as a function of data set parameters (such as the length of the initial sequences, the number of initial sequences, and the percentage of non-binding sequences). We observed a remarkable robustness of the EM algorithm with regard to the length of the training sequences and the degree of contamination. The HTPSELEX database and the benchmark results of the EM algorithm formed part of the foundation for the subsequent project, in which a statistical framework based on hidden Markov models was developed to represent the sequence specificity of the transcription factors CTF/NF1 and LEF1/TCF using the HTP-SELEX data. The hidden Markov model framework is capable of both predicting and classifying CTF/NF1 and LEF1/TCF binding sites. A covariance analysis of the binding sites revealed non-independent base preferences at different nucleotide positions, providing insight into the binding mechanism. We next tested the LEF1/TCF model by computing binding scores for a set of LEF1/TCF binding sequences for which relative affinities had been determined experimentally using non-linear regression. The predicted and experimentally determined binding affinities correlated well.
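
Probabilistic models of this kind boil down, in the simplest position-independent case, to a position weight matrix; the sketch below estimates one from a handful of invented sites (not real CTF/NF1 or LEF1/TCF data) and scores candidate sequences with it. The HMM used in the thesis generalizes this by allowing, e.g., position dependencies and variable-length motifs:

```python
import math

sites = ["TTGGC", "TTGGA", "TTGAC", "CTGGC", "TTGGC"]   # toy aligned sites
bases = "ACGT"

# Per-position base counts with a 0.25 pseudocount to avoid log(0).
counts = [{b: 0.25 for b in bases} for _ in sites[0]]
for site in sites:
    for pos, b in enumerate(site):
        counts[pos][b] += 1

def pwm_score(seq, background=0.25):
    """Sum over positions of log2( p(base | position) / background )."""
    total = sum(counts[0].values())     # same total at every position
    return sum(math.log2(counts[pos][b] / total / background)
               for pos, b in enumerate(seq))

print(pwm_score("TTGGC"), pwm_score("AAAAA"))   # consensus scores high
```

Scanning a genome with such a model means sliding this score over every window and keeping those above a threshold.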


This paper addresses the estimation of the code-phase (pseudorange) and the carrier-phase of the direct signal received from a direct-sequence spread-spectrum satellite transmitter. The signal is received by an antenna array in a scenario with interference and multipath propagation. These two effects are generally the limiting error sources in most high-precision positioning applications. A new estimator of the code- and carrier-phases is derived by using a simplified signal model and the maximum likelihood (ML) principle. The simplified model consists essentially of gathering all signals, except for the direct one, in a component with unknown spatial correlation. The estimator exploits the knowledge of the direction-of-arrival of the direct signal and is much simpler than other estimators derived under more detailed signal models. Moreover, we present an iterative algorithm that is adequate for a practical implementation and explores an interesting link between the ML estimator and a hybrid beamformer. The mean squared error and bias of the new estimator are computed for a number of scenarios and compared with those of other methods. The presented estimator and the hybrid beamformer outperform the existing techniques of comparable complexity and attain, in many situations, the Cramér-Rao lower bound of the problem at hand.
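
The simplest relative of code-phase estimation — far from the paper's array/ML setting, but showing the underlying idea — is recovering an integer chip delay as the peak of a cross-correlation with the known spreading code (all parameters below are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# A +/-1 spreading code is received with an unknown circular chip delay
# plus noise; the delay is the argmax of the circular cross-correlation.
code = rng.choice([-1.0, 1.0], size=127)
true_delay = 37
rx = np.roll(code, true_delay) + rng.normal(0.0, 0.5, code.size)

corr = np.array([np.dot(np.roll(code, d), rx) for d in range(code.size)])
est_delay = int(np.argmax(corr))
print(est_delay)
```

Multipath and interference corrupt exactly this correlation peak, which is why the paper's estimator brings in the array geometry and the direction-of-arrival of the direct signal.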


Centrifugal compressors are widely used in, for example, the process industry, the oil and gas industry, small gas turbines and turbochargers. In order to achieve lower energy consumption and operating costs, the efficiency of the compressor needs to be improved. In the present work, different pinches and low solidity vaned diffusers were utilized to improve the efficiency of a medium-size centrifugal compressor. In this study, pinch means a decrease in the diffuser flow passage height. First, different geometries were analyzed using computational fluid dynamics. The flow fields were solved with Finflo, a Navier-Stokes solver capable of handling compressible, incompressible, steady and unsteady flows; Chien's k-epsilon turbulence model was used. One of the numerically investigated pinched diffusers and one low solidity vaned diffuser were studied experimentally. The overall performance of the compressor and the static pressure distribution before and after the diffuser were measured. The flow entering and leaving the diffuser was measured using a three-hole Cobra probe and Kiel probes. Both the pinch and the low solidity vaned diffuser increased the efficiency of the compressor. The highest isentropic efficiency increment obtained was 3% of the design isentropic efficiency of the original geometry. The numerical results indicated that a pinch applied to both the hub and the shroud wall was most beneficial to the operation of the compressor, and that a pinch on the hub was better than a pinch on the shroud. The pinch did not affect the operating range of the compressor, but the low solidity vaned diffuser slightly decreased it.
The unsteady phenomena in the vaneless diffuser were studied experimentally and numerically: the unsteady static pressure was measured at the diffuser inlet and outlet, and a time-accurate numerical simulation was conducted. The unsteady static pressure showed that most of the pressure variation lies at the passing frequency of every second blade. The pressure variations did not vanish in the diffuser and were still visible at the diffuser outlet, although their amplitude decreased through the diffuser. The time-accurate calculations showed quite good agreement with the measured data. Agreement was very good at the design operating point, even though the computational grid was not dense enough in the volute and in the exit cone. The time-accurate calculation over-predicted the amplitude of the pressure variations at high flow.


Localization, the ability of a mobile robot to estimate its position within its environment, is a key capability for the autonomous operation of any mobile robot. This thesis presents a system for coarse, global indoor localization of a mobile robot based on visual information. The system is based on image matching and uses SIFT features as natural landmarks. Features extracted from training images are stored in a database for later use in localization. During localization, an image of the scene is captured using the robot's on-board camera, features are extracted from the image, and the best match is searched for in the database. Feature matching is done using the k-d tree algorithm. Experimental results showed that localization accuracy increases with the number of features in the training database, while, on the other hand, an increasing number of features tends to increase the computational time. For some parts of the environment the error rate was relatively high, due to a strong correlation of features taken from those places with features from elsewhere in the environment.
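
The database query step can be sketched as nearest-neighbour descriptor matching with a ratio test; a k-d tree (as in the thesis) accelerates exactly this query, but brute force on random stand-in descriptors keeps the sketch short (database, labels and thresholds are all invented):

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in training database: 100 unit-norm 128-D descriptors (SIFT-like),
# each labelled with the place it was observed from.
db = rng.normal(size=(100, 128))
db /= np.linalg.norm(db, axis=1, keepdims=True)
labels = rng.integers(0, 8, size=100)   # 8 hypothetical places

def match(desc, ratio=0.8):
    """Nearest neighbour with a distance-ratio test; None means ambiguous."""
    d = np.linalg.norm(db - desc, axis=1)
    i1, i2 = np.argsort(d)[:2]
    return labels[i1] if d[i1] < ratio * d[i2] else None

# A query descriptor: a slightly noisy copy of database entry 42.
query = db[42] + rng.normal(scale=0.02, size=128)
print(match(query))
```

The ratio test rejects exactly the ambiguous case the abstract reports: a feature whose two best database matches come from different, visually similar places.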


The key parameters associated with the thermally induced spin crossover process have been calculated for a series of Fe(II) complexes with mono-, bi-, and tridentate ligands. Combining density functional theory calculations for the geometries and normal vibrational modes with highly correlated wave function methods for the energies allows us to accurately compute the entropy variation associated with the spin transition and the zero-point-corrected energy difference between the low- and high-spin states. From these values, the transition temperature, T1/2, is estimated for the different compounds.
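
At T1/2 the low- and high-spin states are equally populated, i.e. ΔG = ΔH − TΔS = 0, so the two computed quantities give T1/2 = ΔH/ΔS directly; a one-line check with illustrative numbers (not values from the paper):

```python
# Zero-point-corrected LS->HS energy gap and entropy gain (illustrative).
delta_H = 12_000.0      # J/mol
delta_S = 60.0          # J/(mol K)

T_half = delta_H / delta_S   # temperature where Delta_G = 0
print(T_half)                # → 200.0 K
```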


In many applications it is very important to reduce the effect of the light source in order to perceive the true colour of an object. This is needed, for example, in virtual museums, telemedicine, e-commerce and electronic money. In this thesis, techniques were developed for removing specular highlights from spectral images. The work includes an overview of general colour image understanding, on the basis of which different highlight-removal techniques were analysed. A new highlight-removal method was developed, based on the dichromatic reflection model, which describes spectral data in terms of the object's own colour and the colour of the illuminating light. The proposed highlight-removal method makes use of several existing techniques, such as principal component analysis and a data classification method. The attempt to develop a fast algorithm that also performs the task well was successful. Experiments were carried out with the proposed method, and the working algorithm produced the desired results. The work also includes suggestions for improving the presented algorithm.