921 results for visualisation formalism


Relevance: 10.00%

Abstract:

Coronary artery magnetic resonance imaging (MRI) has the potential to provide the cardiologist with relevant diagnostic information on the state of coronary artery disease in patients. The major challenge of cardiac MRI, though, is dealing with all the sources of motion that can corrupt the images and degrade the diagnostic information they provide. This thesis therefore focused on the development of new MRI techniques that change the standard approach to cardiac motion compensation in order to increase the efficiency of cardiovascular MRI, to provide more flexibility and robustness, and to deliver new temporal and tissue information. The proposed approaches help advance coronary magnetic resonance angiography (MRA) toward an easy-to-use, multipurpose tool that can be translated to the clinical environment.

The first part of the thesis studied coronary artery motion in patients using the gold-standard imaging technique (x-ray angiography), in order to measure the precision with which the coronary arteries return to the same position beat after beat (coronary artery repositioning). We found that intervals with minimal coronary artery repositioning occur in peak systole and in mid-diastole, and we responded with a new pulse sequence (T2-post) able to provide peak-systolic imaging. This sequence was tested in healthy volunteers: image quality comparisons showed that the proposed approach provides coronary artery visualization and contrast-to-noise ratio (CNR) comparable with the standard acquisition approach, but with increased signal-to-noise ratio (SNR).

The second part of the thesis explored a completely new paradigm for whole-heart cardiovascular MRI. The proposed technique acquires data continuously (free-running) instead of being ECG-triggered, thus significantly increasing the efficiency of the acquisition and providing four-dimensional (4D) images of the whole heart in all cardiac phases, while respiratory self-navigation allows the scan to be performed in free breathing. This enabling technology allows anatomical and functional evaluation in four dimensions, with high spatial and temporal resolution and without the need for contrast agent injection. The enabling step is the use of a golden-angle-based 3D radial trajectory, which allows continuous sampling of k-space and retrospective selection of the timing parameters of the reconstructed dataset. The free-running 4D acquisition was then combined with a compressed sensing reconstruction algorithm that further increases the temporal resolution of the 4D dataset while improving overall image quality by removing undersampling artifacts. The resulting 4D images provide visualization of the whole coronary artery tree in each phase of the cardiac cycle and, at the same time, allow assessment of cardiac function from a single free-breathing scan. The coronary artery image quality provided by the frames of the free-running 4D acquisition is in line with that of the standard ECG-triggered 3D acquisition performed in diastole, and the cardiac function measurements matched those obtained with the gold-standard stack of 2D cine acquisitions.

Finally, the last part of the thesis focused on the development of an ultrashort echo time (UTE) acquisition scheme for in vivo detection of calcification in the coronary arteries. Recent studies showed that UTE imaging can visualize coronary artery plaque calcification ex vivo, since it detects the short-T2 components of the calcification. Cardiac motion, though, has so far prevented this technique from being applied in vivo. An ECG-triggered, self-navigated 3D radial triple-echo UTE acquisition was therefore developed and tested in healthy volunteers. The proposed sequence combines a 3D self-navigation approach with a 3D radial UTE acquisition, enabling data collection during free breathing. Three echoes are acquired to extract the short-T2 components of the calcification, while a water-fat separation technique allows proper visualization of the coronary arteries. Even though the results are still preliminary, the proposed sequence shows great potential for the in vivo visualization of coronary artery calcification.

In conclusion, this thesis presents three novel MRI approaches aimed at improved characterization and assessment of atherosclerotic coronary artery disease. These approaches provide new anatomical and functional information in four dimensions and support tissue characterization of coronary artery plaques.
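As a rough, hypothetical illustration of the retrospective reordering described above (not the thesis implementation, which uses a 3D trajectory; this sketch is a 2D golden-angle analogue with made-up names):

```python
import numpy as np

# 2D golden-angle increment (~111.25 deg): successive spokes never repeat,
# so any retrospectively chosen subset still covers k-space fairly uniformly.
GOLDEN_ANGLE = np.pi * (np.sqrt(5) - 1) / 2

def spoke_angles(n_readouts):
    """Angle of each radial spoke under continuous (free-running) acquisition."""
    return (np.arange(n_readouts) * GOLDEN_ANGLE) % np.pi

def bin_by_cardiac_phase(readout_times, trigger_times, n_phases):
    """Assign each readout to a cardiac phase *after* the scan.

    Assumes every readout falls between two recorded ECG trigger times.
    The number of phases (temporal resolution) is a free a-posteriori choice.
    """
    trig = np.asarray(trigger_times)
    phases = np.empty(len(readout_times), dtype=int)
    for i, t in enumerate(readout_times):
        k = np.searchsorted(trig, t) - 1      # index of the preceding R-wave
        rr = trig[k + 1] - trig[k]            # local RR interval
        phases[i] = int((t - trig[k]) / rr * n_phases)
    return phases
```

Each bin then reconstructs one frame of the 4D dataset; the compressed sensing reconstruction handles the remaining undersampling.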

Relevance: 10.00%

Abstract:

Many models proposed to study the evolution of collective action rely on a formalism that represents social interactions as n-player games between individuals adopting discrete actions such as cooperate and defect. Despite the importance of spatial structure in biological collective action, the analysis of n-player games in spatially structured populations has so far proved elusive. We address this problem by considering mixed strategies and by integrating discrete-action n-player games into the direct fitness approach of social evolution theory. This allows us to conveniently identify convergence stable strategies and to capture the effect of population structure by a single structure coefficient, namely, the pairwise (scaled) relatedness among interacting individuals. As an application, we use our mathematical framework to investigate collective action problems associated with the provision of three different kinds of collective goods, paradigmatic of a vast array of helping traits in nature: "public goods" (both providers and shirkers can use the good, e.g., alarm calls), "club goods" (only providers can use the good, e.g., participation in collective hunting), and "charity goods" (only shirkers can use the good, e.g., altruistic sacrifice). We show that relatedness promotes the evolution of collective action in different ways depending on the kind of collective good and its economies of scale. Our findings highlight the importance of explicitly accounting for relatedness, the kind of collective good, and the economies of scale in theoretical and empirical studies of the evolution of collective action.
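As a hedged sketch of the kind of condition this framework yields (notation mine, not necessarily the paper's): for a mixed strategy $z$, a convergence stable strategy $z^*$ satisfies

$$S(z^*) \;=\; \left.\frac{\partial w}{\partial z_{\bullet}}\right|_{z^*} \;+\; \kappa \left.\frac{\partial w}{\partial z_{\circ}}\right|_{z^*} \;=\; 0, \qquad S'(z^*) < 0,$$

where $w$ is a focal individual's fitness, $z_{\bullet}$ its own strategy, $z_{\circ}$ a co-player's strategy, and $\kappa$ the pairwise scaled relatedness acting as the single structure coefficient mentioned above.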

Relevance: 10.00%

Abstract:

We conducted a manikin study assessing the quality and speed of intubation with the Airtraq, used with its new iPhone AirView app, versus the King Vision. The primary endpoint was the time needed for intubation; secondary endpoints included the times required for the individual steps of intubation. Thirty anaesthetists each randomly performed three intubations with each device on a difficult airway manikin. Participants had 12 years of professional experience; 60.0% had the Airtraq available in their hospital, 46.7% the King Vision, and 20.0% both. The median time difference [IQR] to identify the glottis (1.1 [-1.3; 3.9], P = 0.019), for tube insertion (2.1 [-2.6; 9.4], P = 0.002) and for lung ventilation (2.8 [-2.4; 11.5], P = 0.001) was shorter with the Airtraq-AirView. The median time for glottis visualization was significantly shorter with the Airtraq-AirView (5.3 [4.0; 8.4] versus 6.4 [4.6; 9.1]). The Cormack-Lehane grade before intubation was better with the King Vision (P = 0.03); no difference was noted during intubation, for subjective ease of device insertion, or for the quality of epiglottis visualisation. Assessment of tracheal tube insertion was better with the Airtraq-AirView. The Airtraq-AirView thus allows faster identification of the landmarks and faster intubation in a difficult airway manikin, and anaesthetists rated the intubation better with it, though the clinical relevance remains to be studied.

Relevance: 10.00%

Abstract:

We study the interaction of vector mesons with the octet of stable baryons in the framework of the local hidden gauge formalism, using a coupled channels unitary approach. We examine the scattering amplitudes and their poles, which can in some cases be associated with known J^P = 1/2^-, 3/2^- baryon resonances, or provide predictions in others. The formalism employed produces doublets of degenerate J^P = 1/2^-, 3/2^- states, a pattern that is observed experimentally in several cases. The findings of this work should also be useful in guiding present experimental programs searching for new resonances, in particular in the strange sector, where the current information is very poor.
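For orientation, the unitarization step at the core of such coupled channels approaches is usually written (assuming the standard on-shell factorization; conventions vary):

$$T = \left[\,1 - V\,G\,\right]^{-1} V,$$

where $V$ is the matrix of tree-level vector meson-baryon transition potentials derived from the local hidden gauge Lagrangian, $G$ is the diagonal matrix of vector-baryon loop functions, and resonances appear as poles of $T$ in the complex energy plane, with couplings read off from the residues.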

Relevance: 10.00%

Abstract:

Post-testicular sperm maturation occurs in the epididymis. The ion concentrations and proteins secreted into the epididymal lumen, together with testicular factors, are believed to be responsible for the maturation of spermatozoa. Disrupting the maturation of spermatozoa in the epididymis is a promising strategy for developing a male contraceptive; however, little is known about the proteins involved. For drug development, it is also essential to have tools to study the function of these proteins in vitro. One approach for screening novel targets is to study the secretory products of the epididymis or the G protein-coupled receptors (GPCRs) involved in the maturation process of the spermatozoa.

A modified Ca2+ imaging technique used to monitor release from PC12 pheochromocytoma cells can also be applied to monitor secretory products involved in the maturational processes of spermatozoa. PC12 pheochromocytoma cells were chosen for the evaluation of this technique as they release catecholamines from their cell body, thus behaving like endocrine secretory cells. The results demonstrate that depolarisation of nerve growth factor-differentiated PC12 cells releases factors that activate nearby, randomly distributed HEL erythroleukemia cells. Thus, during the release process, the ligands reach concentrations high enough to activate receptors even in cells at some distance from the release site, suggesting that communication between randomly dispersed cells is possible even if the actual quantities of transmitter released are extremely small.

The development of a novel method to analyse GPCR-dependent Ca2+ signalling in living slices of mouse caput epididymis provides an additional tool for screening drug targets. With this technique it was possible to analyse functional GPCRs in the epithelial cells of the epididymal duct. The results revealed that both P2X- and P2Y-type purinergic receptors are responsible for the rapid and transient Ca2+ signal detected in the epithelial cells of caput epididymides. Immunohistochemical and reverse transcriptase-polymerase chain reaction (RT-PCR) analyses showed the expression of at least the P2X1, P2X2, P2X4, P2X7, P2Y1 and P2Y2 receptors in the epididymis.

The search for epididymis-specific promoters for transgene delivery into the epididymis is of key importance for the development of specific models for drug development. We used EGFP as the reporter gene to identify suitable promoters for delivering transgenes into the epithelial cells of the mouse epididymis in vivo. Our results revealed that the 5.0 kb murine glutathione peroxidase 5 (GPX5) promoter can be used to target transgene expression to the epididymis, while the 3.8 kb cysteine-rich secretory protein-1 (CRISP-1) promoter can be used to target transgene expression to the testis. Although the visualisation of EGFP in living cells in culture usually poses few problems, the detection of EGFP in tissue sections can be more difficult, because soluble EGFP molecules can be lost if the cell membrane is damaged by freezing, sectioning or permeabilisation. Furthermore, the fluorescence of EGFP depends on its conformation, so fixation protocols that immobilise EGFP may also destroy its usefulness as a fluorescent reporter. We therefore developed a novel tissue preparation and preservation technique for EGFP. In addition, fluorescence spectrophotometry of epididymal epithelial cells in suspension revealed the expression of functional purinergic, adrenergic, cholinergic and bradykinin receptors in these cell lines (mE-Cap27 and mE-Cap28).

In conclusion, we developed new tools for studying the role of the epididymis in sperm maturation: a technique to analyse GPCR-dependent Ca2+ signalling in living slices of mouse caput epididymis, an improved method for detecting reporter gene expression, the characterisation of two epididymis-specific gene promoters, the analysis of GPCR expression in epididymal epithelial cells, and a novel technique for measuring secretion from cells.

Relevance: 10.00%

Abstract:

This thesis compares a first-time filmmaker's music video project with professional music video production. In the production component, I served as scriptwriter, director and producer. The thesis briefly recounts the history and development of the music video internationally and in Finland, and walks through the entire production arc of the music video: choosing the song, scriptwriting, planning the visual storytelling, selecting the crew and performers, the shooting days, and post-production. Each production phase is compared with professional music video production, and the thesis considers how every part of the production arc was accomplished in a professional manner even though the non-existent budget posed challenges for the process. The purpose of a music video is to promote an artist or band and create an image for them; an artist's style is determined mainly by the music they perform. The thesis compares different music video styles and considers why a narrative style was chosen for this particular video, and it also examines how the shot planning served the music video's story. The events of the shooting days are reviewed step by step, describing the problems the process produced and how they were solved. The appendices include the music video's script, the shooting schedule and part of the storyboard. The thesis includes a production component, the music video Mitä jos...

Relevance: 10.00%

Abstract:

Biomedical research is currently facing a new type of challenge: an excess of information, both in terms of raw data from experiments and in the number of scientific publications describing their results. Mirroring the focus on data mining techniques to address the issues of structured data, there has recently been great interest in the development and application of text mining techniques to make more effective use of the knowledge contained in biomedical scientific publications, accessible only in the form of natural human language. This thesis describes research done in the broader scope of projects aiming to develop methods, tools and techniques for text mining tasks in general and for the biomedical domain in particular. The work described here specifically targets the extraction of information from statements concerning relations of biomedical entities, such as protein-protein interactions. The approach taken uses full parsing (syntactic analysis of the entire structure of sentences) and machine learning, aiming to develop reliable methods that can be further generalized to other domains.

The five papers at the core of this thesis describe research on a number of distinct but related topics in text mining. In the first of these studies, we assessed the applicability of two popular general-English parsers to biomedical text mining and, finding their performance limited, identified several specific challenges to the accurate parsing of domain text. In a follow-up study focusing on parsing issues related to specialized domain terminology, we evaluated three lexical adaptation methods. We found that the accurate resolution of unknown words can considerably improve parsing performance, and introduced a domain-adapted parser that reduced the error rate of the original by 10% while also roughly halving parsing time. To establish the relative merits of parsers that differ in the applied formalisms and in the representation given to their syntactic analyses, we also developed evaluation methodology, considering different approaches to establishing comparable dependency-based evaluation results. We introduced a methodology for creating highly accurate conversions between different parse representations, demonstrating the feasibility of unifying diverse syntactic schemes under a shared, application-oriented representation. In addition to allowing formalism-neutral evaluation, we argue that such unification can also increase the value of parsers for domain text mining. As a further step in this direction, we analysed the characteristics of publicly available biomedical corpora annotated for protein-protein interactions and created tools for converting them into a shared form, thus contributing also to the unification of text mining resources. The introduced unified corpora allowed us to perform a task-oriented comparative evaluation of biomedical text mining corpora. This evaluation established clear limits on the comparability of results for text mining methods evaluated on different resources, prompting further efforts toward standardization.

To support this and other research, we also designed and annotated BioInfer, the first domain corpus of its size combining annotation of syntax and biomedical entities with a detailed annotation of their relationships. The corpus represents a major design and development effort of the research group, with manual annotation identifying over 6000 entities, 2500 relationships and 28,000 syntactic dependencies in 1100 sentences. In addition to combining these key annotations for a single set of sentences, BioInfer was also the first domain resource to introduce a representation of entity relations that is supported by ontologies and able to capture complex, structured relationships. Part I of this thesis presents a summary of this research in the broader context of a text mining system, and Part II contains reprints of the five included publications.
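To make the shared-representation idea concrete, here is a hedged sketch (my illustration, not the thesis tooling): parses from two formalisms are normalized to (head, dependent, label) triples in a common scheme and compared attachment by attachment.

```python
# Illustrative only: a minimal shared dependency representation and an
# attachment-agreement measure between two parses of the same sentence.
from dataclasses import dataclass

@dataclass(frozen=True)
class Dep:
    head: int       # token index of the head (0 = artificial root)
    dependent: int  # token index of the dependent
    label: str      # dependency type in the shared scheme

def to_shared(parse, label_map):
    """Map one parser's (head, dependent, label) triples to the shared scheme."""
    return {Dep(h, d, label_map.get(lab, lab)) for (h, d, lab) in parse}

def attachment_agreement(a, b):
    """Fraction of dependents attached identically (unlabeled, labeled)."""
    heads_a = {d.dependent: d.head for d in a}
    heads_b = {d.dependent: d.head for d in b}
    shared = set(heads_a) & set(heads_b)
    unlabeled = sum(heads_a[t] == heads_b[t] for t in shared) / len(shared)
    labeled = len(a & b) / len(shared)
    return unlabeled, labeled

# Example: "proteins interact" parsed by two parsers with different label sets
p1 = to_shared({(2, 1, "SUB")}, {"SUB": "nsubj"})
p2 = to_shared({(2, 1, "nsubj")}, {})
print(attachment_agreement(p1, p2))  # (1.0, 1.0)
```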

Relevance: 10.00%

Abstract:

The 4πβ-γ coincidence counting method and its close relatives are widely used for the primary standardization of radioactivity. Both the general formalism and specific implementations of these methods have been well documented. In particular, previous papers contain the extrapolation equations used for various decay schemes, methods for determining model parameters and, in some cases, tabulated uncertainty budgets. Two things often lacking from experimental reports are the rationale for estimating uncertainties in a specific way and the details of exactly how a specific component of uncertainty was estimated. Furthermore, correlations among the components of uncertainty are rarely mentioned. To fill in these gaps, the present article shares the best practices of a few practitioners of this craft. We explain, and demonstrate with examples, how these approaches can be used to estimate the uncertainty of the reported massic activity. We describe uncertainties due to measurement variability, extrapolation functions, dead-time and resolving-time effects, gravimetric links, and nuclear and atomic data. Most importantly, a thorough understanding of the measurement system and its response to the decay under study can be used to derive a robust estimate of the measurement uncertainty.
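For orientation, the idealized relation underlying these methods (textbook form; the paper's extrapolation equations are decay-scheme specific): for a simple β-γ emitter with source activity $N_0$ and channel efficiencies $\varepsilon_\beta$, $\varepsilon_\gamma$,

$$N_\beta = N_0\,\varepsilon_\beta,\qquad N_\gamma = N_0\,\varepsilon_\gamma,\qquad N_c = N_0\,\varepsilon_\beta\,\varepsilon_\gamma \quad\Longrightarrow\quad \frac{N_\beta\,N_\gamma}{N_c} = N_0.$$

In practice the activity is obtained by extrapolating the measured rate against an inefficiency parameter such as $(1-\varepsilon_\beta)/\varepsilon_\beta$, with $\varepsilon_\beta \approx N_c/N_\gamma$, to zero inefficiency; the choice of that extrapolation function is one of the uncertainty components discussed above.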

Relevance: 10.00%

Abstract:

This thesis develops a comprehensive and flexible statistical framework for the analysis and detection of spatial, temporal and space-time clusters in environmental point data. The clustering methods developed were applied both to simulated datasets and to real-world environmental phenomena; however, only the cases of forest fires in the Canton of Ticino (Switzerland) and in Portugal are expounded in this document.

Environmental phenomena can typically be modelled as stochastic point processes in which each event, e.g. a forest fire ignition point, is characterised by its spatial location and its occurrence in time. Additional information such as burned area, ignition causes, land use, and topographic, climatic and meteorological features can also be used to characterise the studied phenomenon. The characterisation of space-time patterns thereby represents a powerful tool for understanding the distribution and behaviour of the events and their correlation with underlying processes, for instance socio-economic, environmental and meteorological factors. Consequently, we propose a methodology based on the adaptation and application of statistical and fractal point process measures for both global analysis (e.g. the Morisita index, the box-counting fractal method, the multifractal formalism and Ripley's K-function) and local analysis (e.g. scan statistics). Many measures describing the space-time distribution of environmental phenomena have been proposed in a wide variety of disciplines; nevertheless, most are global in character and do not consider complex spatial constraints, high variability and the multivariate nature of the events. We therefore propose a statistical framework that takes into account the complexities of the geographical space in which phenomena take place, by introducing the Validity Domain concept and by carrying out clustering analyses on data with differently constrained geographical spaces, thereby assessing the relative degree of clustering of the real distribution. Moreover, specifically for the forest fire case, this research proposes two new methodologies for defining and mapping the Wildland-Urban Interface (WUI), described as the interaction zone between burnable vegetation and anthropogenic infrastructure, and for predicting fire ignition susceptibility.

The main objective of this thesis was thus to carry out basic statistical and geospatial research with a strong applied component, in order to analyse and describe complex phenomena and to overcome unsolved methodological problems in the characterisation of space-time patterns, in particular forest fire occurrences. The thesis thereby responds to the increasing demand for environmental monitoring and management tools for the assessment of natural and anthropogenic hazards and risks, sustainable development, and retrospective success analysis. The major contributions of this work were presented at national and international conferences and published in five scientific journals; national and international collaborations were also established and successfully accomplished.
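As a minimal sketch of one of the global measures listed above (a naive Ripley's K estimator; real analyses need edge corrections and, in this thesis, the Validity Domain constraint):

```python
import numpy as np

def ripley_k(points, r_values, area):
    """Naive Ripley's K estimator for a 2D point pattern.

    points : (n, 2) array of event coordinates (e.g., fire ignition points)
    area   : area of the study region (here just a number; the thesis
             constrains it with the Validity Domain concept)
    Edge effects are ignored, so values near the border are biased.
    """
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # exclude self-pairs
    lam = n / area               # intensity estimate
    return np.array([(d <= r).sum() / (lam * n) for r in r_values])

# Under complete spatial randomness K(r) ~ pi * r**2; values above that
# indicate clustering at scale r, values below indicate inhibition.
rng = np.random.default_rng(0)
pts = rng.uniform(0, 1, size=(200, 2))
print(ripley_k(pts, [0.05, 0.1], area=1.0))
```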

Relevance: 10.00%

Abstract:

Defining the digital humanities might be an endless debate if we stick to discussing the boundaries of this concept as an academic "discipline". In an attempt to identify this field and its actors concretely, this paper shows that it is possible to analyse them through Twitter, a social media platform widely used by this "community of practice". Based on a network analysis of 2,500 users identified as members of this movement, the visualisation of the "who's following whom?" graph allows us to highlight the structure of the network's relationships and to identify users whose positions are distinctive. Specifically, we show that linguistic groups are key factors in explaining clustering within a network whose characteristics resemble a small world.
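A hedged sketch of the kind of small-world and clustering diagnostics involved, on toy data (networkx assumed; the paper's 2,500-user graph and exact methods may differ):

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy follower graph: an edge u -> v means "u follows v".
G = nx.fast_gnp_random_graph(500, 0.02, seed=1, directed=True)

# Small-world diagnostics are computed here on the undirected skeleton.
U = G.to_undirected()
C = nx.average_clustering(U)
L = nx.average_shortest_path_length(U)   # assumes U is connected

# Reference: a random graph of the same size and density has low clustering;
# "small world" = much higher C at a comparable average path length L.
R = nx.fast_gnp_random_graph(500, 0.02, seed=2)
print(C, nx.average_clustering(R), L)

# Clusters (e.g., linguistic groups) can be sought by community detection:
communities = greedy_modularity_communities(U)
print([len(c) for c in communities[:5]])
```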

Relevance: 10.00%

Abstract:

Whether from an urban planning, social or governance point of view, the evolution of cities is a major challenge for our contemporary societies. By offering the possibility to analyse existing spatial and social configurations or to simulate future ones, geographic information systems have become indispensable in urban management and planning. In five years, the population of the city of Lausanne has grown from 134'700 to 140'570 inhabitants, while public school enrolment has increased from 12'200 to 13'500 students. This demographic growth, together with a wide-ranging harmonisation of compulsory schooling in Switzerland, has driven the school service, in collaboration with the University of Lausanne, to set up and develop GIS solutions capable of tackling various spatial issues. Established in 1989, the school district boundaries (catchment areas) had to be redefined to fit the reality of a continuously changing urban and political landscape. In a context of mobility and sustainability, a system for awarding public transport subsidies, based on the home-to-school distance and the age of the students, was designed. The implementation of these projects required the construction of geographical databases as well as the elaboration of the new analysis methods presented in this thesis, which thus proceeded through a permanent dialectic between theoretical research and practical needs.

The first part of this work focuses on the analysis of the city's pedestrian network. The morphology of the network is investigated through multi-scale approaches to the concept of centrality. The first conception, straightness centrality, stipulates that being central is being connected to the others in a straight line. The second, undoubtedly more intuitive, is called closeness centrality and expresses the fact that being central is being close to the others. The methods developed aim to evaluate the connectivity and walkability of the network while suggesting possible improvements (creation of pedestrian shortcuts). The third and final theoretical section develops an algorithm for regularised optimal transport. By minimising home-to-school distances while respecting school capacities, the algorithm produces student allocation scenarios. The implementation of Lagrange multipliers offers a visualisation of the spatial cost associated with the school infrastructures and the students' places of residence.

The second part of this thesis recounts the principal aspects of three projects carried out in the context of school management: the construction of the public transport subsidy system, the redefinition of the school district map, and the simulation of student pedestrian flows to school.
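A minimal sketch of the two centralities as defined above (illustrative, not the thesis code; assumes an undirected network with metric edge lengths and node coordinates):

```python
import math
import networkx as nx

def straightness_centrality(G, pos):
    """Average ratio of straight-line to network distance from each node.

    G   : undirected graph with an edge attribute "length" (metres)
    pos : dict mapping node -> (x, y) coordinates
    A ratio near 1 means the network connects a node to the others
    almost "in a straight line".
    """
    scores = {}
    for i in G:
        d_net = nx.single_source_dijkstra_path_length(G, i, weight="length")
        ratios = [math.dist(pos[i], pos[j]) / d for j, d in d_net.items()
                  if j != i and d > 0]
        scores[i] = sum(ratios) / (len(G) - 1)
    return scores

# Closeness ("being close to the others") is available directly:
# nx.closeness_centrality(G, distance="length")
```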

Relevance: 10.00%

Abstract:

This review collects and correlates the various equations for the g matrix of strong-field d5 systems obtained from different basis sets using full-electron and hole-formalism calculations. It corrects mistakes found in the literature and shows how failure to properly take symmetry boundary conditions into account has produced a variety of apparently inconsistent equations in the literature. The review re-examines the problem of spin-orbit interaction with excited t4e states and finds that earlier reports that it is zero in octahedral symmetry are not correct. It also shows how redefining the x, y, and z axes of the principal coordinate system simplifies, compared to previous methods, the analysis of experimental g values with the equations.

Relevance: 10.00%

Abstract:

This paper deals with Carathéodory's formulation of the second law of thermodynamics. The material is presented in a didactic way that allows a second-year undergraduate student to follow the formalism. An application is made to an ideal gas with two independent variables. A criticism of Carnot's formulation of the second law and an investigation of the historical origins of the Carathéodory formalism are also presented.
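For flavor, the standard ideal-gas computation of this kind (one mole, with $T$ and $V$ as the two independent variables; the paper's details may differ): the heat form

$$\delta Q = C_V\,dT + \frac{RT}{V}\,dV$$

is not an exact differential, but dividing by $T$ (the integrating factor whose existence Carathéodory's axiom guarantees from the adiabatic inaccessibility of neighbouring states) gives

$$dS = \frac{\delta Q}{T} = C_V\,\frac{dT}{T} + R\,\frac{dV}{V},$$

which is exact and yields the entropy $S = C_V \ln T + R \ln V + \text{const}$.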

Relevance: 10.00%

Abstract:

Statistical mechanics Monte Carlo simulation is reviewed as a formalism for studying the thermodynamic properties of liquids. Considering the importance of free energy changes in chemical processes, the thermodynamic perturbation theory implemented in the Monte Carlo method is discussed. The representation of molecular interactions by the Lennard-Jones and Coulomb potential functions is also discussed. Charges derived from the quantum molecular electrostatic potential are presented as a useful methodology for generating an adequate set of partial charges for use in liquid simulations.
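As an illustrative sketch of the Metropolis Monte Carlo formalism for a Lennard-Jones liquid (reduced units, toy parameters; not drawn from the reviewed implementations):

```python
import numpy as np

rng = np.random.default_rng(0)

def lj_energy(r2):
    """Lennard-Jones pair energy for squared distances, reduced units."""
    inv6 = 1.0 / r2**3
    return 4.0 * (inv6**2 - inv6)

def total_energy(x, box):
    """Total potential energy with the minimum-image convention."""
    e = 0.0
    for i in range(len(x) - 1):
        d = x[i + 1:] - x[i]
        d -= box * np.round(d / box)
        e += lj_energy((d * d).sum(axis=1)).sum()
    return e

# Tiny NVT Metropolis run: 27 particles, reduced T* = 1.5 (toy parameters).
n, box, beta, step = 27, 3.0, 1.0 / 1.5, 0.1
x = rng.uniform(0.0, box, (n, 3))
e = total_energy(x, box)
for _ in range(2000):
    i = rng.integers(n)
    old = x[i].copy()
    x[i] = (x[i] + rng.uniform(-step, step, 3)) % box
    de = total_energy(x, box) - e
    if de < 0 or rng.random() < np.exp(-beta * de):
        e += de          # accept the trial move
    else:
        x[i] = old       # reject: restore the old position
# The thermodynamic perturbation theory mentioned above (Zwanzig) averages
# exp(-beta * dU) between two potentials over configurations sampled this way:
# dA = -kT * ln < exp(-beta * dU) >_0
print("potential energy per particle:", e / n)
```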

Relevance: 10.00%

Abstract:

Consensus is gathering that antimicrobial peptides that exert their antibacterial action at the membrane level must reach a local concentration threshold to become active. Studies of peptide interaction with model membranes do identify such disruptive thresholds, but demonstrations of a possible correlation of these with the in vivo onset of activity have only recently been proposed. In addition, such thresholds observed in model membranes occur at local peptide concentrations close to full membrane coverage. In this work we fully develop an interaction model of antimicrobial peptides with biological membranes; by exploring the consequences of the underlying partition formalism, we arrive at a relationship that predicts antibacterial activity from two biophysical parameters: the affinity of the peptide for the membrane and the critical bound peptide-to-lipid ratio. A straightforward and robust method to implement this relationship, with potential application to high-throughput screening approaches, is presented and tested. In addition, disruptive thresholds in model membranes and the onset of antibacterial peptide activity are shown to occur over the same range of locally bound peptide concentrations (10 to 100 mM), which reconciles the two types of observations.
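A hedged sketch of how such a relationship can follow from the partition formalism (my notation; the paper's exact expression may differ). With partition constant $K_p$ and lipid molar volume $\gamma_L$, the bound and free aqueous peptide concentrations satisfy

$$[P]_b = K_p\,\gamma_L\,[L]\,[P]_f,$$

and if activity requires the critical bound peptide-to-lipid ratio $\sigma^* = [P]_b/[L]$, the total peptide concentration at the onset of activity (the predicted MIC) is

$$[P]_{\mathrm{MIC}} = [P]_f + [P]_b = \sigma^*\!\left(\frac{1}{K_p\,\gamma_L} + [L]\right),$$

so activity is predicted from exactly the two biophysical parameters named above: the affinity $K_p$ and the threshold ratio $\sigma^*$.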