Abstract:
Optimization of wave functions in quantum Monte Carlo is a difficult task because the statistical uncertainty inherent to the technique hampers absolute determination of the global minimum. To optimize these wave functions we generate a large number of possible minima using many independently generated Monte Carlo ensembles and perform a conjugate gradient optimization. Then we construct histograms of the resulting nominally optimal parameter sets and "filter" them to identify which parameter sets "go together" to generate a local minimum. We follow with correlated-sampling verification runs to find the global minimum. We illustrate this technique for variance and variational energy optimization for a variety of wave functions for small systems. For such optimized wave functions we calculate the variational energy and variance as well as various non-differential properties. The optimizations are either on par with or superior to determinations in the literature. Furthermore, we show that this technique is sufficiently robust that for molecules one may determine the optimal geometry at the same time as one optimizes the variational energy.
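A minimal sketch of the histogram "filtering" idea described above, using a toy one-parameter objective with an ensemble-dependent tilt standing in for Monte Carlo statistical noise (not the authors' wave functions or code):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy "variance" surface with two local minima near p = -1 and p = +1; each
# independently generated ensemble adds a small random tilt that shifts the
# nominally optimal parameter, mimicking Monte Carlo statistical uncertainty.
def ensemble_objective(tilt):
    return lambda p: (p[0] ** 2 - 1.0) ** 2 + tilt * p[0]

optima = []
for seed in range(200):                     # one conjugate-gradient run per ensemble
    tilt = np.random.default_rng(seed).normal(scale=0.05)
    start = rng.uniform(-2.0, 2.0, size=1)
    res = minimize(ensemble_objective(tilt), start, method="CG")
    optima.append(res.x[0])

# Histogram the nominally optimal parameters; well-populated bins mark the
# parameter values that "go together" and are candidate local minima.
counts, edges = np.histogram(optima, bins=40, range=(-2.0, 2.0))
centres = 0.5 * (edges[:-1] + edges[1:])
candidates = centres[counts > 5]
print("candidate minima near:", np.round(candidates, 2))
```

In the actual procedure these candidates would then be compared with correlated-sampling verification runs to pick out the global minimum.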
Abstract:
The purpose of this project is to provide social service practitioners with tools and perspectives to engage young people in a process of developing and connecting with their own personal narratives, and storytelling with others. This project extensively reviews the literature to explore Why Story, What Is Story, Future Directions of Story, and Challenges of Story. Anchoring this exploration is Freire’s (1970/2000) intentional uncovering and decoding. Taking a phenomenological approach, I draw additionally on Brookfield’s (1995) critical reflection; Delgado (1989) and McLaren (1998) for subversive narrative; and Robin (2008) and Sadik (2008) for digital storytelling. The recommendations provided within this project include a practical model built upon Baxter Magolda and King’s (2004) process towards self-authorship for engaging in an exercise of storytelling that is accessible to practitioners and young people alike. A personal narrative that aims to help connect lived experience with the theoretical content underscores this project. I call for social service practitioners to engage their own personal narratives in an inclusive and purposeful storytelling method that enhances their ability to help the young people they serve develop and share their stories.
Abstract:
Affiliation: Centre Robert-Cedergren de l'Université de Montréal en bio-informatique et génomique & Département de biochimie, Université de Montréal
Abstract:
Affiliation: Département de Biochimie, Université de Montréal
Abstract:
[Support Institutions:] Department of Administration of Health, University of Montreal, Canada; Public Health School of Fudan University, Shanghai, China
Abstract:
Nucleosome assembly is tightly coupled to histone synthesis as well as to DNA replication and repair during S phase. This process involves a control mechanism that contributes carefully and in a regulated manner to the assembly of DNA into chromatin. Nucleosome assembly during DNA synthesis is crucial and thereby contributes to the maintenance of genomic stability. This thesis describes the mass spectrometry (MS) characterization of proteins that play a critical role in the assembly and maintenance of chromatin structure. Specifically, we examined the phosphorylation of two nucleosome assembly factors: CAF-1, a histone chaperone that participates in chromatin assembly specifically coupled to DNA replication, and the Hir protein complex, which also plays an important role in the transcriptional regulation of histone genes during normal cell cycle progression and in response to DNA damage. Characterization of phosphorylation sites by MS requires separation of the proteins by electrophoresis followed by silver staining. In Chapter 2, we show that silver staining induces a sulfation artifact, caused by a specific reagent used during staining. Sulfation closely resembles phosphorylation: the mass increment observed on sulfated and phosphorylated peptides (+80 Da) requires instruments offering high resolution and high mass accuracy to distinguish the two modifications. In Chapters 3 and 4, we first demonstrated by MS that Cac1, the largest subunit of CAF-1, carries several phosphorylation sites. Interestingly, some of these sites contain consensus sequences for the Cdc7-Dbf4 and CDK kinases, providing the first evidence that CAF-1 is potentially regulated by these two kinases in vivo. The function of all identified phosphorylation sites was then evaluated. We demonstrated that phosphorylation of Ser-503, a DDK consensus site, is essential for transcriptional gene silencing at telomeres. However, this phosphorylation does not appear to be required for other known functions of CAF-1, indicating that blocking phosphorylation of Cac1 Ser-503 specifically affects the function of CAF-1 at telomeric heterochromatin structures. We then identified a physical interaction between CAF-1 and Cdc7-Dbf4. In vitro studies also showed that this kinase specifically phosphorylates Cac1 Ser-503, suggesting a potential role for the Cdc7-Dbf4 kinase in the assembly and stability of heterochromatin structure at telomeres. Finally, MS analyses also allowed us to show that the Hpc2 subunit of the Hir complex is phosphorylated on several CDK and Cdc7-Dbf4 consensus sites. Moreover, MS quantification of a specific Hpc2 phosphorylation site, Ser-330, revealed that its phosphorylation is strongly induced upon activation of the replication checkpoint following DNA damage. We show that Hpc2 Ser-330 is phosphorylated by the checkpoint kinases in a Mec1/Tel1- and Rad53-dependent manner.
Our preliminary data thus suggest that the ability of the Hir complex to regulate the transcriptional repression of histone genes during normal cell cycle progression and in response to DNA damage is mediated by the phosphorylation of Hpc2 by these two kinases. Together, these two studies highlight the importance of mass spectrometry in the characterization of protein phosphorylation sites, allowing us to understand more precisely the mechanisms regulating chromatin assembly and histone synthesis.
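As an aside on why high resolution and mass accuracy are needed here, a short calculation (standard monoisotopic masses; the peptide m/z is an arbitrary example, not a value from the thesis) of the exact mass difference between the phospho (HPO3) and sulfo (SO3) adducts:

```python
# Nominal +80 Da for both modifications, but the exact monoisotopic masses
# differ, which is what high-resolution, high-mass-accuracy MS exploits.
H, O, P, S = 1.007825, 15.994915, 30.973762, 31.972071  # monoisotopic masses, Da

phospho = H + P + 3 * O   # HPO3 added by phosphorylation, ~79.96633 Da
sulfo = S + 3 * O         # SO3 added by sulfation,        ~79.95682 Da
delta = phospho - sulfo   # ~0.0095 Da

mz = 1000.0               # example peptide m/z (hypothetical)
print(f"phospho {phospho:.5f} Da, sulfo {sulfo:.5f} Da, difference {delta:.4f} Da")
print(f"resolving power needed at m/z {mz:.0f}: ~{mz / delta:,.0f}")
```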
Abstract:
Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal
Advances in therapeutic risk management through signal detection and risk minimisation tool analyses
Abstract:
The four main activities of therapeutic risk management are risk identification, assessment, minimisation, and communication. This thesis addresses issues related to risk identification and minimisation through two studies whose objectives were to: 1) develop and validate a data mining tool for signal detection from Quebec healthcare databases; 2) conduct a systematic review to characterise the risk minimisation interventions (RMIs) that have been implemented. The signal detection tool is based on the maximized sequential probability ratio test (MaxSPRT), applied to dispensed-drug and medical-services data collected in a retrospective cohort of 87,389 community-dwelling elderly members of the Quebec health insurance plan between 2000 and 2009. Four known drug–adverse event (AE) associations and two negative controls were used. The systematic review drew on the published literature as well as the websites of six major regulatory agencies. The nature of the RMIs was described and gaps in their implementation were identified. The analytic method detected a signal for one of the four drug–AE combinations. The main contributions are: a) the first signal detection tool based on Canadian administrative databases; b) methodological contributions through accounting for depletion of susceptibles and controlling for patient health status. The review identified 119 RMIs in the literature and 1,112 RMIs on regulatory agency websites. The review showed that RMIs have increased since the introduction of regulatory guidance in 2005, but their effectiveness remains poorly demonstrated.
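For context, a minimal sketch of the Poisson-based MaxSPRT statistic that underlies this kind of sequential signal detection; the counts, expected values, and signalling threshold below are illustrative assumptions, not results from the thesis:

```python
import math

def max_sprt_llr(observed: int, expected: float) -> float:
    """Poisson MaxSPRT log-likelihood ratio at a single look; zero when the
    observed count does not exceed its expectation under the null."""
    if expected <= 0 or observed <= expected:
        return 0.0
    return observed * math.log(observed / expected) - (observed - expected)

# Hypothetical cumulative observed / expected adverse-event counts per look.
looks = [(2, 1.8), (6, 3.6), (11, 5.4), (18, 7.2)]
threshold = 3.0  # illustrative critical value; in practice it is derived
                 # exactly for the chosen alpha and surveillance length

for i, (obs, exp) in enumerate(looks, start=1):
    llr = max_sprt_llr(obs, exp)
    status = "SIGNAL" if llr >= threshold else "continue"
    print(f"look {i}: observed={obs}, expected={exp:.1f}, LLR={llr:.2f} -> {status}")
```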
Abstract:
The dramatic growth of online securities trading holds great potential for investors, as well as for the securities industry in general. Given its particular risks, regulators face a significant challenge with the Internet as a new means of investing. Nevertheless, despite technological change, the fundamental objectives and approach of regulators remain much the same as they are today. This article analyses the impact of the Internet on securities trading, focusing on the issues raised by the use of this new means of communication in the context of the secondary market. Its objective is therefore to draw a portrait of typical investor complaints, as well as of securities fraud specific to cyberspace. The author synthesises recent developments by analysing the regulators' approach, scholarly studies, case law, and administrative decisions. The author wishes to thank Professor Raymonde Crête for her valuable comments and advice.
Abstract:
This research project is a contribution to the global field of information retrieval, specifically, to develop tools to enable information access in digital documents. We recognize the need to provide the user with flexible access to the contents of large, potentially complex digital documents, with means other than a search function or a handful of metadata elements. The goal is to produce a text browsing tool offering a maximum of information based on a fairly superficial linguistic analysis. We are concerned with a type of extensive single-document indexing, and not indexing by a set of keywords (see Klement, 2002, for a clear distinction between the two). The desired browsing tool would not only give at a glance the main topics discussed in the document, but would also present relationships between these topics. It would also give direct access to the text (via hypertext links to specific passages). The present paper, after reviewing previous research on this and similar topics, discusses the methodology and the main characteristics of a prototype we have devised. Experimental results are presented, as well as an analysis of remaining hurdles and potential applications.
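As a rough illustration of the kind of shallow analysis such a browsing tool could build on (not the prototype described in the paper; the tokenisation and scoring choices are assumptions), the sketch below extracts frequent content terms, links terms that co-occur in the same sentence, and keeps sentence offsets so each topic can point back into the text:

```python
import re
from collections import Counter, defaultdict

STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "for", "on",
             "with", "that", "this", "as", "are", "by", "be", "or", "we",
             "during", "it", "its"}

def topic_map(text, top_n=10):
    """Frequent content terms, same-sentence relationships, and the sentence
    indices where each term occurs (usable as hypertext anchors)."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    freq, cooc, anchors = Counter(), defaultdict(Counter), defaultdict(list)
    for i, sent in enumerate(sentences):
        words = {w for w in re.findall(r"[a-z]+", sent.lower())
                 if w not in STOPWORDS and len(w) > 3}
        freq.update(words)
        for w in words:
            anchors[w].append(i)          # sentence index = link target
            for v in words - {w}:
                cooc[w][v] += 1           # same-sentence relationship
    topics = [w for w, _ in freq.most_common(top_n)]
    return {t: {"sentences": anchors[t],
                "related": [v for v, _ in cooc[t].most_common(3)]}
            for t in topics}

sample = ("Nucleosome assembly is coupled to histone synthesis. "
          "Histone chaperones regulate nucleosome assembly during replication.")
print(topic_map(sample, top_n=4))
```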
Abstract:
Objective To determine overall, test–retest and inter-rater reliability of posture indices among persons with idiopathic scoliosis. Design A reliability study using two raters and two test sessions. Setting Tertiary care paediatric centre. Participants Seventy participants aged between 10 and 20 years with different types of idiopathic scoliosis (Cobb angle 15 to 60°) were recruited from the scoliosis clinic. Main outcome measures Based on the XY co-ordinates of natural reference points (e.g. eyes) as well as markers placed on several anatomical landmarks, 32 angular and linear posture indices taken from digital photographs in the standing position were calculated using specially developed software. Generalisability theory served to estimate the reliability and standard error of measurement (SEM) for the overall, test–retest and inter-rater designs. Bland and Altman's method was also used to document agreement between sessions and raters. Results In the random design, dependability coefficients demonstrated a moderate level of reliability for six posture indices (ϕ = 0.51 to 0.72) and a good level of reliability for 26 posture indices out of 32 (ϕ ≥ 0.79). Error attributable to marker placement was negligible for most indices. Limits of agreement and SEM values were larger for shoulder protraction, trunk list, Q angle, cervical lordosis and scoliosis angles. The most reproducible indices were waist angles and knee valgus and varus. Conclusions Posture can be assessed in a global fashion from photographs in persons with idiopathic scoliosis. Despite the good reliability of marker placement, other studies are needed to minimise measurement errors in order to provide a suitable tool for monitoring change in posture over time.
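For readers unfamiliar with the agreement statistics used here, a minimal sketch with made-up measurements (not study data) of Bland and Altman's limits of agreement and a simple two-session standard error of measurement:

```python
import numpy as np

rng = np.random.default_rng(1)
session1 = rng.normal(30.0, 5.0, size=70)            # hypothetical posture index, session 1
session2 = session1 + rng.normal(0.0, 1.5, size=70)  # session 2 with measurement error

diff = session2 - session1
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

# SEM from a two-session test-retest design:
# SEM = SD(differences) / sqrt(2), a common simplification of the ANOVA-based estimate.
sem = diff.std(ddof=1) / np.sqrt(2)

print(f"bias = {bias:.2f}, 95% limits of agreement = ({loa[0]:.2f}, {loa[1]:.2f})")
print(f"SEM = {sem:.2f}")
```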
Abstract:
STUDY DESIGN: Concurrent validity between postural indices obtained from digital photographs (two-dimensional [2D]), surface topography imaging (three-dimensional [3D]), and radiographs. OBJECTIVE: To assess the validity of a quantitative clinical postural assessment tool of the trunk based on photographs (2D) as compared to a surface topography system (3D) as well as indices calculated from radiographs. SUMMARY OF BACKGROUND DATA: To monitor progression of scoliosis or change in posture over time in young persons with idiopathic scoliosis (IS), noninvasive and nonionizing methods are recommended. In a clinical setting, posture can be quite easily assessed by calculating key postural indices from photographs. METHODS: Quantitative postural indices of 70 subjects aged 10 to 20 years with IS (Cobb angle 15° to 60°) were measured from photographs and from 3D trunk surface images taken in the standing position. Shoulder, scapula, trunk list, pelvis, scoliosis, and waist angle indices were calculated with specially designed software. Frontal and sagittal Cobb angles and trunk list were also calculated on radiographs. The Pearson correlation coefficient (r) was used to estimate concurrent validity of the 2D clinical postural tool of the trunk with indices extracted from the 3D system and with those obtained from radiographs. RESULTS: The correlation between 2D and 3D indices was good to excellent for shoulder, pelvis, trunk list, and thoracic scoliosis (0.81 < r < 0.97; P < 0.01) but fair to moderate for thoracic kyphosis, lumbar lordosis, and thoracolumbar or lumbar scoliosis (0.30 < r < 0.56; P < 0.05). The correlation between 2D and radiograph spinal indices was fair to good (-0.33 to -0.80 with Cobb angles and 0.76 for trunk list; P < 0.05). CONCLUSION: This tool will facilitate clinical practice by monitoring trunk posture among persons with IS. Further, it may contribute to a reduction in the use of radiographs to monitor scoliosis progression.
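A minimal sketch of the concurrent-validity computation itself, i.e. Pearson's r between a photograph-based (2D) index and its surface-topography (3D) counterpart, using simulated values rather than study data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
index_3d = rng.normal(10.0, 4.0, size=70)                 # hypothetical 3D trunk index
index_2d = 0.9 * index_3d + rng.normal(0.0, 1.5, size=70) # 2D photo version with error

# Concurrent validity: strength of linear association between the two methods.
r, p = stats.pearsonr(index_2d, index_3d)
print(f"Pearson r = {r:.2f} (P = {p:.3g})")
```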
Abstract:
In this thesis, the applications of recurrence quantification analysis (RQA) to metal cutting on a lathe, with the specific objective of detecting tool wear and chatter, are presented. This study is based on the discovery that the process dynamics in a lathe is low-dimensional chaotic, which implies that the machine dynamics is controllable using principles of chaos theory. This understanding stands to revolutionize the feature extraction methodologies used in condition monitoring systems, as conventional linear methods or models are incapable of capturing the critical and strange behaviors associated with the metal cutting process. As sensor-based approaches provide an automated and cost-effective way to monitor and control, an efficient feature extraction methodology based on nonlinear time series analysis is in great demand. The task is more complex when the information has to be deduced solely from sensor signals, since traditional methods do not address how to treat the noise present in real-world processes or their non-stationarity. In an effort to overcome these two issues as far as possible, this thesis adopts the recurrence quantification analysis methodology, since this feature extraction technique is found to be robust against noise and non-stationarity in the signals. The work consists of two sets of experiments on a lathe: set 1 and set 2. The set 1 experiments study the influence of tool wear on the RQA variables, whereas set 2 is carried out to identify the RQA variables sensitive to machine tool chatter, followed by validation in actual cutting. To obtain the bounds of the spectrum of the significant RQA variable values in set 1, a fresh tool and a worn tool are used for cutting. The first part of the set 2 experiments uses a stepped shaft in order to create chatter at a known location, and the second part uses a conical section with a uniform taper along the axis so that chatter onsets at some distance from the smaller end as the depth of cut is gradually increased while the spindle speed and feed rate are kept constant. The study concludes by revealing the unambiguous dependence of certain RQA variables (percent determinism, percent recurrence, and entropy) on tool wear and chatter. The results establish this methodology as viable for the detection of tool wear and chatter in metal cutting on a lathe. The key reason is that the dynamics of the system under study is nonlinear, and recurrence quantification analysis can characterize it adequately. This work establishes that the principles and practice of machining can benefit considerably from nonlinear dynamics and chaos theory.
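A minimal sketch of recurrence quantification analysis on a one-dimensional signal (illustrative embedding parameters and threshold, not the thesis implementation); a periodic signal, standing in for stable cutting, scores higher percent determinism than noise:

```python
import numpy as np

def rqa_measures(x, dim=3, delay=2, eps=0.2, lmin=2):
    """Return recurrence rate, determinism and diagonal-line entropy."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
    # Recurrence matrix: 1 where embedded points lie within eps (max norm).
    dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
    rec = dist < eps

    # Lengths of diagonal lines above the main diagonal (the plot is symmetric).
    lengths = []
    for k in range(1, n):
        run = 0
        for v in list(np.diagonal(rec, offset=k)) + [False]:
            if v:
                run += 1
            elif run:
                lengths.append(run)
                run = 0
    lengths = np.asarray(lengths, dtype=int)
    long_lines = lengths[lengths >= lmin]

    recurrence_rate = rec.mean()
    determinism = long_lines.sum() / lengths.sum() if lengths.sum() else 0.0
    counts = np.bincount(long_lines)[lmin:] if long_lines.size else np.array([1])
    p = counts[counts > 0] / counts.sum()
    entropy = float(-(p * np.log(p)).sum())
    return recurrence_rate, determinism, entropy

# A periodic signal (standing in for stable cutting) versus white noise.
t = np.linspace(0, 20 * np.pi, 600)
signals = {"periodic": np.sin(t),
           "noise": np.random.default_rng(3).normal(size=600)}
for name, sig in signals.items():
    rr, det, ent = rqa_measures(sig)
    print(f"{name:8s}  %REC={rr:.3f}  %DET={det:.3f}  ENT={ent:.3f}")
```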
Abstract:
Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. This could significantly improve software quality and is still a challenging field. This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making the early detection of software bugs that are otherwise hard to detect more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, thereby improving both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs.
Incorrect sequences of machine code patterns are identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum data allocation to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active-memory-bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly from machine code patterns, which drastically reduces state space creation and contributes to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features in developing embedded systems.
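As a toy illustration of the redundant bank-switching detection idea (not the dissertation's tool), the sketch below walks a straight-line instruction sequence, tracks the active memory bank implied by writes to the RP0/RP1 bits of STATUS on a PIC16F87X, and flags bank selections that leave the state unchanged; the instruction listing is hypothetical:

```python
def find_redundant_bank_switches(instructions):
    """Indices of bank-select instructions that do not change the active bank."""
    bank_bits = {"RP0": None, "RP1": None}   # bank-select bits, unknown at entry
    redundant = []
    for idx, instr in enumerate(instructions):
        parts = instr.replace(",", " ").split()
        if len(parts) == 3 and parts[1] == "STATUS" and parts[2] in bank_bits:
            op, bit = parts[0].upper(), parts[2]
            new_value = {"BSF": 1, "BCF": 0}.get(op)
            if new_value is None:
                continue
            if bank_bits[bit] == new_value:
                redundant.append(idx)        # active bank unchanged: redundant
            bank_bits[bit] = new_value
    return redundant

program = [
    "BCF STATUS, RP0",   # select bank 0
    "CLRF PORTB",
    "BSF STATUS, RP0",   # select bank 1
    "MOVWF TRISB",
    "BSF STATUS, RP0",   # redundant: bank 1 already active
    "MOVWF OPTION_REG",
]
print("redundant bank switches at instruction indices:",
      find_redundant_bank_switches(program))
```

The dissertation's approach additionally reasons over all paths of the control flow graph and over data allocation; this straight-line version only shows the state-tracking core of the redundancy check.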
Abstract:
Learning Disability (LD) is a neurological condition that affects a child’s brain and impairs the ability to carry out one or many specific tasks. LD affects about 15% of children enrolled in schools. The prediction of LD is a vital and intricate job. The aim of this paper is to design an effective and powerful tool, using two intelligent methods, viz. Artificial Neural Network and Adaptive Neuro-Fuzzy Inference System, for measuring the percentage of LD affecting school-age children. In this study, we propose some soft computing methods for data preprocessing to improve the accuracy of the tool as well as the classifier. The data preprocessing is performed through Principal Component Analysis for attribute reduction, and the closest fit algorithm is used for imputing missing values. The main idea in developing the LD prediction tool is not only to predict the LD present in children but also to measure its percentage along with its class (low, minor, or major). The system is implemented in MathWorks MATLAB 7.10. The results obtained from this study illustrate that the designed prediction system is capable of measuring LD effectively.
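A minimal sketch of this kind of preprocessing-plus-classifier pipeline, written in Python with scikit-learn rather than the MATLAB environment used in the paper, with synthetic features, a nearest-neighbour fill standing in for the closest fit algorithm, and a simplified binary LD label instead of the study's percentage and class outputs:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.impute import KNNImputer
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 16))                 # hypothetical checklist attributes
y = (X[:, :4].sum(axis=1) > 0).astype(int)     # hypothetical LD / no-LD label

# Stand-in for the "closest fit" imputation: punch a few holes in the data and
# fill each from the most similar complete row (1-nearest-neighbour fill).
X[rng.integers(0, 300, 20), rng.integers(0, 16, 20)] = np.nan
X = KNNImputer(n_neighbors=1).fit_transform(X)

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=6),                       # attribute reduction
    MLPClassifier(hidden_layer_sizes=(12,), max_iter=2000, random_state=0),
)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model.fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
```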