974 results for High-Order Accurate Scheme
Abstract:
This pilot study developed a climate instrument which was administered in a sample of high schools in one board of education. Several tests were conducted in order to determine the reliability and internal consistency of the instrument. The ability of the instrument to identify the demographic differences of school and gender was also tested. The relationship between leadership styles and an effective use of authority in creating a productive and rewarding work environment was the focus of this study. Attitudes to leadership and perceived school morale were investigated in a demographic study, a climate survey, as well as a body of related literature. In light of the empirical research, an attempt was made to determine the extent to which the authority figure's behaviour and adopted leadership style contributed to a positive school climate: one in which teachers were motivated to achieve to the best of their abilities by way of their commitment and service. The tone of authority assumed by the leader not only shapes the mood of the school environment but ultimately determines the efficiency and morale of the teaching staff.
Abstract:
Traditional psychometric theory and practice classify people according to broad ability dimensions but do not examine how these mental processes occur. Hunt and Lansman (1975) proposed a 'distributed memory' model of cognitive processes with emphasis on how to describe individual differences, based on the assumption that each individual possesses the same components. It is in the quality of these components that individual differences arise. Carroll (1974) expands Hunt's model to include a production system (after Newell and Simon, 1973) and a response system. He developed a framework of factor analytic (FA) factors for the purpose of describing how individual differences may arise from them. This scheme is to be used in the analysis of psychometric tests. Recent advances in the field of information processing are examined and include: 1) Hunt's development of differences between subjects designated as high or low verbal, 2) Miller's pursuit of the magic number seven, plus or minus two, 3) Ferguson's examination of transfer and abilities and, 4) Brown's discoveries concerning strategy teaching and retardates. In order to examine possible sources of individual differences arising from cognitive tasks, traditional psychometric tests were searched for a suitable perceptual task which could be varied slightly and administered to gauge learning effects produced by controlling independent variables. It also had to be suitable for analysis using Carroll's framework. The Coding Task (a symbol substitution test) found in the Performance Scale of the WISC was chosen. Two experiments were devised to test the following hypotheses: 1) High verbals should be able to complete significantly more items on the Symbol Substitution Task than low verbals (Hunt, Lansman, 1975). 2) Having previous practice on a task, where strategies involved in the task may be identified, increases the amount of output on a similar task (Carroll, 1974). 3) There should be a substantial decrease in the amount of output as the load on STM is increased (Miller, 1956). 4) Repeated measures should produce an increase in output over trials and, where individual differences in previously acquired abilities are involved, these should differentiate individuals over trials (Ferguson, 1956). 5) Teaching slow learners a rehearsal strategy would improve their learning such that their learning would resemble that of normals on the same task (Brown, 1974). In the first experiment 60 subjects were divided into high and low verbal groups, each further divided randomly into a practice group and a nonpractice group. Five subjects in each group were assigned randomly to work on a five-, seven- or nine-digit code throughout the experiment. The practice group was given three trials of two minutes each on the practice code (designed to eliminate transfer effects due to symbol similarity) and then three trials of two minutes each on the actual SST task. The nonpractice group was given three trials of two minutes each on the same actual SST task. Results were analyzed using a four-way analysis of variance. In the second experiment 18 slow learners were divided randomly into two groups: one group receiving planned strategy practice, the other receiving random practice. Both groups worked on the actual code to be used later in the actual task. Within each group subjects were randomly assigned to work on a five-, seven- or nine-digit code throughout. Both practice and actual tests consisted of three trials of two minutes each.
Results were analyzed using a three-way analysis of variance. It was found in the first experiment that 1) high or low verbal ability by itself did not produce significantly different results; however, when in interaction with the other independent variables, a difference in performance was noted. 2) The previous practice variable was significant over all segments of the experiment: those who received previous practice were able to score significantly higher than those without it. 3) Increasing the size of the load on STM severely restricts performance. 4) The effect of repeated trials proved to be beneficial; generally, gains were made on each successive trial within each group. 5) In the second experiment, slow learners who were allowed to practice randomly performed better on the actual task than subjects who were taught the code by means of a planned strategy. Upon analysis using the Carroll scheme, individual differences were noted in the ability to develop strategies of storing, searching and retrieving items from STM, and in adopting necessary rehearsals for retention in STM. While these strategies may benefit some, it was found that for others they may be harmful. Temporal aspects and perceptual speed were also found to be sources of variance within individuals. Generally it was found that the largest single factor influencing learning on this task was the repeated measures. What enables gains to be made varies with individuals. Environmental factors, specific abilities, strategy development, previous learning, amount of load on STM, and perceptual and temporal parameters all influence learning, and these have serious implications for educational programs.
Abstract:
Hepatocellular carcinoma (HCC) is a major healthcare problem, representing the third most common cause of cancer-related mortality worldwide. Chronic infections with Hepatitis B virus (HBV) and/or Hepatitis C virus (HCV) are the major risk factors for the development of HCC. The incidence of HBV-associated HCC is in decline as a result of an effective HBV vaccine; however, since an equally effective HCV vaccine has not yet been developed, there are 130 million HCV-infected patients worldwide who are at high risk of developing HCC. Because reliable parameters and/or tools for the early detection of HCC among high-risk individuals are severely lacking, HCC patients are always diagnosed at a late stage, when surgical solutions or effective treatment are no longer possible. Using urine as a non-invasive sample source, two different approaches (proteomic-based and genomic-based) were pursued with the common goal of discovering potential biomarker candidates for the early detection of HCC among high-risk chronic HCV-infected patients. Urine was collected from 106 HCV-infected Egyptian patients, 32 of whom had already developed HCC and 74 of whom were diagnosed as HCC-free at the time of initial sample collection. In addition to these patients, urine samples were also collected from 12 healthy control individuals. Total urinary proteins, trans-renal nucleic acid (Tr-NA) and microRNA (miRNA) were isolated from urine using novel methodologies and silicon carbide-loaded spin columns. In the first, "proteomic-based", approach, liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) was used to identify potential candidates from pooled urine samples. This was followed by validating the relative expression levels of proteins present in urine among all the patients using quantitative real-time PCR (qRT-PCR). This approach revealed that significant over-expression of three proteins, DJ-1, Chromatin Assembly Factor-1 (CAF-1) and Heat Shock Protein 60 (HSP60), was a characteristic event among HCC-post-HCV-infection patients. As a single HCC biomarker, CAF-1 over-expression identified HCC among HCV-infected patients with a specificity of 90%, a sensitivity of 66% and an overall diagnostic accuracy of 78%. Moreover, the CAF-1/HSP60 tandem identified HCC among HCV-infected patients with a specificity of 92%, a sensitivity of 61% and an overall diagnostic accuracy of 77%. In the second, "genomic-based", approach, two different strategies were pursued. The first was the miRNA-based approach: the expression levels of miRNAs isolated from urine were studied using the Illumina MicroRNA Expression Profiling Assay, followed by qRT-PCR-based validation of the deregulated expression of identified miRNA candidates among all the patients. This approach shed light on the deregulated expression of a number of miRNAs which may have a role either in the development of HCC among HCV-infected patients (i.e. miR-640, miR-765, miR-200a, miR-521 and miR-520) or in allowing a better understanding of the viral-host interaction (miR-152, miR-486, miR-219, miR-452, miR-425, miR-154 and miR-31). Moreover, the deregulated expression of both miR-618 and miR-650 appeared to be a common event among HCC-post-HCV-infection patients.
The search for putative targets of these two miRNAs suggested that miR-618 may be a potent oncogene, as it targets the tumor-suppressor gene low-density lipoprotein receptor-related protein 12 (LRP12), while miR-650 may be a potent tumor suppressor, as it is believed to downregulate the TNF receptor-associated factor-4 (TRAF4) oncogene. The specificity of the miR-618 and miR-650 deregulated expression patterns for the early detection of HCC among HCV-infected patients was 68% and 58%, respectively, whereas the sensitivity was 64% and 72%, respectively. When the deregulated expression of both miRNAs was combined as a tandem biomarker, the specificity and the sensitivity were 75% and 58%, respectively. In the second, "trans-renal nucleic acid-based", approach, urinary apoptotic nucleic acid (uaNA) levels of 70 ng/mL or more were found to be a good predictor of HCC among chronic HCV-infected patients. The specificity and the sensitivity of this diagnostic approach were 76% and 86%, respectively, with an overall diagnostic value of 81%. The uaNA levels positively correlated with HCC disease progression, as monitored by epigenetic changes of a panel of eight tumor-suppressor genes (TSGs) using methylation-sensitive PCR. Moreover, the pairing of high uaNA levels (≥ 70 ng/mL) with CAF-1 over-expression produced a highly specific (100%) multiple-marker HCC biomarker with an acceptable sensitivity of 64% and a diagnostic accuracy of 82%. In comparison to the previous pairing, uaNA levels (≥ 70 ng/mL) in tandem with HSP60 over-expression were less specific (89%) but highly sensitive (72%), resulting in a diagnostic accuracy of 64%. The specificities of miR-650 deregulated expression in combination with either high uaNA content or HSP60 over-expression were 82% and 79%, respectively, whereas the sensitivities of these combinations were 64% and 58%, respectively. The potential biomarkers identified in this study compare favorably with the diagnostic accuracy of the α-fetoprotein level test, which has a specificity of 75%, a sensitivity of 68% and an overall diagnostic accuracy of 70%. Here we present an intriguing study which shows the significance of using urine as a non-invasive sample source for the identification of promising HCC biomarkers. We have also introduced new techniques for the isolation of different urinary macromolecules, especially miRNA, from urine. Furthermore, we strongly recommend the potential biomarkers identified in this study as focal points of any future research on HCC diagnosis. A larger testing pool will determine whether their use is practical for mass population screening. This explorative study identified potential targets that merit further investigation for the development of diagnostically accurate biomarkers isolated from 1-2 mL urine samples acquired in a non-invasive manner.
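[Editorial note on the reported figures, as a reading aid rather than a statement from the abstract: the quoted "overall diagnostic accuracy" values appear consistent with the balanced accuracy, the simple average of sensitivity and specificity,

\[ \mathrm{Sens} = \frac{TP}{TP+FN}, \qquad \mathrm{Spec} = \frac{TN}{TN+FP}, \qquad \mathrm{Acc_{bal}} = \frac{\mathrm{Sens}+\mathrm{Spec}}{2}, \]

e.g. for CAF-1 alone, (0.66 + 0.90)/2 = 0.78, matching the reported 78%, and for uaNA, (0.86 + 0.76)/2 = 0.81, matching the reported 81%.]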
Abstract:
Dynamic logic is an extension of modal logic originally intended for reasoning about computer programs. The method of proving correctness of properties of a computer program using the well-known Hoare logic can be implemented by utilizing the robustness of dynamic logic. For a very broad range of languages and applications in program verification, a theorem prover named KIV (Karlsruhe Interactive Verifier) has already been developed, but its high degree of automation and its complexity make it difficult to use for educational purposes. My research work is motivated towards the design and implementation of a similar interactive theorem prover with educational use as its main design criterion. As the key purpose of this system is to serve as an educational tool, it is a self-explanatory system that explains every step of creating a derivation, i.e., proving a theorem. This deductive system is implemented in the platform-independent programming language Java. In addition, a very popular combination of a lexical analyzer generator, JFlex, and the parser generator BYacc/J has been used for parsing formulas and programs.
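[For orientation, a textbook illustration of the Hoare logic/dynamic logic connection the abstract relies on, not material from the thesis itself: partial correctness of a program α is expressible as a single dynamic logic formula, and the box modality decomposes along program structure,

\[ \{\psi\}\,\alpha\,\{\varphi\} \ \equiv\ \psi \rightarrow [\alpha]\varphi, \qquad [\alpha;\beta]\varphi \leftrightarrow [\alpha][\beta]\varphi, \]
\[ [\mathbf{if}\ \epsilon\ \mathbf{then}\ \alpha\ \mathbf{else}\ \beta]\varphi \ \leftrightarrow\ (\epsilon \rightarrow [\alpha]\varphi) \land (\lnot\epsilon \rightarrow [\beta]\varphi), \]

so a derivation proceeds by applying such axioms step by step, which is exactly the kind of stepwise explanation an educational prover can surface.]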
Abstract:
Globally, prostate cancer (PCa) is the most frequently occurring non-cutaneous cancer, and is the second highest cause of cancer mortality in men. Serum prostate-specific antigen (PSA) has been the standard in PCa screening since its approval by the American Food & Drug Administration (FDA) in 1994. Currently, PSA is used as an indicator for PCa: patients with a serum PSA level above 4 ng/mL will often undergo prostate biopsy to confirm cancer. Unfortunately, fewer than ~30% of these men will biopsy positive for cancer, meaning that the majority of men undergo invasive biopsy with little benefit. Despite PSA's notoriously poor specificity (33%), there is still a significant lack of credible alternatives. An ideal biomarker that can specifically detect PCa at an early stage is therefore urgently required. The aim of this study was to investigate the potential of using the deregulation of urinary proteins to detect prostate cancer (PCa) among benign prostatic hyperplasia (BPH) patients. To identify the protein signatures specific for PCa, protein expression profiling of 8 PCa patients, 12 BPH patients and 10 healthy males was carried out using LC-MS/MS. This was followed by validating the relative expression levels of proteins present in urine among all the patients using quantitative real-time PCR. This approach revealed that significant down-regulation of Fibronectin and TP53INP2 was a characteristic event among PCa patients. Fibronectin mRNA down-regulation was identified as offering improved specificity (50%) over PSA, albeit with a slightly lower although still acceptable sensitivity (75%) for detecting PCa. TP53INP2 down-regulation, on the other hand, was moderately sensitive (75%), identifying many patients with PCa, but was almost entirely non-specific (7%), designating many of the benign samples as malignant and being unable to accurately identify more than one negative.
Abstract:
MicroRNAs (miRNAs) are a class of short (~22 nt), single-stranded RNA molecules that function as post-transcriptional regulators of gene expression. MiRNAs can regulate a variety of important biological pathways, including cellular proliferation, differentiation and apoptosis. Profiling of miRNA expression patterns was shown to be more useful than the equivalent mRNA profiles for characterizing poorly differentiated tumours. As such, miRNA expression "signatures" are expected to offer serious potential for diagnosing and prognosing cancers of any provenance. The aim of this study was to investigate the potential of using the deregulation of urinary miRNAs to detect prostate cancer (PCa) among benign prostatic hyperplasia (BPH) patients. To identify the miRNA signatures specific for PCa, miRNA expression profiling of 8 PCa patients, 12 BPH patients and 10 healthy males was carried out using whole-genome expression profiling. Differential expression of two individual miRNAs between healthy males and BPH patients was detected, and these miRNAs were found to possibly target genes related to PCa development and progression. The sensitivity and specificity of miR-1825 for detecting PCa among BPH individuals were found to be 60% and 69%, respectively, whereas the sensitivity and specificity of miR-484 were 80% and 19%, respectively. Additionally, the sensitivity and specificity for miR-1825/484 in tandem were 45% and 75%, respectively. The proposed PCa miRNA signatures may therefore be of great value for the accurate diagnosis of PCa and BPH. This exploratory study has identified several possible targets that merit further investigation towards the development and validation of diagnostically useful, non-invasive, urine-based tests that might not only help diagnose PCa but also possibly help differentiate it from BPH.
Abstract:
The KCube interconnection network was first introduced in 2010 in order to exploit the good characteristics of two well-known interconnection networks, the hypercube and the Kautz graph. KCube links up multiple processors in a communication network with high density for a fixed degree. Since the KCube network is newly proposed, much study is required to demonstrate its potential properties and to design algorithms that solve parallel computation problems on it. In this thesis we introduce a new methodology to construct the KCube graph. With regard to this new approach, we also prove the Hamiltonicity of the general KC(m, k). Moreover, we determine its connectivity, followed by an optimal broadcasting scheme in which a source node containing a message communicates it to all other processors. In addition to KCube networks, we have studied a version of the routing problem in the traditional hypercube, investigating whether there exists a shortest path in a Q_n between the two nodes 0^n and 1^n when the network is experiencing failed components. We first discuss this problem conditionally, under a constraint on the number of faulty nodes, and subsequently introduce an algorithm to tackle the problem without restrictions on the number of faulty nodes.
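[To make the hypercube routing problem concrete, here is a minimal baseline sketch, not the thesis's algorithm: BFS from 0^n to 1^n in Q_n while avoiding a given set of faulty nodes. Node encoding and the fault set are illustrative.]

```python
from collections import deque

def hypercube_shortest_path(n, faulty):
    """BFS from 0^n to 1^n in the hypercube Q_n, avoiding faulty nodes.
    Nodes are n-bit integers; neighbors differ in exactly one bit."""
    source, target = 0, (1 << n) - 1            # 0^n and 1^n
    if source in faulty or target in faulty:
        return None
    parent = {source: None}                     # also serves as visited set
    queue = deque([source])
    while queue:
        u = queue.popleft()
        if u == target:                         # reconstruct the path
            path = []
            while u is not None:
                path.append(u)
                u = parent[u]
            return path[::-1]
        for i in range(n):                      # flip each bit in turn
            v = u ^ (1 << i)
            if v not in parent and v not in faulty:
                parent[v] = u
                queue.append(v)
    return None                                 # disconnected by faults

# A fault-free Q_4 has shortest 0^n-to-1^n paths of length n = 4; here a
# path of that length still survives the two (illustrative) faulty nodes.
print(hypercube_shortest_path(4, faulty={0b0001, 0b1000}))
```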
Abstract:
The employment of the bridging/chelating Schiff bases, N-salicylidene-4-methyl-o-aminophenol (samphH2) and N-naphthalidene-2-amino-5-chlorobenzoic acid (nacbH2), in nickel cluster chemistry has afforded eight polynuclear Ni(II) complexes with new structural motifs, interesting magnetic and optical properties, and unexpected organic ligand transformations. In the present thesis, Chapter 1 deals with all the fundamental aspects of polynuclear metal complexes, molecular magnetism and optics, while research results are reported in Chapters 2 and 3. In the first project (Chapter 2), I investigated the coordination chemistry of the organic chelating/bridging ligand N-salicylidene-4-methyl-o-aminophenol (samphH2). The general Ni(II)/tBuCO2-/samphH2 reaction system afforded two new tetranuclear Ni(II) clusters, namely [Ni4(samph)4(EtOH)4] (1) and [Ni4(samph)4(DMF)2] (2), with different structural motifs. Complex 1 possessed a cubane core, while in complex 2 the four Ni(II) ions were located at the four vertices of a defective dicubane. The nature of the organic solvent was found to be of pivotal importance, leading to compounds with the same nuclearity but different structural topologies and magnetic properties. The second project, the results of which are summarized in Chapter 3, comprised the systematic study of a new optically-active Schiff base ligand, N-naphthalidene-2-amino-5-chlorobenzoic acid (nacbH2), in Ni(II) cluster chemistry. Various reactions between NiX2 (X- = inorganic anions) and nacbH2 were performed under basic conditions to yield six new polynuclear Ni(II) complexes, namely (NHEt3)[Ni12(nacb)12(H2O)4](ClO4) (3), (NHEt3)2[Ni5(nacb)4(L)(LH)2(MeOH)] (4), [Ni5(OH)2(nacb)4(DMF)4] (5), [Ni5(OMe)Cl(nacb)4(MeOH)3(MeCN)] (6), (NHEt3)2[Ni6(OH)2(nacb)6(H2O)4] (7), and [Ni6(nacb)6(H2O)3(MeOH)6] (8). The nature of the solvent, the inorganic anion X-, and the organic base were all found to be of critical importance, leading to products with different structural topologies and nuclearities (i.e., {Ni5}, {Ni6} and {Ni12}). Magnetic studies on all synthesized complexes revealed an overall ferromagnetic behavior for complexes 4 and 8, with the remaining complexes being dominated by antiferromagnetic exchange interactions. In order to assess the optical efficiency of the organic ligand when bound to the metal centers, photoluminescence studies were performed on all synthesized compounds. Complexes 4 and 5 show strong emission in the visible region of the electromagnetic spectrum. Finally, the ligand nacbH2 allowed some unexpected organic transformations to occur; for instance, the pentanuclear compound 5 comprises both nacb2- groups and a new organic chelate, namely the anion of 5-chloro-2-[(3-hydroxy-4-oxo-1,4-dihydronaphthalen-1-yl)amino]benzoic acid. In the last section of this thesis, an attempt was made to compare the Ni(II) cluster chemistry of the N-naphthalidene-2-amino-5-chlorobenzoic acid ligand with that of the structurally similar but less bulky N-salicylidene-2-amino-5-chlorobenzoic acid (sacbH2).
Abstract:
This note develops general model-free adjustment procedures for the calculation of unbiased volatility loss functions based on practically feasible realized volatility benchmarks. The procedures, which exploit the recent asymptotic distributional results in Barndorff-Nielsen and Shephard (2002a), are both easy to implement and highly accurate in empirically realistic situations. On properly accounting for the measurement errors in the volatility forecast evaluations reported in Andersen, Bollerslev, Diebold and Labys (2003), the adjustments result in markedly higher estimates for the true degree of return-volatility predictability.
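[For orientation, a minimal sketch of the distributional result being exploited, the standard Barndorff-Nielsen and Shephard (2002) asymptotics paraphrased rather than quoted from the note: with m intraday returns r_{t,j} on day t,

\[ RV_t = \sum_{j=1}^{m} r_{t,j}^2, \qquad \frac{RV_t - \int_{t-1}^{t} \sigma^2(s)\,ds}{\sqrt{\tfrac{2}{3}\sum_{j=1}^{m} r_{t,j}^4}} \ \xrightarrow{\ d\ }\ N(0,1) \quad \text{as } m \to \infty, \]

so the measurement error in realized volatility relative to the latent integrated variance has an estimable variance, which is what makes a model-free, unbiased adjustment of volatility loss functions feasible.]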
Abstract:
Affiliation: Département de Biochimie, Université de Montréal
Abstract:
The Inuit communities of Hudson Bay in Nunavik (Québec) stand apart from other Indigenous communities through their reappropriation of childbirth since 1986 and the creation of a training program for local midwives. This has put an end to a long period during which women were transferred to give birth in hospital, kilometres away from their villages. Moreover, the program aims to reintegrate traditional practices within modern obstetrics in order to offer women high-quality, culturally appropriate services. The goal of our study was to establish whether the Nunavik Indigenous midwifery training program has succeeded in reconciling these two different teaching approaches: one centred on traditional knowledge, the other on the quality-of-care standards to be met. A qualitative research method was adopted, and data were collected through interviews with five Inuit midwives and six student midwives of the Nunavik training program, in the three Hudson Bay villages equipped with birthing centres. The qualitative analysis of the data does not confirm the reintegration of traditional knowledge into the practice of Indigenous midwives. Indeed, the results reveal that few traditional practices related to the perinatal period (notably the use of plants or medicinal remedies, birthing postures, obstetrical manoeuvres, etc.) are known and/or used by them. Beliefs or codes of conduct to be respected during pregnancy appear to be better transmitted, but no longer command unanimity within the communities. As for modern obstetrics, the training program appears to meet current Western requirements, having been recognized by the Ordre des sages-femmes du Québec since September 2008. Moreover, the midwives and students are aware of the need to receive high-quality training. They would like greater rigour in the theoretical teaching and better continuity in the learning process. The difficulty in combining the teaching of these two bodies of knowledge (traditional and modern) therefore seems to lie chiefly with traditional knowledge. The Inuit midwives and students wish to protect and promote their cultural heritage, but more as a matter of community responsibility than within a training program. Collaboration between the communities' wishes concerning the reintegration of this heritage and the current reality of biomedicine remains essential to continue guaranteeing the safety and quality of the services provided.
Abstract:
This thesis concerns the improvement of high-contrast imaging techniques enabling the direct detection of companions at small separations from their host star. More specifically, it is part of the development of the Gemini Planet Imager (GPI), a second-generation instrument for the Gemini telescopes. This camera will use an integral field spectrometer (IFS) to characterize detected companions and to reduce the speckle noise that limits their detection, and will correct atmospheric turbulence to a level never before achieved by using two deformable mirrors in its adaptive optics (AO) system: the woofer and the tweeter. The woofer will correct low-spatial-frequency, large-amplitude aberrations, while the tweeter will compensate higher-frequency aberrations of smaller amplitude. First, the performance achievable with the IFSs currently operating on 8-10 m telescopes is investigated by observing the companion of the star GQ Lup with the NIFS IFS and the ALTAIR AO system installed on the Gemini North telescope. The angular differential imaging (ADI) technique is used to attenuate the speckle noise by a factor of 2 to 6. The spectra obtained in the J, H and K bands were used to constrain the companion's mass, by comparison with the predictions of atmospheric and evolutionary models, to 8-60 M_Jup, where M_Jup is the mass of Jupiter. It is thus determined that the companion is more likely a brown dwarf than a planet. Since the IFSs currently in operation are general-purpose cameras usable across many areas of astrophysics, their design was not optimized for high-contrast imaging. The second stage of this thesis therefore consisted of designing and laboratory-testing an IFS prototype optimized for this task. Four speckle-suppression algorithms were tested on the data obtained: the simple difference, the double difference, spectral deconvolution, and a new algorithm developed within this thesis, dubbed the twin-spectra algorithm. We find that the twin-spectra algorithm performs best for the two types of companions tested, methanated and non-methanated. The signal-to-noise ratio of the detection was improved by a factor of up to 14 for a methanated companion and by a factor of 2 for a non-methanated companion. Finally, we address certain problems related to splitting the command between the two deformable mirrors in GPI's AO system. We first present a method using analytical calculations and Monte Carlo simulations to determine the key woofer parameters, such as its diameter, its number of actuators and their stroke, which in turn had repercussions on the overall design of the instrument. Then, since the system studied uses a Fourier reconstructor, we propose splitting the command between the two mirrors in Fourier space and limiting the modes transferred to the woofer to those it can accurately reproduce. In the context of GPI, this makes it possible to replace the two 1600×69 matrices required for a "classical" command split with a single 45×69 matrix, and thus to use an off-the-shelf processor rather than a more complex computing architecture.
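[A toy sketch of the Fourier-space woofer/tweeter idea, not GPI's implementation: split a phase command at an illustrative spatial-frequency cutoff, sending low frequencies to the woofer and the remainder to the tweeter. Grid size and cutoff are invented for the example.]

```python
import numpy as np

def split_command_fourier(phase, cutoff):
    """Split a phase command between woofer and tweeter in Fourier space:
    spatial frequencies at or below `cutoff` cycles/aperture go to the
    woofer, the rest to the tweeter. Illustrative only."""
    ft = np.fft.fft2(phase)
    n = phase.shape[0]
    f = np.fft.fftfreq(n, d=1.0 / n)            # integer cycles per aperture
    fx, fy = np.meshgrid(f, f, indexing="ij")
    low = (np.abs(fx) <= cutoff) & (np.abs(fy) <= cutoff)
    woofer = np.fft.ifft2(np.where(low, ft, 0)).real
    tweeter = np.fft.ifft2(np.where(low, 0, ft)).real
    return woofer, tweeter                      # phase = woofer + tweeter

rng = np.random.default_rng(0)
phase = rng.standard_normal((48, 48))           # fake wavefront command
w, t = split_command_fourier(phase, cutoff=3)
assert np.allclose(w + t, phase)                # the split is exact
```

Restricting the woofer to the low-order modes it can faithfully reproduce is what collapses the command-split matrices to the small size quoted above.]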
Abstract:
This thesis aims to survey the advantages and disadvantages of using the dynamic functional programming language Scheme for video game development. The method used is first based on a more theoretical approach: a study of the programming needs expressed by this type of development, together with a description detailing the Scheme language features relevant to video game development, is given to put the subject in context. A practical approach is then taken by developing two video games of increasing complexity: Space Invaders and Lode Runner. The development of these games led to extending the Scheme language with several domain-specific languages and libraries, notably an object-oriented programming system and a coroutine system. The experience gained from developing these games is finally compared with that of other video game developers in industry who have used Scheme to create commercial titles. In summary, using this language made it possible to reach a high level of abstraction that favours the modularity of the games developed, without affecting their performance.
Abstract:
In recent years, research in the field of wireless mesh networks (WMNs) has attracted great interest from the telecommunications research community. This is due to the many advantages that WMN technology offers, such as easy and inexpensive deployment, reliable connectivity and flexible interoperability with other existing networks (Wi-Fi networks, WiMAX networks, cellular networks, sensor networks, etc.). However, several problems remain to be solved, such as scalability, security, quality of service (QoS), resource management, etc. These problems persist for WMNs, all the more so as the number of users keeps growing; existing protocols must therefore be improved, or new ones designed. The objective of our research is to resolve some of the limitations currently encountered in WMNs and to improve the QoS of real-time multimedia applications (for example, voice). The research work of this thesis is essentially divided into three main parts: traffic admission control, traffic differentiation, and adaptive channel reassignment in the presence of handoff traffic. In the first part, we propose a distributed admission control mechanism based on the concept of cliques (a clique corresponds to a subset of logical links that interfere with one another) in a multi-hop, multi-radio, multi-channel network, called RCAC. In particular, we propose an analytical model that computes the appropriate traffic admission ratio and guarantees that the packet-loss probability in the network does not exceed a predefined threshold. The RCAC mechanism ensures the required QoS for incoming flows without degrading the QoS of existing flows; it also ensures QoS in terms of end-to-end delay for the various flows. The second part deals with service differentiation in the IEEE 802.11s protocol to allow better QoS, notably for applications with timing constraints (for example, voice and videoconferencing). In this regard, we propose a mechanism for adjusting time slots according to the service class, ED-MDA (Enhanced Differentiated-Mesh Deterministic Access), combined with an efficient admission control algorithm, EAC (Efficient Admission Control), to allow high and efficient resource utilization. The EAC mechanism takes handoff traffic into account and gives it higher priority over new traffic in order to minimize interruptions of ongoing communications. In the third part, we focus on minimizing the overhead and delay of re-routing mobile users and/or multimedia applications by reassigning channels in multi-radio WMNs (MR-WMNs). First, we propose an optimization model that maximizes throughput, improves fairness among users and minimizes the overhead due to call handoff. This model was solved with the CPLEX software for a limited number of nodes. Second, we develop centralized heuristics/meta-heuristics to solve this model for networks of realistic size. Finally, we propose an algorithm to reassign channels to interfaces in real time and in a conservative manner. This algorithm aims to minimize the overhead and delay of re-routing, especially for the dynamic traffic generated by handoff calls. This mechanism is then improved by taking load balancing among cliques into account.
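[To make the clique idea concrete, here is a heavily simplified admission check, not the RCAC analytical model itself: a flow is admitted only if no interference clique would be pushed past its capacity. Topology, utilizations and the unit capacity are all invented for the example.]

```python
def admit_flow(cliques, usage, demand, flow_links, capacity=1.0):
    """Clique-based admission check: admit a new flow only if, in every
    interference clique, the current utilization of the clique's links
    plus the load the flow adds on those links stays within capacity."""
    for clique in cliques:
        load = sum(usage.get(link, 0.0) for link in clique)
        added = demand * len(clique & flow_links)
        if load + added > capacity:
            return False     # this clique would be saturated: reject
    return True              # no clique saturated: admit

# Hypothetical topology: two interference cliques over links a-d; the new
# flow (utilization 0.2 per link) would cross links a and b.
cliques = [{"a", "b", "c"}, {"b", "d"}]
usage = {"a": 0.3, "b": 0.2, "c": 0.2, "d": 0.5}
print(admit_flow(cliques, usage, demand=0.2, flow_links={"a", "b"}))  # False
```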
Abstract:
Alan García, the current president of Peru, is one of the most controversial politicians in Peruvian history. The success of his career as a candidate stands in stark contrast to the catastrophic results of his first presidential administration. In popular culture, García's discursive skills, together with the contrast between his success and his poor performance as president, have elevated him to the status of myth. This research presents a linguistic-pragmatic analysis of the discursive strategies used by President García in his second term (2001-2006). The analysis centres on the relationship established by Steven Pinker (2007) between positive politeness and communal solidarity. The work of Brown and Levinson (1978, 1987) and of Alan Fiske (1991) forms our theoretical basis. The social exclusion of part of the Peruvian electorate, from the standpoint of Vergara (2007), is the key element for better understanding the success of García's discursive strategy. Vergara presents a multi-variable diachronic analysis of the Peruvian political situation to explain the rationality of the Peruvian electorate. From this theoretical framework, we proceed to a lexicometric analysis that allows us to identify the discursive strategies used in the corpus of García's speeches chosen for analysis. Following Pinker's scheme, the data obtained are classified according to Brown and Levinson's definition of positive politeness. Finally, we assess the relationship between the classified results and Fiske's model of communal solidarity. The objective is to show that García's discursive style is structured by a rationality whose purpose is to close the social gap between the politician and the electorate.