792 results for computer-based technology
Abstract:
PURPOSE: Mutations in IDH3B, which encodes an enzyme of the Krebs cycle, have recently been found to cause autosomal recessive retinitis pigmentosa (arRP). The MDH1 gene maps within the RP28 arRP linkage interval and encodes cytoplasmic malate dehydrogenase, an enzyme functionally related to IDH3B. As a proof of concept for candidate-gene screening to be routinely performed by ultra-high-throughput sequencing (UHTS), we analyzed MDH1 in a patient from each of the two families described so far to show linkage between arRP and RP28. METHODS: With genomic long-range PCR, we amplified all introns and exons of the MDH1 gene (23.4 kb). PCR products were then sequenced by short-read UHTS with no further processing. Computer-based mapping of the reads and mutation detection were performed by three independent software packages. RESULTS: Despite the intrinsic complexity of human genome sequences, reads were easily mapped and analyzed, and all algorithms used provided the same results. The two patients were homozygous for all DNA variants identified in the region, which confirms previous linkage and homozygosity mapping results, but had different haplotypes, indicating genetic or allelic heterogeneity. None of the DNA changes detected could be associated with the disease. CONCLUSIONS: The MDH1 gene is not the cause of RP28-linked arRP. Our experimental strategy shows that long-range genomic PCR followed by UHTS provides an excellent system for a thorough screening of candidate genes for hereditary retinal degeneration.
Abstract:
Adult neuronal plasticity refers to the set of biological mechanisms that allow a neuronal circuit to respond and adapt to modifications of the inputs it receives. The mystacial whiskers of the mouse are the starting point of a major sensory pathway that provides the animal with information about its immediate environment. Through whisking, the animal gathers information that allows it to orient itself and to recognize objects. This sensory system is crucial for nocturnal behaviour, during which vision is of little use. Sensory information from the whiskers is sent via the brainstem and thalamus to the primary somatosensory area (S1) of the cerebral cortex in a strictly somatotopic manner. Cell bodies in layer IV of S1 are arranged in ring-forming structures called barrels; each barrel corresponds to the cortical representation in layer IV of a single whisker follicle. This histological feature makes it possible to identify with utmost precision the part of the cortex devoted to a given whisker and to study modifications induced by different experimental conditions. The condition used in the studies of my thesis is the passive stimulation of one whisker in the adult mouse for a period of 24 hours. It is performed by gluing a piece of metal onto one whisker and placing the awake animal in a cage surrounded by an electromagnetic coil that generates magnetic-field bursts, inducing whisker movement at a given frequency for 24 hours. I analysed the ultrastructure of the barrel corresponding to the stimulated whisker using serial-section electron microscopy and computer-based three-dimensional reconstructions; neighbouring, unstimulated barrels as well as barrels from unstimulated mice served as controls. The following elements were structurally analyzed: the spiny dendrites, the axons of excitatory as well as inhibitory cells, their synaptic connections, and the astrocytic processes.
The density of synapses and spines is upregulated in a barrel corresponding to a stimulated whisker. This upregulation is absent in heterozygous BDNF+/- mice, indicating that a certain level of activity-dependent BDNF release is required for synaptogenesis in the adult cerebral cortex. Synaptogenesis is correlated with a modification of the astrocytes, which move into closer vicinity of the excitatory synapses on spines. Biochemical analysis revealed that the astrocytes upregulate the expression of the transporters by which they internalise glutamate, the neurotransmitter responsible for the excitatory response of cortical neurons. In the final part of my thesis, I show that synaptogenesis in the stimulated barrel is due to the increase in the size of excitatory axonal boutons, which become more frequently multisynaptic, whereas the inhibitory axons do not change their morphology but form more synapses with the spines apposed to them. Taken together, my thesis demonstrates that all the cellular elements present in the neuronal tissue of the adult brain contribute to activity-dependent cortical plasticity and form part of a mechanism by which the animal responds to a modified sensory experience. Throughout life, the neuronal circuit keeps the faculty to adapt its function. These adaptations are partially transitory, but some aspects remain and could be the structural basis of a memory trace in the cortical circuit.
Abstract:
How communication systems emerge and remain stable is an important question in both cognitive science and evolutionary biology. For communication to arise, individuals must not only cooperate by signaling reliable information, but also coordinate and perpetuate signals. Most studies on the emergence of communication in humans consider scenarios where individuals implicitly share the same interests. Likewise, most studies on human cooperation consider scenarios where shared conventions of signals and meanings cannot be developed de novo. Here, we combined both approaches in an economic experiment where participants could develop a common language, but under different conditions fostering or hindering cooperation. Participants endeavored to acquire a resource through a learning task in a computer-based environment. After this task, participants had the option to transmit a signal (a color) to a fellow group member, who would subsequently play the same learning task. We varied the scale at which participants competed with each other (global or local) and the cost of transmitting a signal (costly or cost-free) and tracked how signals came to be used for communication among players. Under global competition, players signaled more often and more consistently, scored higher individual payoffs, and established shared associations of signals and meanings. Costly signals were also more likely to be used under global competition, whereas under local competition fewer signals were sent and no effective communication system developed. Our results demonstrate that communication involves both a coordination and a cooperation dilemma and show the importance of studying language evolution under different conditions influencing human cooperation.
Abstract:
Our efforts are directed towards understanding the coscheduling mechanism in a NOW system when a parallel job is executed jointly with local workloads, balancing parallel performance against local interactive response. Explicit and implicit coscheduling techniques have been implemented in a PVM-Linux NOW (or cluster). Furthermore, dynamic coscheduling remains an open question when parallel jobs are executed in a non-dedicated cluster. This paper presents a basic model for dynamic coscheduling in cluster systems and proposes a dynamic coscheduling algorithm for this model. The applicability of this algorithm has been demonstrated and its performance analyzed by simulation. Finally, a new tool (named Monito) for monitoring the different message queues in such environments is presented. The main aim of implementing this facility is to provide a means of capturing the bottlenecks and overheads of the communication system in a PVM-Linux cluster.
Abstract:
Many classification systems rely on clustering techniques in which a collection of training examples is provided as input and a number of clusters c1, ..., cm modelling some concept C result as output, such that every cluster ci is labelled as positive or negative. Given a new, unlabelled instance e_new, the above classification is used to determine to which particular cluster ci this new instance belongs. In such a setting clusters can overlap, and a new unlabelled instance can be assigned to more than one cluster with conflicting labels. In the literature, such a case is usually resolved non-deterministically by making a random choice. This paper presents a novel, hybrid approach that resolves this situation by combining a neural network for classification with a defeasible argumentation framework that models preference criteria for performing clustering.
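The conflict described above can be sketched in a few lines: an instance falls inside two overlapping clusters carrying opposite labels, and instead of the usual random choice a preference criterion decides. The clusters, their geometry, and the criterion (nearest centre, standing in for the paper's argumentation framework) are all invented for illustration.

```python
import math

# Hypothetical labelled clusters (centre, radius, label) -- invented values.
clusters = [
    {"name": "c1", "center": (0.0, 0.0), "radius": 2.0, "label": "positive"},
    {"name": "c2", "center": (2.0, 0.0), "radius": 2.0, "label": "negative"},
]

def classify(instance):
    """Assign instance to every cluster containing it; on conflicting labels,
    prefer the cluster with the closest centre (a stand-in for the paper's
    argumentation-based preference criteria)."""
    hits = []
    for c in clusters:
        d = math.dist(instance, c["center"])
        if d <= c["radius"]:
            hits.append((d, c))
    if not hits:
        return None
    labels = {c["label"] for _, c in hits}
    if len(labels) == 1:            # no conflict: unanimous label
        return labels.pop()
    hits.sort(key=lambda t: t[0])   # conflict: apply the preference criterion
    return hits[0][1]["label"]

print(classify((0.5, 0.0)))  # inside both clusters; c1 is closer -> "positive"
```

The point of the sketch is only the last two branches: a unanimous region needs no tie-breaking, while the overlap region is resolved deterministically rather than at random.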
Abstract:
This thesis investigates the factors behind rapid variations in the quantity and pressure of the surplus steam sold from a pulp mill's steam network to customers outside the mill. The effect of variations in steam generation and consumption on the surplus steam is examined, as are the possibilities for smoothing out these disturbances. The theory governing steam generation and consumption at a pulp mill is reviewed, together with the theory of energy balance calculation and of measures that improve steam network management. A separate part examines the mill's steam generation and consumption and surveys the current state of the mill's steam network management. An energy balance was constructed for the steam network. For the results, measurement data were collected and stored from various measurement points of the steam network using a data acquisition system. As its results, the thesis identifies several feasible measures for improving steam network management. The work lays the groundwork for a method that steers energy generation to match the energy demand of pulp production; this would also bring the surplus steam under better control and reduce the variation in its quantity and pressure.
Abstract:
In this Master's thesis, a production optimization system was designed and implemented for the production plants and the purchase and sales contracts of Kotkan Energia Oy. Technical and design data for Kotkan Energia's production plants were collected and calculated, and the operation of the various devices and plants is described. In addition, the consumption of district heat and process steam, consumption forecasting and production systems are reviewed, as well as electricity trading and the properties and prices of the fuels used at the plants. Efficiencies and product prices were calculated for the power plant equipment at different loads, and polynomial fits were made for the equipment efficiencies on the basis of these calculations. The polynomial fits and the other collected and calculated plant data were transferred into a computer-based optimization program developed in cooperation with Kotkan Energia's IT manager. The theory and methods of optimization are also briefly reviewed. With the optimization program it is now possible to compute optimal operating modes for the production and procurement contracts of the Hovinsaari power plant under different prices and consumption levels of district heat, process steam and electricity. The program aims to maximize the total profit of energy production or to minimize production costs according to given and forecast initial values. Its results can be used, for example, in production planning and budgeting. The calculations and program development succeeded well, and the results obtained from using and testing the program appear correct and reliable. The optimization program is now in use, and further development continues after the completion of the thesis.
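The core idea of the abstract, polynomial efficiency fits fed into a cost-minimising dispatch calculation, can be sketched minimally. All coefficients, unit names, and the grid-search approach are invented for illustration; the actual program is far more detailed.

```python
# Hypothetical sketch: equipment efficiency represented as a polynomial in
# load, and a grid search for the load split meeting demand with least fuel.

def efficiency(coeffs, load):
    """Evaluate a polynomial efficiency fit eta(load), highest order first (Horner)."""
    eta = 0.0
    for c in coeffs:
        eta = eta * load + c
    return eta

unit_a = [-0.00004, 0.004, 0.75]  # invented fit; eta peaks near mid load
unit_b = [-0.00002, 0.002, 0.80]  # invented fit

def best_split(demand, step=1.0):
    """Grid-search load_a + load_b = demand minimising total fuel input,
    where fuel input for one unit is load / eta(load)."""
    best = None
    load_a = 0.0
    while load_a <= demand:
        load_b = demand - load_a
        fuel = (load_a / efficiency(unit_a, load_a) if load_a else 0.0) + \
               (load_b / efficiency(unit_b, load_b) if load_b else 0.0)
        if best is None or fuel < best[0]:
            best = (fuel, load_a, load_b)
        load_a += step
    return best

fuel, a, b = best_split(100.0)
print(a, b, round(fuel, 2))
```

Sharing the demand between both units keeps each near the peak of its efficiency curve, so the split beats loading either unit alone; that trade-off is what the real optimizer exploits at plant scale.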
Abstract:
This Master's thesis was carried out at the UPM-Kymmene Oyj Kaukas mills in Lappeenranta. In the integrated forest industry, energy production usually consists of electricity and heat production. At the Kaukas mills, the heat demand of the processes is covered entirely by the mills' own production, whereas only half of the electricity consumed is generated on site; the rest must be purchased externally. The main focus of the study was to determine how costs depend on energy production under different operating conditions. As a result, a computer-based calculation model was created with which the energy production of the Kaukas mills can be controlled in the economically optimal way in each prevailing operating situation. The study also analyzed the possibilities for monitoring the heat consumption of the mill integrate on the basis of the existing measurements in the heat transfer network. The thesis gives a general account of the energy consumption of the forest industry in Finland and presents estimates of its future development as well as means of improving energy efficiency. The measurement methods and devices used for monitoring heat consumption at the Kaukas mills are described as regards flow measurements, and the reliability and sufficiency of the current measurements for managing the overall heat balance are assessed. A thermodynamic model of the Kaukas mills' energy production system was created, on which the calculation of energy production costs is based. The optimization of energy production seeks to determine the economically optimal boiler dispatch order for the operating situation at a given moment. As regards increasing heat production, the analysis is limited to increasing natural gas use and bypassing the steam turbines. The effect of variations in electricity and natural gas prices and in the ambient temperature on the optimal dispatch order is illustrated with examples.
Abstract:
Clopidogrel is a widely used antiplatelet drug for preventing vascular events after a first stroke. Genome-wide association studies (GWAS) have not been able to establish a clear association between polymorphisms and recurrence. Therefore, this final master's project proposes an epigenetic approach. Using array-based technology, 450,000 CpG sites across the genome were assessed in 48 individuals (21 cases and 21 controls). Comparing methylation levels between cases and controls, 58 differentially methylated CpG sites were found, although no clear locus was observed. Examining each of the 49 genes individually, two appeared to be important to our study: TRAF3 and ADAMTS2 are genes highly related to platelet aggregation. To confirm these results, a new DNA methylation study will be performed in a larger cohort using Sequenom technology.
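The case/control comparison behind such a study can be illustrated with a toy sketch. The CpG site names, beta values, and the simple mean-difference threshold are invented; the actual study used 450K array data and proper differential-methylation statistics.

```python
from statistics import mean

# Invented beta values (methylation fractions) per CpG site and group.
betas = {
    "cg001": {"cases": [0.80, 0.78, 0.82], "controls": [0.40, 0.45, 0.42]},
    "cg002": {"cases": [0.50, 0.52, 0.49], "controls": [0.51, 0.50, 0.52]},
}

def differentially_methylated(data, threshold=0.2):
    """Flag sites whose mean methylation differs between cases and controls
    by more than `threshold` (a crude stand-in for a statistical test)."""
    dmgs = []
    for site, groups in data.items():
        delta = mean(groups["cases"]) - mean(groups["controls"])
        if abs(delta) > threshold:
            dmgs.append((site, round(delta, 3)))
    return dmgs

print(differentially_methylated(betas))  # [('cg001', 0.377)]
```

In the toy data, cg001 shows a large case/control difference and is flagged, while cg002 is essentially unchanged and filtered out, mirroring how the 58 differentially methylated sites were separated from the 450K background.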
Abstract:
Abstract Objective: The present study was aimed at investigating bone involvement secondary to rotator cuff calcific tendonitis at ultrasonography. Materials and Methods: Retrospective study of a case series. The authors reviewed shoulder ultrasonography reports of 141 patients diagnosed with rotator cuff calcific tendonitis, collected from the computer-based data records of their institution over a four-year period. Imaging findings were retrospectively and consensually analyzed by two experienced musculoskeletal radiologists looking for bone involvement associated with calcific tendonitis. Only the cases confirmed by computed tomography were considered for descriptive analysis. Results: Sonographic findings of calcific tendinopathy with bone involvement were observed in 7/141 (~ 5%) patients (mean age, 50.9 years; age range, 42-58 years; 42% female). Cortical bone erosion adjacent to tendon calcification was the most common finding, observed in 7/7 cases. Signs of intraosseous migration were found in 3/7 cases, and subcortical cysts in 2/7 cases. The findings were confirmed by computed tomography. Calcifications associated with bone abnormalities showed no acoustic shadowing at ultrasonography, favoring the hypothesis of resorption phase of the disease. Conclusion: Preliminary results of the present study suggest that ultrasonography can identify bone abnormalities secondary to rotator cuff calcific tendinopathy, particularly the presence of cortical bone erosion.
Abstract:
Potentiometric stripping analysis (PSA) is described with emphasis on ultramicroelectrode applications using laboratory-developed, computer-based instrumentation. The technique's potential compared with the voltammetric approach is pointed out on the basis of the current literature. Some results of trace-metal analysis, including zinc, cadmium, lead and copper in vinegar and canned-food samples, are presented. The main advantages found in our laboratory were the technique's capability to analyse natural samples with minimal matrix interference and the low level of noise in our determinations.
Abstract:
The development of new tools for chemoinformatics, allied to the use of different algorithms and computer programmes for the structure elucidation of organic compounds, is growing fast worldwide. Massive research and development efforts are currently being pursued both by academia and by the so-called chemistry software development companies. The demystification of this environment, brought about by the availability of software packages and a vast array of publications, exerts a positive impact on chemistry. In this work, an overview of the more classical approaches as well as of new strategies in computer-based tools for the structure elucidation of organic compounds is presented. The historical background is also taken into account, since these techniques began to develop around four decades ago. Attention is paid to companies which develop, distribute or commercialize software, as well as to web-based and open-access tools currently available to chemists.
Abstract:
Free software has lately been gaining ever more weight in companies, yet it remains largely unknown to many people. Since its creation in the 1980s, there has been an exponential growth of high-quality free software offering tools for every kind of need: office suites, mail clients, file systems, operating systems, and more. This movement has not gone unnoticed by users and companies, who have taken advantage of it to meet their needs. As for companies, more and more of them use free software to some extent, whether for its lower acquisition cost, its high reliability, its easy adaptability, or to avoid technological lock-in; in short, to have more freedom. When a new company is created and starts from scratch in all its information technology, that is the least costly moment to build the IT architecture on free software, since the impact on the company, users, and customers is smallest. Companies that already have an IT system will need to establish a migration plan, whether total or partial. The aim of this project is not to say which software is better than another or which should be installed, but to introduce the world of free software, present part of this software, compare some free software with proprietary software, and offer ideas and a set of solutions for companies, so that a company can draw implementation ideas from some of the solutions presented or follow some of the advice proposed. Many companies already use free software. Some use it only in a small part of their installations; although companies running entirely on free software are beginning to appear, for now I consider that somewhat risky, but in a short time this will become increasingly common.
Abstract:
Efficient design and operation of water and wastewater treatment systems are largely based on mathematical calculations. This even applies to training in the treatment systems. It is therefore necessary that calculation procedures are developed and computerised a priori for such applications to ensure effectiveness. This work aimed to develop calculation procedures for the gas stripping, depth filtration, ion exchange, chemical precipitation, and ozonation wastewater treatment technologies and to include them in ED-WAVE, a portable, computer-based tool used in the design and operation of, and training in, wastewater treatment. The work involved a comprehensive online and offline study of research work and literature, and the application of practical case studies to generate ED-WAVE-compatible representations of the treatment technologies, which were then uploaded into the tool.
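As an illustration of what one such computerised calculation procedure looks like, here is a sketch for gas stripping using the standard countercurrent packed-tower transfer-unit equation. The Henry constant, flow rates, and concentrations are invented, and this is a generic textbook formulation, not the specific ED-WAVE implementation.

```python
import math

def stripping_factor(henry, q_air, q_water):
    """S = H * Qa / Qw, with H as a dimensionless Henry constant."""
    return henry * q_air / q_water

def ntu(s, c_in, c_out):
    """Number of transfer units to reduce c_in to c_out at stripping factor s."""
    return (s / (s - 1.0)) * math.log(((c_in / c_out) * (s - 1.0) + 1.0) / s)

# Invented design inputs: dimensionless H = 0.25, air-to-water ratio 30:1,
# 99% removal (100 -> 1 concentration units).
s = stripping_factor(henry=0.25, q_air=30.0, q_water=1.0)
print(round(s, 2), round(ntu(s, c_in=100.0, c_out=1.0), 2))  # 7.5 5.15
```

Multiplying the NTU by a packing-specific transfer-unit height would give the required column height, which is the kind of chained design calculation the tool automates.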