84 results for Multichip modules


Relevance:

10.00%

Publisher:

Abstract:

The recognition that colorectal cancer (CRC) is a heterogeneous disease in terms of clinical behaviour and response to therapy translates into an urgent need for robust molecular disease subclassifiers that can explain this heterogeneity beyond current parameters (MSI, KRAS, BRAF). Attempts to fill this gap are emerging. The Cancer Genome Atlas (TCGA) reported two main CRC groups, based on the incidence and spectrum of mutated genes, and another paper reported a subgroup defined by an EMT expression signature. We performed a prior-free analysis of CRC heterogeneity on 1113 CRC gene expression profiles and compared our findings with established molecular determinants and clinical, histopathological and survival data. Unsupervised clustering based on gene modules allowed us to distinguish at least five different gene expression CRC subtypes, which we call surface crypt-like, lower crypt-like, CIMP-H-like, mesenchymal and mixed. A gene set enrichment analysis combined with a literature search of gene module members identified distinct biological motifs in the different subtypes. The subtypes, which were not derived based on outcome, nonetheless showed differences in prognosis. Known gene copy number variations and mutations in key cancer-associated genes differed between subtypes, but the subtypes provided molecular information beyond that contained in these variables. Morphological features differed significantly between subtypes. The objective existence of the subtypes and their clinical and molecular characteristics were validated in an independent set of 720 CRC expression profiles. Our subtypes provide a novel perspective on the heterogeneity of CRC. The proposed subtypes should be further explored retrospectively on existing clinical trial datasets and, when sufficiently robust, be prospectively assessed for clinical relevance in terms of prognosis and treatment response predictive capacity.
Original microarray data were uploaded to the ArrayExpress database (http://www.ebi.ac.uk/arrayexpress/) under Accession Nos E-MTAB-990 and E-MTAB-1026. © 2013 Swiss Institute of Bioinformatics. Journal of Pathology published by John Wiley & Sons Ltd on behalf of Pathological Society of Great Britain and Ireland.
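The module-based unsupervised clustering described above can be illustrated with a small sketch: summarise each sample by per-module mean expression scores, then cluster the samples on those scores. This is a hedged toy example on synthetic data, not the paper's pipeline; the module definitions, sample counts and the simple k-means routine are all invented for illustration.

```python
import numpy as np

# Toy sketch (synthetic data, invented module definitions): summarise each
# sample by gene-module scores, then cluster samples on those scores.
rng = np.random.default_rng(0)

# 60 samples x 100 genes, with two planted subtypes differing in module 1
expr = rng.normal(size=(60, 100))
expr[:30, :20] += 2.0  # "subtype A" overexpresses module 1 (genes 0-19)

modules = {"module1": range(0, 20), "module2": range(20, 40)}
# Module score = mean expression of the module's genes, per sample
scores = np.column_stack([expr[:, list(g)].mean(axis=1) for g in modules.values()])

# Minimal k-means (k=2) on the module scores; init with one sample per group
centers = scores[[0, 45]].copy()
for _ in range(25):
    labels = np.argmin(((scores[:, None, :] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([scores[labels == j].mean(axis=0) for j in range(2)])

# Fraction of the planted "subtype A" samples assigned to one cluster
agreement = max((labels[:30] == 0).mean(), (labels[:30] == 1).mean())
print(agreement)
```

With a clear planted signal the two groups separate cleanly; on real expression data the paper's approach additionally uses hierarchical clustering and validation on an independent cohort.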


The endodermis is a root cell layer common to higher plants and of fundamental importance for root function and nutrient uptake. The endodermis separates outer (peripheral) from inner (central) cell layers by virtue of its Casparian strips, precisely aligned bands of specialized wall material. Here we reveal that the membrane at the Casparian strip is a diffusional barrier between the central and peripheral regions of the plasma membrane and that it mediates attachment to the extracellular matrix. This membrane region thus functions like a tight junction in animal epithelia, although plants lack the molecular modules that establish tight junctions in animals. We have also identified a pair of influx and efflux transporters that mark the central and peripheral domains of the plasma membrane. These transporters already show opposite polar distributions in meristems, but their localization becomes refined and restricted upon differentiation. This "central-peripheral" polarity coexists with the apical-basal polarity defined by PIN proteins within the same cells, but utilizes different polarity determinants. Central-peripheral polarity can already be observed in early embryogenesis, where it reveals a cellular polarity within the quiescent center precursor cell. A strict diffusion block between polar domains is common in animals but had never been described in plants. Yet its relevance to endodermal function is evident, as the central and peripheral membranes of the endodermis face fundamentally different root compartments. Further analysis of endodermal transporter polarity and manipulation of its barrier function will greatly promote our understanding of plant nutrition and stress tolerance in roots.


Cell morphogenesis depends on polarized exocytosis. One widely held model posits that long-range transport and exocyst-dependent tethering of exocytic vesicles at the plasma membrane sequentially drive this process. Here, we describe that disruption of either actin- and microtubule-based long-range transport or the exocyst did not abolish polarized growth in rod-shaped fission yeast cells. However, disruption of both actin cables and the exocyst led to isotropic growth. Exocytic vesicles localized to cell tips in single mutants but were dispersed in double mutants. In contrast, a marker for active Cdc42, a major polarity landmark, localized to discrete cortical sites even in double mutants. Localization and photobleaching studies showed that the exocyst subunits Sec6 and Sec8 localize to cell tips largely independently of the actin cytoskeleton, but in a Cdc42- and phosphatidylinositol 4,5-bisphosphate (PIP₂)-dependent manner. Thus, in fission yeast, long-range cytoskeletal transport and the PIP₂-dependent exocyst represent parallel morphogenetic modules downstream of Cdc42, raising the possibility of similar mechanisms in other cell types.


Introduction. Quantification of daily upper-limb activity is a key determinant in the evaluation of shoulder surgery. For a number of shoulder diseases, problems in performing daily activities have been expressed in terms of upper-limb usage and non-usage. Many instruments measure upper-limb movement but do not distinguish between use of the left and the right shoulder. Several methods have been used to measure it, relying only on accelerometers, pressure sensors or video-based analysis; however, there is no standard or widely used objective measure of upper-limb movement. We report here on an objective method to measure upper-limb movement, and we examined the use of 3D accelerometers and 3D gyroscopes for that purpose. Methods. We studied 8 subjects with a unilateral pathological shoulder (8 with rotator cuff disease; 53 ± 8 years old) and compared them to 18 control subjects (10 right-handed, 8 left-handed; 32 ± 8 years old, younger than the patient group so as to be reasonably sure they had no unrecognized shoulder pathology). The Simple Shoulder Test (SST) and Disabilities of the Arm and Shoulder Score (DASH) questionnaires were completed by each subject. Two modules, each with 3 miniature capacitive gyroscopes and 3 miniature accelerometers, were fixed by a patch on the dorsal side of each distal humerus, and one module with 3 gyroscopes and 3 accelerometers was fixed on the thorax. Each subject wore the system during one day (8 hours), at home or wherever he or she went. We used a technique based on the 3D acceleration and the 3D angular velocities from the modules attached to the humerus. Results. As expected, we observed that in standing and sitting postures the right side is used more than the left side by a healthy right-handed person (and likewise the left side by a healthy left-handed person). Subjects used their dominant upper limb 18% more than the non-dominant upper limb.
The measurements on patients in daily life showed that patients used their non-affected, non-dominant side more during daily activity when the dominant side was the affected shoulder. When the dominant side was not the affected shoulder, the difference could be shown only during walking periods. Discussion-Conclusion. The technique developed allowed quantification of the difference between dominant and non-dominant sides, and between affected and unaffected upper-limb activity. These results are encouraging for the future evaluation of patients with shoulder injuries, before and after surgery. The feasibility and patient acceptability of the method, using body-fixed sensors for ambulatory evaluation of upper-limb kinematics, were shown.
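The kind of usage metric reported above can be sketched in a few lines: threshold the gravity-removed acceleration magnitude per epoch and count the active time on each side. This is a hypothetical illustration on synthetic signals; the sampling rate, epoch length and threshold are invented, not the study's actual processing.

```python
import numpy as np

# Hypothetical sketch (synthetic signals; FS, epoch length and threshold are
# invented): classify 1-s epochs as "active" when the gravity-removed
# acceleration magnitude exceeds a threshold, then compare the two sides.
FS = 50                       # sampling rate (Hz), assumed
EPOCH = FS                    # samples per 1-second epoch
THRESH = 0.05                 # activity threshold in g, assumed

def active_seconds(acc_xyz):
    """Count 1-s epochs whose mean dynamic acceleration magnitude > THRESH."""
    mag = np.linalg.norm(acc_xyz, axis=1)   # |a| per sample
    dyn = np.abs(mag - 1.0)                 # remove static gravity (~1 g)
    n_epochs = len(dyn) // EPOCH
    epochs = dyn[: n_epochs * EPOCH].reshape(n_epochs, EPOCH)
    return int((epochs.mean(axis=1) > THRESH).sum())

rng = np.random.default_rng(1)
# One quiet minute on the left arm, one active minute on the right arm
left_acc = np.tile([0.0, 0.0, 1.0], (FS * 60, 1)) + rng.normal(0, 0.01, (FS * 60, 3))
right_acc = left_acc + rng.normal(0, 0.2, left_acc.shape)

right, left = active_seconds(right_acc), active_seconds(left_acc)
asymmetry = (right - left) / max(right + left, 1)  # +1: right only, -1: left only
print(right, left, asymmetry)
```

An asymmetry index of this form makes the dominant/non-dominant comparison in the abstract directly computable from the two body-fixed sensors.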


Bacterial genomes evolve through mutations, rearrangements or horizontal gene transfer. Besides the core genes encoding essential metabolic functions, bacterial genomes also harbour a number of accessory genes, acquired by horizontal gene transfer, that may be beneficial under certain environmental conditions. Horizontal gene transfer contributes to the diversification and adaptation of microorganisms, and thus has an impact on genome plasticity. A significant part of horizontal gene transfer is, or has been, facilitated by genomic islands (GEIs). GEIs are discrete DNA segments, some mobile and others not (or no longer) mobile, that differ among closely related strains. A number of GEIs are capable of integrating into the chromosome of the host, excising, and transferring to a new host by transformation, conjugation or transduction. GEIs play a crucial role in the evolution of a broad spectrum of bacteria, as they are involved in the dissemination of variable genes, including the antibiotic resistance and virulence genes that lead to the generation of hospital 'superbugs', as well as catabolic genes that lead to the formation of new metabolic pathways. Depending on the composition of its gene modules, the same type of GEI can promote the survival of pathogenic as well as environmental bacteria.


INTRODUCTION: Developments in technology, web-based teaching and whole slide imaging have broadened the teaching horizon in anatomic pathology. Creating online learning material including many types of media such as radiologic images, whole slides, videos, clinical and macroscopic photographs, is now accessible to most universities. Unfortunately, a major limiting factor to maintaining and updating the learning material is the amount of resources needed. In this perspective, a French national university network was initiated in 2011 to build joint online teaching modules consisting of clinical cases and tests. The network has since expanded internationally to Québec, Switzerland and Ivory Coast. METHOD: One of the first steps of the project was to build a learning module on inflammatory skin pathology for interns and residents in pathology and dermatology. A pathology resident from Québec spent 6 weeks in France and Switzerland to develop the contents and build the module on an e-learning Moodle platform under the supervision of two dermatopathologists. The learning module contains text, interactive clinical cases, tests with feedback, virtual slides, images and clinical photographs. For that module, the virtual slides are decentralized across two universities (Bordeaux and Paris 7). Each university is responsible for its own slide scanning, image storage and online display with virtual slide viewers. RESULTS: The module on inflammatory skin pathology includes more than 50 web pages of original French content, tests and clinical cases, links to over 45 virtual images and more than 50 microscopic and clinical photographs. The whole learning module is being revised by four dermatopathologists and two senior pathologists. It will be accessible to interns and residents in the spring of 2014. The experience and knowledge gained from that work will be transferred to the next international resident, whose work will be aimed at creating lung and breast pathology learning modules.
CONCLUSION: The challenges of sustaining a project of this scope are numerous. The technical aspect of whole-slide imaging and storage needs to be developed by each university or group. The content needs to be regularly updated and its accuracy reviewed by experts in each individual domain. The learning modules also need to be promoted within the academic community to ensure maximal benefit for trainees. A collateral benefit of the project was the establishment of international partnerships between French-speaking universities and pathologists with the common goal of promoting pathology education through the use of multi-media technology including whole slide imaging.


ABSTRACT: This study is a metrical and stylistic analysis of Vincenzo Monti's La Pulcella d'Orléans, a translation-rewriting of Voltaire's poem of the same name, La Pucelle d'Orléans, begun in Milan in 1798 and completed in Chambéry, Savoy, in 1799. The Italian text has been treated as a version autonomous from the French one, given Monti's particular choice to reduce the original philosophical and ideological component and to connect the model to a specifically Italian literary tradition, chiefly through the adoption of a strongly marked stanzaic scheme. La Pulcella is translated into ottava rima, a chivalric metre that had possessed its own "grammar" and a formidable tradition of reference for at least three centuries. With his translation, moreover, the author sought to emphasise the most amusing and provocative aspects of the story of Joan of Arc, already narrated by Voltaire in an ironic and irreverent tone, with a view to far-reaching experimentation in language, metre and syntax. The translation of the Pucelle is in fact tied to a hedonistic, bookish dimension: it is neither a pretext for discovering a foreign work nor a text intended for publication; it is rather a personal exercise, a private amusement that remained in the author's drawer. Whereas for Voltaire the poem's main purpose is its underlying ideological polemic, expressed in a sharply satirical register, for Monti the rewriting is a stylistic game, a literary indulgence, resting as much on the desacralising and provocative components as on the poetic and idyllic ones.
The French model is thus reworked, first of all, at the level of tone: on the one hand the translation narrows the ideological horizon and the historical perspective of the events; on the other it heightens Voltaire's most hedonistic and playful aspects by foregrounding the comic element, more colourful and open. Owing to the intimate character of this translation, the tradition of the Italian Pulcella today rests on only three manuscript witnesses, one of which, rediscovered in 1984, reopened the philological debate. For my thesis I used the critical edition currently available, printed in 1982 under the direction of M. Mari and G. Barbarisi, which is based on only two witnesses of the text; my work nonetheless also tried to take the newly discovered autograph into account. This thesis on the Pulcella is organised into several chapters reflecting the structure of the analysis, which is based on the different levels of elaboration of the text. It opens with a general introduction situating the two versions, French and Italian, within literary history and giving some indications on the philological question concerning Monti's text. The subsequent chapters analyse four different aspects of the translation. First, the hendecasyllables of the poem: the rhythm of the lines, their prosody, and the distribution of the various rhythmic modules across the positions of the octave. Voltaire's Pucelle is in fact written in decasyllables, a traditionally rather rigid line because of the rhythm cut by its caesura; in the translation the French line is rendered by the most celebrated measure of the Italian literary tradition, the hendecasyllable, a line that corresponds to the decasyllable only in its number of syllables but allows greater rhythmic freedom in the placement of accents.
The second chapter considers the metre of the octave, focusing on the internal syntactic organisation of the stanzas and on the links between them; it emerges that the stanzas are handled differently from Voltaire's practice. Unlike Monti's octaves, the French narration unfolds in each canto as an uninterrupted succession of lines, without breaks, thus delineating highly unitary and linear textual structures. The third chapter analyses the enjambments of the Pulcella in order to bring out the syntactic links between lines and between octaves, links almost always absent in Voltaire. Finally, I studied the vocabulary of the poem, looking closely at the words most expressive in their comic and parodic dimension. Monti indeed seems to push the French text further by using a highly varied vocabulary embracing every register of the Italian language, from the lowest, most trivial and popular level up to the lyrical and literary one (less exploited by Voltaire), aiming at effects of comic and burlesque pastiche. From this stylistic analysis of the translation emerges a very interesting feature, unique to Monti's rewriting, concerning the use of the hendecasyllable, of the octave and of the vocabulary of the text alike: a constant play on voice, or rather a continuous variation among the different intonational planes, and on the word, which becomes more expressive, more dense. Reading the text in fact entails an incessant melodic variation between the author's voice (in narration and commentary) and the voices of the characters heard in the numerous dialogues, as well as a variation of tone between literary diction and the lowest registers of the popular language.
As for syntax, compared with the French model (rather monotonous and linear, based on normal syntactic order, on the regular rhythm of the decasyllable and on fairly ordinary language), Monti varies and ennobles the tone of the discourse through refined syntactic movements, period constructions of varying regularity, and clauses that straddle the lines. The Italian discourse is indeed complicated by continual interruptions (which occur not at the canonical places but rather in the first part of the line or near its close), marking changes of pace in the text (dialogue, narration, commentary): in short, continual accelerations and decelerations of the narrative, and a play on the openings and closings of each line. All of this proceeds from a search for expressiveness which, working on the combination and collision of the different levels, destabilises the word and makes the writing unpredictable.


The transcriptome is the readout of the genome. Identifying common features in it across distant species can reveal fundamental principles. To this end, the ENCODE and modENCODE consortia have generated large amounts of matched RNA-sequencing data for human, worm and fly. Uniform processing and comprehensive annotation of these data allow comparison across metazoan phyla, extending beyond earlier within-phylum transcriptome comparisons and revealing ancient, conserved features. Specifically, we discover co-expression modules shared across animals, many of which are enriched in developmental genes. Moreover, we use expression patterns to align the stages in worm and fly development and find a novel pairing between worm embryo and fly pupae, in addition to the embryo-to-embryo and larvae-to-larvae pairings. Furthermore, we find that the extent of non-canonical, non-coding transcription is similar in each organism, per base pair. Finally, we find in all three organisms that the gene-expression levels, both coding and non-coding, can be quantitatively predicted from chromatin features at the promoter using a 'universal model' based on a single set of organism-independent parameters.
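The "universal model" idea above, a single organism-independent set of parameters predicting expression from promoter chromatin features, can be sketched with a plain linear model: fit on one organism, predict another. The features, coefficients and noise below are synthetic inventions; the actual ENCODE/modENCODE model and feature set are not reproduced here.

```python
import numpy as np

# Synthetic sketch: one shared ("organism-independent") linear rule maps
# promoter chromatin features to expression; fit on one organism, predict
# the other. Feature count, weights and noise levels are invented.
rng = np.random.default_rng(3)
true_w = np.array([1.5, -0.7, 0.9])  # shared underlying coefficients

def make_organism(n_genes):
    X = rng.normal(size=(n_genes, 3))            # e.g. three chromatin marks
    y = X @ true_w + rng.normal(0, 0.1, n_genes)  # "expression" readout
    return X, y

X_worm, y_worm = make_organism(500)
X_fly, y_fly = make_organism(500)

# Fit the model on "worm" only ...
w, *_ = np.linalg.lstsq(X_worm, y_worm, rcond=None)

# ... and predict "fly": cross-organism correlation should be high
pred = X_fly @ w
r = float(np.corrcoef(pred, y_fly)[0, 1])
print(round(r, 2))
```

When the generative rule is truly shared, parameters learned in one organism transfer to the other, which is the quantitative claim the abstract makes for chromatin-based prediction of expression.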


BACKGROUND: Developing and updating high-quality guidelines requires substantial time and resources. To reduce duplication of effort and enhance efficiency, we developed a process for guideline adaptation and assessed initial perceptions of its feasibility and usefulness. METHODS: Based on preliminary developments and empirical studies, a series of meetings with guideline experts was organised to define a process for guideline adaptation (ADAPTE) and to develop a manual and a toolkit made available on a website (http://www.adapte.org). Potential users, guideline developers and implementers were invited to register and to complete a questionnaire evaluating their perceptions of the proposed process. RESULTS: The ADAPTE process consists of three phases (set-up, adaptation, finalisation), 9 modules and 24 steps. The adaptation phase involves identifying specific clinical questions; searching for, retrieving and assessing available guidelines; and preparing the draft adapted guideline. Among 330 registered individuals (46 countries), 144 completed the questionnaire. A majority found the ADAPTE process clear (78%), comprehensive (69%) and feasible (60%), and the manual useful (79%). However, 21% found the ADAPTE process complex, and 44% feared that they would not find appropriate, high-quality source guidelines. DISCUSSION: A comprehensive framework for guideline adaptation has been developed to meet the challenges of timely guideline development and implementation. The ADAPTE process generated important interest among guideline developers and implementers. The majority perceived the ADAPTE process to be feasible, useful and likely to improve methodological rigour and guideline quality. However, some de novo development might be needed if no high-quality guideline exists for a given topic.


Understanding the complexity of cancer depends on an elucidation of the underlying regulatory networks, at the cellular and intercellular levels and in their temporal dimension. This Opinion article focuses on the multilevel crosstalk between the Notch pathway and the p53 and p63 pathways. These two coordinated signalling modules are at the interface of external damaging signals and control of stem cell potential and differentiation. Positive or negative reciprocal regulation of the two pathways can vary with cell type and cancer stage. Therefore, selective or combined targeting of the two pathways could improve the efficacy and reduce the toxicity of cancer therapies.


Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop predictive models of large population dynamics, as well as software tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has expanded spectacularly since its reintroduction in Switzerland at the beginning of the 20th century, served as the example species.
This task was achieved in three steps. A local population dynamics model was first developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix to which density-dependence, environmental stochasticity and regulation culling were added. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. Population dynamics, however, is driven not only by demographic factors but also by dispersal and the colonisation of new areas, so habitat suitability and dispersal obstacles had to be modelled as well. A software package named Biomapper was therefore developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute marginality and specialisation factors of the ecological niche from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data import, predictor preparation, ENFA and habitat suitability map computation, and validation and further processing of results; one module also allows dispersal barriers and corridors to be mapped. The application domain of ENFA was then explored by means of a simulated species distribution. Compared with a method commonly used to build habitat suitability maps, the Generalised Linear Model (GLM), ENFA proved particularly well suited to cryptic or expanding species.
Demographic and landscape information was finally merged into a global model. To satisfy both the constraints of realistic landscape modelling and those imposed by large populations, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, the population density. The latter varies according to local reproduction, survival and dispersal, under the influence of density-dependence and stochasticity. A software package named HexaSpace was developed to perform two functions: (1) calibrating the automaton on the basis of population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); (2) running simulations. It allows the spread of an invading species to be studied across a complex landscape made up of areas of varying suitability and of dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by wildlife managers and government inspectors to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these tools were designed to build a complex, realistic model from raw data and benefit from an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, these approaches can also address theoretical questions in population and landscape ecology.
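The age-structured projection underlying this kind of model can be sketched as a Leslie matrix with density-dependent fecundity and environmental noise. All vital rates and the carrying capacity below are invented for illustration and are not the SIM-Ibex parameters.

```python
import numpy as np

# Invented-parameter sketch: female-only Leslie matrix projection with
# logistic density-dependence on fecundity and lognormal environmental
# noise on births. Rates and K are illustrative, not SIM-Ibex values.
fecundity = np.array([0.0, 0.8, 1.2, 1.0])  # offspring per female, by age class
survival = np.array([0.6, 0.8, 0.9])        # survival from age i to i+1
K = 500.0                                   # carrying capacity

L = np.zeros((4, 4))
L[0, :] = fecundity
L[np.arange(1, 4), np.arange(3)] = survival  # sub-diagonal survival terms

def step(n, rng):
    dd = max(0.0, 1.0 - n.sum() / K)          # density-dependent birth factor
    Lt = L.copy()
    Lt[0, :] *= dd * rng.lognormal(0.0, 0.1)  # environmental stochasticity
    return Lt @ n

rng = np.random.default_rng(7)
n = np.array([50.0, 30.0, 20.0, 10.0])        # initial age distribution
for _ in range(100):
    n = step(n, rng)
print(n.sum())
```

With these rates the population grows from its initial size and then fluctuates around a density-regulated equilibrium below K; culling would enter as an additional per-class removal term in `step`.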


Abstract The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes are required to exchange digital items in a fair manner, which means that either each process obtains the item it was expecting or no process obtains any information on, the inputs of others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully-connected topology. On one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamperproof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange. 
This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem to contradict those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of a trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
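The classic trusted-third-party solution recalled above can be sketched in a few lines. This is a toy illustration with invented names, not the thesis protocol: the TTP releases the items only once both parties have deposited theirs, so either both processes obtain the item they were expecting or neither learns anything, which is exactly the fairness property stated in the abstract.

```python
class TrustedThirdParty:
    """Toy escrow illustrating TTP-based fair exchange (hypothetical API)."""

    def __init__(self):
        self._deposits = {}  # party name -> deposited item

    def deposit(self, party, item):
        # Each party hands its item to the trusted process first.
        self._deposits[party] = item

    def settle(self, a, b):
        """Return (item_for_a, item_for_b), or (None, None) if incomplete."""
        if a in self._deposits and b in self._deposits:
            # Both items are in escrow: swap them atomically.
            return self._deposits[b], self._deposits[a]
        # Fairness: if either party defected, nobody receives anything.
        return None, None

ttp = TrustedThirdParty()
ttp.deposit("alice", "contract.pdf")
# bob (possibly Byzantine) has not deposited yet: the exchange aborts fairly.
assert ttp.settle("alice", "bob") == (None, None)
ttp.deposit("bob", "payment-token")
assert ttp.settle("alice", "bob") == ("payment-token", "contract.pdf")
```

The optimization described in the abstract replaces this single all-powerful escrow with tamper-proof modules of limited communication ability, invoked only at the key steps where fairness could otherwise be broken.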

Relevância:

10.00%

Publicador:

Resumo:

Résumé: The governments of Western countries have spent considerable sums to facilitate the integration of information and communication technologies into teaching, hoping to find an economical solution to the thorny equation that could be summed up by the famous formula "do more and better with less". However, despite these efforts and the very clear improvement in the quality of service of the infrastructure, this objective is far from being reached. While we think it illusory to expect and hope that technology can and will, by itself, solve the problems of teaching quality, we nevertheless believe that it can contribute to improving learning conditions and feed into the pedagogical reflection that every teacher should conduct before delivering a course. With this in mind, and convinced that distance learning offers significant advantages provided that teaching is thought out "differently", we became interested in the problem of developing this type of application, which lies at the boundary between didactics, cognitive science and computing. Thus, and in order to propose a realistic and simple solution facilitating the development, updating, integration and sustainability of distance-learning applications, we got involved in concrete projects. Over the course of our field experience we observed that (i) the quality of flexible and distance learning modules remains very disappointing, among other reasons because the added value that the use of technologies can bring is, in our opinion, not sufficiently exploited, and that (ii) to succeed, any project must, besides providing a useful answer to a real need, be managed efficiently with the support of a "champion".
With the idea of proposing a project management approach adapted to the needs of flexible and distance learning, we first examined the characteristics of this type of project. We then analysed existing project methodologies in the hope of being able to use one of them, or an appropriate combination of those closest to our needs. Next, empirically and through successive iterations, we defined a pragmatic project management approach and contributed to the elaboration of decision-support cards facilitating its implementation. We describe some of its actors, insisting particularly on the pedagogical engineer, whom we consider one of the key success factors of our approach and whose vocation is to orchestrate it. Finally, we validated our approach a posteriori by reviewing the course of four flexible and distance learning (FFD) projects in which we took part and which are representative of the projects encountered in the university environment. In conclusion, we believe that implementing our approach, together with making computerised decision-support cards available, constitutes a significant asset and should in particular make it easier to measure the real impacts of the technologies (i) on the evolution of teachers' practice, (ii) on the organisation and (iii) on the quality of teaching. Our approach can also serve as a springboard for establishing a quality-assurance process specific to FFD. Further research on the real flexibilisation of learning and on the benefits of technologies for learners could then be conducted on the basis of metrics that remain to be defined.
Abstract: Western countries have spent substantial amounts of money to facilitate the integration of Information and Communication Technologies (ICT) into education, hoping to find a solution to the thorny equation that can be summarized by the famous statement "do more and better with less". Despite these efforts, and notwithstanding the real improvements due to the undeniable betterment of the infrastructure and of the quality of service, this goal is far from reached. Although we think it illusory to expect technology, all by itself, to solve our economic and educational problems, we firmly take the view that it can greatly contribute not only to improving learning conditions but also to rethinking the pedagogical approach. Every member of our community could hence take advantage of this opportunity to reflect upon his or her strategy. In this framework, and convinced that integrating ICT into education opens a number of very interesting avenues provided we think teaching "out of the box", we became interested in courseware development positioned at the intersection of didactics and pedagogical sciences, cognitive sciences and computing. Hence, and hoping to bring a realistic and simple solution that could help develop, update, integrate and sustain courseware, we got involved in concrete projects. As we gained field experience we noticed that (i) the quality of courseware is still disappointing, among other reasons because the added value that the technology can bring is not made the most of as it could or should be, and (ii) a project requires, besides bringing a useful answer to a real problem, to be efficiently managed and to be "championed". With the aim of proposing a pragmatic and practical project management approach, we first looked into the characteristics of open and distance learning. We then analyzed existing methodologies in the hope of being able to utilize one or the other, or a combination, to best fit our needs.
Proceeding empirically, by successive iterations and refinements, we defined a simple methodology and contributed to building descriptive "cards" attached to each of its phases to help decision making. We describe the different actors involved in the process, insisting specifically on the pedagogical engineer, viewed as an orchestra conductor, whom we consider critical to ensuring the success of our approach. Last but not least, we validated our methodology a posteriori by reviewing four of the projects we participated in and that we think emblematic of the university reality. We believe that the implementation of our methodology, along with the availability of computerized cards to help project managers take decisions, could constitute a great asset and contribute to measuring the technologies' real impacts on (i) the evolution of teaching practices, (ii) the organization and (iii) the quality of pedagogical approaches. Our methodology could hence be of use in putting in place a quality assessment specific to open and distance learning. Research on the contribution of technologies to learning adaptability and flexibilization could then rely on adequate metrics that remain to be defined.

Relevância:

10.00%

Publicador:

Resumo:

OBJECTIVE: This pilot experimental study tested the feasibility and intended effect of an educational intervention for parents to help them assist their adolescent child with chronic illness (CI) in becoming autonomous. METHODS: A two-phase pre-post pilot intervention study targeting parents of adolescents with CI was conducted. Parents were allocated to groups 1 and 2 and received the four-module intervention consecutively. Intended effect was measured through online questionnaires for parents and adolescents before, at 2 months after, and at 4-6 months after the intervention. Feasibility was assessed through an evaluation questionnaire for parents. RESULTS: The modules considered most useful concerned the adolescent's future and the parents' social life. The most valued aspects were exchanging with other parents going through similar problems and receiving a new outlook on their relationship with their child. For parents, improvement trends appeared for shared management, parent protection, and self-efficacy, and worsening trends appeared for coping skills, parental perception of child vulnerability, and parental stress. For adolescents, improvement trends appeared for self-efficacy and parental bonding, and worsening trends appeared for shared management and coping skills. CONCLUSION: Parents could benefit from peer-to-peer support and education as they support the needed autonomy development of their child. Future studies should test an online platform for parents to find peer support at all times and places.

Relevância:

10.00%

Publicador:

Resumo:

BACKGROUND: Diabetes represents an increasing health burden worldwide. In 2010, the Public Health Department of the canton of Vaud (Switzerland) launched a regional diabetes programme entitled "Programme cantonal Diabète" (PcD), with the objectives of both decreasing the incidence of diabetes and improving care for patients with diabetes. The cohort entitled CoDiab-VD emerged from that programme. It specifically aimed at following the quality of diabetes care over time, at evaluating the coverage of the PcD within this canton, and at assessing the impact of the PcD on the care of patients with diabetes. METHODS/DESIGN: The cohort CoDiab-VD is a prospective population-based cohort study. Patients with diabetes were recruited in two waves (autumn 2011 to summer 2012) through community pharmacies. Eligible participants were non-institutionalised adult patients (≥ 18 years) with diabetes diagnosed for at least one year, residing in the canton of Vaud and coming to a participating pharmacy with a diabetes-related prescription. Women with gestational diabetes and people with obvious cognitive impairment or insufficient command of French were not eligible. The self-reported data collected included the following primary outcomes: processes-of-care indicators (annual checks) and outcomes of care such as HbA1c, (health-related) quality-of-life measures (Short Form-12 Health Survey (SF-12), Audit of Diabetes-Dependent Quality of Life 19 (ADDQoL)) and the Patient Assessment of Chronic Illness Care (PACIC). Data on diabetes, health status, healthcare utilisation, health behaviour, self-management activities and support, knowledge of, or participation in, campaigns/activities proposed by the PcD, and socio-demographics were also obtained. For consenting participants, physicians provided a few additional pieces of information about processes and laboratory results. Participants will be followed once a year via a mailed self-report questionnaire.
The core of the follow-up questionnaires will be similar to the baseline one, with the addition of thematic modules adapted to the development of the PcD. Physicians will be contacted every 2 years. DISCUSSION: CoDiab-VD will provide a broad picture of the care of patients with diabetes, as well as of their needs regarding their chronic condition. The data will be used to evaluate the PcD and to help prioritise targeted actions. TRIAL REGISTRATION: This study is registered with ClinicalTrials.gov, identifier NCT01902043 (July 9, 2013).