49 results for Knowledge Acquisition and Sharing
Abstract:
BACKGROUND AND PURPOSE: Knowledge of cerebral blood flow (CBF) alterations in cases of acute stroke could be valuable in the early management of these cases. Among imaging techniques affording evaluation of cerebral perfusion, perfusion CT studies involve sequential acquisition of cerebral CT sections, obtained in axial mode during the IV administration of iodinated contrast material, and are thus very easy to perform in emergency settings. Perfusion CT values of CBF have proved accurate in animals, and perfusion CT affords plausible values in humans. The purpose of this study was to validate perfusion CT studies of CBF against the results provided by stable xenon CT, which have been reported to be accurate, and to evaluate acquisition and processing modalities for the CT data, notably the possible deconvolution methods and the selection of the reference artery. METHODS: Twelve stable xenon CT and perfusion CT cerebral examinations were performed within an interval of a few minutes in patients with various cerebrovascular diseases. CBF maps were obtained from perfusion CT data by deconvolution using singular value decomposition and least mean square methods. The CBF values were compared with the stable xenon CT results in multiple regions of interest through linear regression analysis and bilateral t tests for matched variables. RESULTS: Linear regression analysis showed good correlation between perfusion CT and stable xenon CT CBF values (singular value decomposition method: R² = 0.79, slope = 0.87; least mean square method: R² = 0.67, slope = 0.83). Bilateral t tests for matched variables did not identify a significant difference between the two imaging methods (P > .1), and the two deconvolution methods were equivalent (P > .1). The choice of the reference artery is a major concern and strongly influences the final perfusion CT CBF map. CONCLUSION: Perfusion CT studies of CBF, achieved with adequate acquisition parameters and processing, lead to accurate and reliable results.
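For readers unfamiliar with deconvolution-based CBF estimation, here is a minimal sketch of the singular-value-decomposition approach the abstract refers to. The indicator-dilution model is standard; the function name, the regularization threshold and the plain-NumPy formulation are our assumptions, not the authors' implementation.

```python
import numpy as np

def cbf_svd(aif, tissue, dt, thresh=0.2):
    """Estimate CBF from one tissue curve by truncated-SVD deconvolution.

    Model: tissue(t) = dt * (A @ r), where A is the lower-triangular
    Toeplitz matrix built from the arterial input function (AIF) and
    r(t) = CBF * R(t) is the scaled residue function.
    """
    n = len(aif)
    A = dt * np.array([[aif[i - j] if i >= j else 0.0
                        for j in range(n)] for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    # Zero out small singular values to stabilize the inversion.
    s_inv = np.where(s > thresh * s.max(), 1.0 / s, 0.0)
    r = Vt.T @ (s_inv * (U.T @ tissue))
    # CBF is the peak of the scaled residue function (conversion to
    # mL/100 g/min requires density and hematocrit corrections).
    return r.max()
```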
Abstract:
BACKGROUND AND PURPOSE: Accurate placement of an external ventricular drain (EVD) for the treatment of hydrocephalus is of paramount importance for its functionality and for minimizing morbidity and complications. The aim of this study was to compare two drain insertion assistance tools with the traditional free-hand anatomical landmark method, and to measure efficacy, safety and precision. METHODS: Ten cadaver heads were prepared by opening large bone windows centered on Kocher's points on both sides. Nineteen physicians, divided into two groups (trainees and board-certified neurosurgeons), performed EVD insertions. The target for the ventricular drain tip was the ipsilateral foramen of Monro. Each participant inserted the external ventricular catheter in three different ways: 1) free-hand by anatomical landmarks, 2) neuronavigation-assisted (NN), and 3) XperCT-guided (XCT). The number of ventricular hits and dangerous trajectories, procedure time, radiation exposure of patients and physicians, distance of the catheter tip to target, and size of deviations projected in the orthogonal planes were measured and compared. RESULTS: Insertion using XCT increased the probability of ventricular puncture from 69.2 to 90.2% (p = 0.02). Non-assisted placements were significantly less precise (catheter tip-to-target distance 14.3 ± 7.4 mm versus 9.6 ± 7.2 mm, p = 0.0003). The insertion time increased from 3.04 ± 2.06 min to 7.3 ± 3.6 min (p < 0.001). The x-ray exposure for XCT was 32.23 mSv, but could be reduced to 13.9 mSv if patients were initially imaged in the hybrid operating suite. No supplementary radiation exposure is needed for NN if patients are initially imaged according to a navigation protocol. CONCLUSION: This ex vivo study demonstrates significantly improved accuracy and safety using either the NN- or XCT-assisted method. Efforts should therefore be undertaken to implement these technologies into daily clinical practice. However, accuracy has to be balanced against the urgency of EVD placement, as image-guided insertion entails a longer preparation time due to specific image acquisition and trajectory planning.
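As a hedged illustration of how such a group comparison follows from the reported summary statistics, the snippet below recomputes a two-sample t test from the published means and standard deviations. The abstract does not give group sizes, so the n used here is a placeholder.

```python
from scipy import stats

# Summary statistics from the abstract (tip-to-target distance, mm).
mean_free, sd_free = 14.3, 7.4   # free-hand insertions
mean_xct,  sd_xct  = 9.6, 7.2    # XperCT-guided insertions

# n per group is not reported; 60 per arm is an illustrative placeholder.
t, p = stats.ttest_ind_from_stats(mean_free, sd_free, 60,
                                  mean_xct,  sd_xct,  60)
print(f"t = {t:.2f}, p = {p:.4f}")  # p lands near the reported 0.0003
```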
Abstract:
Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, strategic situations in which the players choose only once and simultaneously, and dynamic games, strategic situations involving sequential choices. Dynamic games can be further classified by perfect and imperfect information: a dynamic game exhibits perfect information whenever, at any point of the game, every player is fully informed of all choices made so far, whereas under imperfect information some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. First, games are modelled by so-called form structures, which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which describes a game sparsely, merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in more detail as a tree. It is standard to formalize static games with the normal form and dynamic games with the extensive form. Second, solution concepts are developed to solve models of games, in the sense of identifying the choices that should be taken by rational players. The ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions, as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players: before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated; Aumann's sufficient conditions for backward induction are also presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account of dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness; sufficient conditions for backward induction are then derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated; in particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened, in the sense that possible contexts are provided in which agents can indeed agree to disagree.
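To make backward induction concrete for readers new to it, here is a toy sketch that solves a finite perfect-information game tree from the leaves up. This is an illustration in Python, not the thesis's epistemic formalism; the node and action names are invented.

```python
from dataclasses import dataclass
from typing import Dict, Tuple, Union

Payoffs = Tuple[float, ...]  # one payoff per player

@dataclass
class Node:
    name: str
    player: int  # index of the player who moves at this node
    children: Dict[str, Union["Node", Payoffs]]  # action -> subgame or leaf

def backward_induction(node: Node):
    """Solve a finite perfect-information game tree from the leaves up.

    At each node the mover picks the action maximizing her own payoff in
    the already-solved subgames; returns the induced payoffs and strategy.
    """
    strategy: Dict[str, str] = {}
    best_action, best_payoffs = None, None
    for action, child in node.children.items():
        if isinstance(child, Node):
            payoffs, sub = backward_induction(child)
            strategy.update(sub)
        else:
            payoffs = child
        if best_payoffs is None or payoffs[node.player] > best_payoffs[node.player]:
            best_action, best_payoffs = action, payoffs
    strategy[node.name] = best_action
    return best_payoffs, strategy

# A two-stage example: player 0 can opt out for (2, 2) or let player 1 move.
game = Node("root", 0, {
    "out": (2.0, 2.0),
    "in": Node("p1", 1, {"left": (3.0, 1.0), "right": (0.0, 3.0)}),
})
# Player 1 would choose "right" (3 > 1), leaving player 0 with 0,
# so backward induction has player 0 play "out".
print(backward_induction(game))  # ((2.0, 2.0), {'p1': 'right', 'root': 'out'})
```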
Abstract:
The respective roles of the medial temporal lobe (MTL) structures in memory are controversial. Some authors put forward a modular account according to which episodic memory and recollection-based processes depend crucially on the hippocampal formation, whereas semantic acquisition and familiarity-based processes rely on the adjacent parahippocampal gyri; others defend a unitary view. We report the case of VJ, a boy with developmental amnesia of most likely perinatal onset, diagnosed at the age of 8. Magnetic resonance imaging (MRI), including quantitative volumetric measurements of the hippocampal formation and of the entorhinal, perirhinal and temporopolar cortices, showed severe bilateral atrophy of the hippocampal formation, fornix and mammillary bodies; by contrast, the perirhinal cortex was within the normal range, and the entorhinal and temporopolar cortices remained within two standard deviations (SDs) of the controls' mean. We examined the development of his semantic knowledge from childhood to his teens, as well as his recognition and cued-recall memory abilities. On tasks tapping semantic memory, VJ increased his raw scores across years at the same rate as children from large standardisation samples, except on one task; he achieved average performance, consistent with his socio-educational background. He performed within the normal range on 74% of recognition tests and achieved average to above-average scores on 42% of them, despite very severe impairment on 82% of episodic recall tasks. Both faces and landscape scenes gave rise to above-average scores when tested with coloured stimuli. Cued recall, although impaired, was largely superior to free recall. This case supports a modular account of the MTL, with episodic but not semantic memory depending on the hippocampal formation. Furthermore, the overall pattern of findings is consistent with evidence from both brain-damage and neuroimaging studies indicating that recollection requires an intact hippocampal formation whereas familiarity relies, at least partly, on the adjacent temporal lobe cortex.
Abstract:
Drug addiction is a multifactorial disorder affecting human beings regardless of their education level, economic status, origin or gender, but vulnerability to addiction depends on environmental, genetic and psychosocial dispositions. Drug addiction is defined as a chronic relapsing disorder characterized by compulsive drug seeking, with loss of control over drug intake and persistent maladaptive decision making in spite of adverse consequences. The brain mechanisms responsible for drug abuse remain partially unknown despite accumulating evidence delineating molecular and cellular adaptations within the glutamatergic and dopaminergic systems. However, these adaptations do not fully explain the complex brain disease of drug addiction. The identification of other neurobiological factors responsible for the vulnerability to substance abuse is crucial for the development of promising therapeutic treatments able to alleviate signs of drug dependence. Over the past few years, growing evidence has demonstrated that a recently discovered brain circuit, the hypocretinergic system, is implicated in many physiological functions, including arousal, energy metabolism, motivation, stress and reward-related behaviors. The hypocretin system is composed of a few thousand neurons arising from the lateral hypothalamus and projecting to the entire brain.
Hypocretin-deficient mice have been generated and, unexpectedly, their phenotype resembles that of wild-type mice except for sleep attacks strikingly similar to those of human narcolepsy patients. Evidence suggesting that hypocretins are required for the acquisition and expression of drug addiction has also been reported; however, the precise mechanism by which hypocretins modulate drug-seeking behaviors remains a matter of debate. Here, we report alcohol and cocaine reward-related behaviors in hypocretin-deficient (KO) mice, as well as heterozygous (HET) and wild-type (WT) littermates. We first evaluated the impact of repeated cocaine injections (15 mg/kg, ip) on locomotor sensitization and conditioned place preference. We observed that WT, HET and KO mice exhibited behavioral sensitization following repeated cocaine administrations, but hypocretin-deficient males displayed a delayed and attenuated response to chronic cocaine administrations. Interestingly, HET males exhibited an intermediate pattern of behavioral sensitization. However, after standardization of the post-injection data against the habituation period prior to cocaine injections, all mice displayed similar amplitudes of behavioral sensitization, except for a reduced response in KO males on the first day, suggesting that the delayed and reduced cocaine-induced locomotor sensitization may reflect a hypoactive phenotype rather than an altered response to repeated cocaine administrations. Unexpectedly, all female mice exhibited similar patterns of cocaine-induced behavioral sensitization. We then assessed behavioral conditioning to an environment repeatedly paired with cocaine injections (15 mg/kg, ip). All mice, whatever their sex or genotype, exhibited a robust preference for the environment previously paired with cocaine administrations. Notably, following two weeks of cocaine abstinence, hypocretin-deficient males and females no longer exhibited any preference for the compartment previously paired with cocaine rewards, whereas both WT and HET mice continued to manifest a robust preference. We finally assessed drinking behaviors in WT, HET and KO female mice using a novel paradigm, the IntelliCage®. We report here that KO females tended to explore less the four cage corners where water was easily available. When exposed to four different liquid solutions (water, 1 mM quinine or 0.2% saccharin, 8% alcohol and 16% alcohol), KO mice tended to consume less of the sweet and alcoholic beverages. However, after data standardization, no significant differences were noticed between genotypes, suggesting that the hypoactive phenotype most likely accounts for the trend toward reduced sweet or alcohol intake in KO mice. Taken together, the present findings confirm that the behavior seen in Hcrt KO mice likely reflects developmental compensations, since only a slightly altered cocaine-induced behavioral sensitization and normal behavioral conditioning with cocaine were observed in these mice compared with HET and WT littermates. With regard to drinking behaviors, KO mice barely displayed any behavioral changes apart from a trend toward reduced intake of sweet and alcoholic beverages. Overall, the most striking observation is the consistent hypoactive phenotype of the hypocretin-deficient mice, which most likely accounts for their reduced tendency to explore the environment.
Whether this hypoactive phenotype is due to reduced alertness or reduced motivation for reward seeking remains debatable, but our findings suggest that hypocretin-deficient mice barely display any altered motivation for reward seeking in environments where little effort is required to access a reward.
Abstract:
Since the early 1990s, new forms of referendum campaigns have emerged in the Swiss political arena. In this paper, we examine how referendum campaigns have transformed in Switzerland, focusing on three features: their intensity, duration and inclusiveness (i.e., the variety of actors involved). These features are assumed to change in the long run in response to societal changes and in the short run as a function of variations in elite support. We further argue that public knowledge of ballot issues depends on the characteristics of campaigns. To test our hypotheses formally, we draw on advertising campaigns in six major Swiss newspapers in the four weeks preceding each ballot from 1981 to 1999 and develop a structural equation model. We find that the duration of referendum campaigns has indeed increased over time, while their inclusiveness has decreased. Most importantly, we find that public knowledge is strongly related to the characteristics of campaigns.
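For readers curious how such a model might be specified, here is a hedged sketch using the semopy SEM library in Python. The paper does not disclose its software or exact specification; the variable names, file name and the measurement structure below are our invention.

```python
import pandas as pd
import semopy  # SEM library; chosen here for illustration only

# Hypothetical per-ballot dataset (1981-1999): campaign features coded
# from newspaper ads plus a survey-based knowledge score.
desc = """
# measurement model: latent campaign strength behind observed features
campaign =~ intensity + duration + inclusiveness
# structural model: knowledge depends on the campaign and elite support
knowledge ~ campaign + elite_support
"""
data = pd.read_csv("ballots_1981_1999.csv")  # placeholder file name
model = semopy.Model(desc)
model.fit(data)
print(model.inspect())  # path coefficients and their standard errors
```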
Abstract:
The application of two approaches for high-throughput, high-resolution X-ray phase contrast tomographic imaging in use at the TOMCAT (TOmographic Microscopy and Coherent rAdiology experimenTs) beamline of the SLS is discussed and illustrated. Differential phase contrast (DPC) imaging, using a grating interferometer and a phase-stepping technique, is integrated into the beamline environment at TOMCAT in terms of fast acquisition and reconstruction of data and the ability to scan samples within an aqueous environment. The second phase contrast method is a modified transport-of-intensity approach that can yield the 3D distribution of the decrement of the refractive index of a weakly absorbing object from a single tomographic dataset. The two methods are complementary: the DPC method is characterised by higher sensitivity and moderate resolution with larger samples; the modified transport-of-intensity approach is particularly suited to small specimens when high resolution (around 1 μm) is required. Both are being applied to investigations in the biological and materials science fields.
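The phase-stepping retrieval that DPC imaging relies on is easy to sketch: over the grating steps, each pixel's intensity oscillates sinusoidally, and the first Fourier coefficient of that oscillation carries the absorption, differential-phase and dark-field signals. Below is a minimal NumPy sketch of this standard retrieval; the array layout and function name are our assumptions, not TOMCAT's pipeline.

```python
import numpy as np

def phase_stepping_retrieval(stack, ref_stack):
    """Retrieve absorption, differential-phase and dark-field images
    from a phase-stepping scan (stack shape: steps x rows x cols).

    The intensity in each pixel oscillates sinusoidally over the grating
    steps; the zeroth and first Fourier coefficients along the step axis
    give the oscillation's mean, phase and amplitude.
    """
    f = np.fft.fft(stack, axis=0)
    f_ref = np.fft.fft(ref_stack, axis=0)
    absorption = np.abs(f[0]) / np.abs(f_ref[0])      # transmission image
    dpc = np.angle(f[1]) - np.angle(f_ref[1])         # differential phase
    vis_sample = np.abs(f[1]) / np.abs(f[0])          # fringe visibility
    vis_ref = np.abs(f_ref[1]) / np.abs(f_ref[0])
    dark_field = vis_sample / vis_ref                  # visibility ratio
    # Wrap the differential phase back into [-pi, pi).
    return absorption, np.angle(np.exp(1j * dpc)), dark_field
```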
Abstract:
Brain perfusion can be assessed by CT and MR. For CT, two major techniques are used. The first, xenon CT, is an equilibrium technique based on a freely diffusible tracer; the second, more widely available, tracks the first pass of an iodinated contrast agent injected intravenously. Both methods have proved robust and quantitative, thanks to the linear relationship between contrast concentration and x-ray attenuation. For the CT methods, concerns regarding the x-ray dose delivered to the patient need to be addressed. MR can also assess brain perfusion using the first pass of a gadolinium-based contrast agent injected intravenously. This method has to be considered semi-quantitative because of the nonlinear relationship between contrast concentration and MR signal changes. Arterial spin labeling is another MR method for assessing brain perfusion without injection of contrast: blood flowing in the carotids is magnetically labelled by an external radiofrequency pulse and observed during its first pass through the brain. Each of these CT and MR techniques has advantages and limitations, which will be illustrated and summarized.
Learning objectives:
1. To understand and compare the different techniques for brain perfusion imaging.
2. To learn about the methods of acquisition and post-processing of brain perfusion imaging by first pass of contrast agent for CT and MR.
3. To learn about non-contrast MR methods (arterial spin labelling).
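The linear-versus-nonlinear distinction drawn above can be made concrete. Under the usual models, CT enhancement is used directly as a concentration proxy, while dynamic susceptibility contrast MR converts signal to a relaxation-rate change via a logarithm. This is a minimal sketch; the function names and the TE value are illustrative assumptions.

```python
import numpy as np

TE = 0.030  # echo time in seconds (typical DSC-MR order of magnitude)

def ct_concentration(hu_t, hu_baseline):
    """CT: contrast concentration is directly proportional to the
    attenuation change, so enhancement in HU can be used as-is."""
    return hu_t - hu_baseline

def mr_concentration(signal_t, signal_baseline, te=TE):
    """DSC-MR: concentration is taken proportional to the relaxation-rate
    change dR2* = -ln(S/S0)/TE; this model assumption is why the method
    is semi-quantitative rather than absolute."""
    return -np.log(signal_t / signal_baseline) / te
```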
Abstract:
NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies; and NGOs concerned with labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. The knowledge gaps and needs for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned:
• the potential toxic and safety hazards of nanomaterials throughout their lifecycles;
• the fate and persistence of nanoparticles in humans, animals and the environment;
• the risks associated with nanoparticle exposure;
• greater participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks;
• the development of best-practice guidelines;
• voluntary schemes on responsibility;
• databases of materials, research topics and themes, but also of expertise.
These findings suggested that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Subsequently, NIN organised a workshop focused on building a sustainable multi-stakeholder dialogue, in which specific questions were put to different stakeholder groups to encourage discussion and open communication.
1. What information do stakeholders need from researchers, and why? The discussions of this question confirmed the needs identified in the targeted phone calls.
2. How should information be communicated? While it was agreed that reporting should be enhanced, commercial confidentiality and economic competition were identified as major obstacles. It was recognised that expertise in commercial law and economics is needed for a well-informed treatment of this communication issue.
3. Can engineered nanomaterials be used safely? The idea that nanomaterials are probably safe because some of them have been produced 'for a long time' was questioned, since many materials in common use have proved to be unsafe. The question of safety is also about whether the public has confidence; new legislation like REACH could help with this issue. Hazards do not materialise if exposure can be avoided or at least significantly reduced, so information is needed on what can be regarded as acceptable levels of exposure. Finally, it was noted that there is no such thing as a perfectly safe material, only boundaries, and at this moment we do not know where these boundaries lie. The matter of labelling products containing nanomaterials was raised, as in the public mind safety and labelling are connected. This may need to be addressed, since nanomaterials in food, drink and food packaging may be the first safety issue to attract public and media attention, with an impact on nanotechnology as a whole.
4. Do we need more or different regulation? Any decision-making process should accommodate the changing level of uncertainty. To address the uncertainties, adaptations of frameworks such as REACH may be indicated for nanomaterials. Regulation is often needed even where voluntary measures are welcome, because it mitigates the effects of competition between industries; data cannot be collected on a voluntary basis, for example. NIN will continue an active stakeholder dialogue to further build interdisciplinary relationships towards a healthy future with nanotechnology.
Abstract:
Knockout mice lacking alpha1b noradrenergic receptors were tested in behavioural experiments to probe a possible effect of the absence of this receptor on reaction to novelty and spatial orientation. Reaction to novelty was tested in two experiments. In the first, the mice's latency to exit the first part of a two-compartment set-up was measured; the knockout mice were faster to emerge than their littermate controls. They were then tested in an open field, to which new objects were added on the second trial. In the open field without objects (first trial), the knockout mice showed greater locomotor activity (path length), and the same mice then showed enhanced exploration of the newly introduced objects relative to controls. The spatial orientation experiments were conducted on a homing board and in the water maze. The homing board did not yield a significant difference between the knockout and control mice: both groups were impaired when the proximal (olfactory) and distal (visual) cues were disrupted by rotation of the table. In the water maze, however, the alpha1b(-/-) mice were unable to solve the task (acquisition and retention), whereas the control mice showed good acquisition and retention. The knockout mice's inability to learn to reach the submerged platform was not due to an inability to swim, as they were as good as their control littermates at reaching the platform when it was visible.
Abstract:
On 9 October 1963 a catastrophic landslide suddenly occurred on the southern slope of the Vaiont dam reservoir. A mass of approximately 270 million m³ collapsed into the reservoir, generating a wave that overtopped the dam and hit the town of Longarone and other villages nearby. Several investigations and interpretations of the slope collapse have been carried out over the last 45 years; however, a comprehensive explanation of both the triggering and the dynamics of the phenomenon has yet to be provided. In order to re-evaluate the currently existing information on the slide, an electronic bibliographic database and an ESRI geodatabase have been developed. The chronology of the collected documentation shows that most of the studies re-evaluating the failure mechanisms were conducted in the last decade, as a consequence of recently acquired knowledge, methods and techniques. The current contents of the geodatabase will improve definition of the structural setting that influenced the slide and led to the propagation of the displaced rock mass. The objectives, structure and contents of the e-bibliography and geodatabase are described, together with a brief account of the possible uses of the alphanumeric and spatial contents of the databases.
Abstract:
Arbuscular mycorrhizal fungi (AMF) are important symbionts of plants that improve plant nutrient acquisition and promote plant diversity. Although within-species genetic differences among AMF have been shown to affect plant growth differentially, very little is actually known about the degree of genetic diversity in AMF populations, largely because of difficulties in isolating and cultivating the fungi in a clean system that allows reliable genotyping. A population of the arbuscular mycorrhizal fungus Glomus intraradices growing in an in vitro cultivation system was studied using newly developed simple sequence repeat (SSR), nuclear gene intron and mitochondrial ribosomal gene intron markers. The markers revealed strong differentiation at the nuclear and mitochondrial levels among isolates. Genotypes were nonrandomly distributed among four plots, revealing genetic subdivision in the field; at the same time, identical genotypes were found in geographically distant locations. AMF genotypes showed significant preferences for the different host plant species (Glycine max, Helianthus annuus and Allium porrum) used before establishment of the fungal in vitro cultures. Host plants in a field could provide a heterogeneous environment favouring certain genotypes, and such preferences may partly explain within-population patterns of genetic diversity.
Abstract:
In this paper we present the procedure we followed to develop the Italian SuperSense Tagger. In particular, we adapted the English SuperSense Tagger to Italian by exploiting a parallel sense-labeled corpus for training. As for English, the Italian tagger uses a fixed set of 26 semantic labels, called supersenses, achieving slightly lower accuracy due to the lower quality of the Italian training data. Both taggers accomplish the same task of identifying entities and concepts belonging to a common set of ontological types. This parallelism allows us to define effective methodologies for a broad range of cross-language knowledge acquisition tasks.
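To make the task concrete: supersense tagging is usually framed as sequence labeling over tokens, with BIO-encoded labels drawn from WordNet's supersenses. The toy example below (the Italian sentence, its gold annotation, the predictions and the metric choice are all invented for illustration, not the paper's data) shows the format and a token-accuracy check.

```python
# Tokens from a toy Italian sentence with BIO-encoded supersense labels.
gold = [("Maria",  "B-noun.person"),
        ("lavora", "B-verb.social"),
        ("a",      "O"),
        ("Roma",   "B-noun.location")]

# Hypothetical tagger output for the same tokens.
predicted = ["B-noun.person", "B-verb.social", "O", "B-noun.group"]

# Token-level accuracy, the simplest evaluation for such taggers.
correct = sum(p == g for (_, g), p in zip(gold, predicted))
print(f"accuracy = {correct / len(gold):.2f}")  # 0.75
```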
Abstract:
In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process and those that measure flux through the autophagy pathway (i.e., the complete process); thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from stimuli that result in increased autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
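The central caution here, that autophagosome counts are not flux, can be illustrated with a toy turnover computation in the style of an LC3-II assay with and without a lysosomal inhibitor such as bafilomycin A1 (a standard flux assay; the densitometry numbers below are invented).

```python
def autophagic_flux(lc3_plus_inhibitor, lc3_minus_inhibitor):
    """Crude flux index from an LC3-II turnover assay.

    Blocking lysosomal degradation lets autophagosomes accumulate; the
    difference from the untreated level reflects how much material was
    being delivered to, and degraded within, lysosomes. A high steady
    state with little further accumulation indicates a downstream block,
    not increased autophagy.
    """
    return lc3_plus_inhibitor - lc3_minus_inhibitor

# Invented values: sample A shows more autophagosomes at steady state
# (3.0 vs 1.0) yet lower flux than sample B.
print(autophagic_flux(3.2, 3.0))  # ~0.2: accumulation from a block
print(autophagic_flux(2.5, 1.0))  # 1.5: genuine increase in flux
```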
Abstract:
Although homology is a fundamental concept in biology and one of the shared channels of communication universal to all of biology, it is difficult to find a consensus definition, and interpretations of homology have changed as biology has progressed. New terms, such as paramorphism, have been introduced into the literature with mixed success. In addition, different research fields operate with different definitions of homology; for example, the mechanistic usage of evo-devo is not strictly historical and would not be acceptable in cladistics. This makes a global understanding of homology complex, even as the integration of evolutionary concepts into bioinformatics and genomics becomes increasingly important. We propose an ontology organizing homology and related concepts, and hope this solution will also facilitate the integration and sharing of knowledge within the community.
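As a loose illustration of what such an ontology might look like computationally, here is a toy subsumption hierarchy with a traversal a reasoner might perform. The term definitions are rough paraphrases of common usage and the structure is our invention, not the authors' ontology.

```python
# A toy rendering of a homology ontology as a subsumption hierarchy.
ontology = {
    "homology":     {"parent": None,
                     "definition": "correspondence by common descent"},
    "orthology":    {"parent": "homology",
                     "definition": "homology arising via a speciation event"},
    "paralogy":     {"parent": "homology",
                     "definition": "homology arising via a duplication event"},
    "paramorphism": {"parent": "homology",
                     "definition": "proposed term relating a feature to a "
                                   "duplicated, diverged derivative"},
}

def ancestors(term):
    """Walk up the subsumption hierarchy from a term to the root."""
    while ontology[term]["parent"] is not None:
        term = ontology[term]["parent"]
        yield term

print(list(ancestors("orthology")))  # ['homology']
```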