Abstract:
Strategies to construct the physical map of the Trypanosoma cruzi nuclear genome have to capitalize on three main advantages of the parasite genome, namely (a) its small size, (b) the fact that all chromosomes can be defined, and many of them isolated, by pulsed-field gel electrophoresis, and (c) the fact that simple Southern blots of electrophoretic karyotypes can be used to map sequence-tagged sites and expressed sequence tags to chromosomal bands. A major drawback to cope with is the complexity of T. cruzi genetics, which hinders the construction of a comprehensive genetic map. As a first step towards physical mapping, we report the construction and partial characterization of a T. cruzi CL-Brener genomic library in yeast artificial chromosomes (YACs) consisting of 2,770 individual YACs with a mean insert size of 365 kb, encompassing around 10 genomic equivalents. Two libraries in bacterial artificial chromosomes (BACs), BACI and BACII, have been constructed. Both libraries represent about three genome equivalents. A third BAC library (BAC III) is under construction. YACs and BACs are invaluable tools for physical mapping. More generally, they should be considered a common resource for research on Chagas disease.
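The stated library coverage can be checked with simple arithmetic. A minimal sketch, using only the figures quoted above (the implied haploid genome size is not stated explicitly and is derived here):

```python
# Coverage check for the CL-Brener YAC library figures quoted above.
n_yacs = 2770          # individual YACs
mean_insert_kb = 365   # mean insert size, in kb
genome_equivalents = 10

total_kb = n_yacs * mean_insert_kb               # total cloned DNA: 1,011,050 kb
implied_genome_kb = total_kb / genome_equivalents  # ~101,000 kb, i.e. ~101 Mb

print(f"total cloned DNA: {total_kb / 1e3:.0f} Mb")
print(f"implied genome size: {implied_genome_kb / 1e3:.0f} Mb")
```

The ~1 Gb of cloned DNA divided by the stated 10 genome equivalents implies a genome on the order of 100 Mb, consistent with the "small size" advantage mentioned above.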
Abstract:
A shift from large to small average body sizes was observed in Triatoma infestans and Rhodnius domesticus between field and domestic (or laboratory) conditions of life. It was more pronounced in female specimens, leading to a reduction of sexual size dimorphism. This feature is discussed in terms of the genetic and population changes occurring in the transition from natural to artificial habitats, in particular those related to population densities. Sexual size dimorphism is therefore recommended as a new character for studying species of Triatominae adapting to domestic ecotopes.
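Sexual size dimorphism is commonly summarized as the ratio of female to male mean body size. A minimal sketch of the measure (the numerical values are purely illustrative and not taken from the study):

```python
def ssd_index(mean_female: float, mean_male: float) -> float:
    """Sexual size dimorphism as the female/male ratio of mean body size."""
    return mean_female / mean_male

# Illustrative values only: a drop in the ratio toward 1.0 between field and
# laboratory populations would reflect the reduction in dimorphism reported above.
field = ssd_index(mean_female=28.0, mean_male=25.0)  # 1.12
lab = ssd_index(mean_female=26.0, mean_male=24.5)    # ~1.06
print(round(field, 2), round(lab, 2))
```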
Abstract:
The presence of Triatoma rubrovaria in Brazil has only been confirmed in the States of Paraná and Rio Grande do Sul (RS), where it is found naturally infected with Trypanosoma cruzi. In the wild it occurs in rocky habitats and has an eclectic diet, feeding on cockroaches, reptiles and mammals. Data from the Chagas Disease Control Program obtained by the Fundação Nacional de Saúde between 1975 and 1997 indicate a growing domiciliary and peridomiciliary invasion by T. rubrovaria in RS, where it has become the most frequently captured triatomine species since the control of Triatoma infestans. To monitor this process, we analyzed collection data from 22 years of control campaigns against T. infestans. Collection data for triatomines from domestic habitats show an inverse relationship, with high numbers of T. infestans and low numbers of T. rubrovaria during 1976-1987, compared to the following ten years, 1986-1997, when the number of T. infestans dropped drastically and that of T. rubrovaria increased. There are no consistent indications of intradomiciliary colonization by T. rubrovaria, since only low numbers of nymphs have been captured in intradomiciliary ecotopes. Nevertheless, this species appears to have characteristics that preadapt it to anthropic ecotopes, and it should be kept under constant epidemiological surveillance.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity and the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust, and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
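The core statistical link in such a data integration scheme can be illustrated with a toy collocated co-simulation step: after transforming both variables to normal scores, the hydraulic parameter at a location is drawn from its Gaussian conditional distribution given the geophysical value there. This is a deliberately simplified, single-step sketch of the idea, not the thesis algorithm; the correlation value is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8  # assumed normal-score correlation between electrical and hydraulic conductivity
n = 5000

# Synthetic normal-score geophysical data and a "true" hydraulic field correlated with it
z_geo = rng.standard_normal(n)
z_hyd_true = rho * z_geo + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Conditional simulation: draw z_hyd | z_geo ~ N(rho * z_geo, 1 - rho^2)
z_hyd_sim = rho * z_geo + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# The simulated field reproduces the target correlation with the geophysical data (~0.8)
print(round(np.corrcoef(z_geo, z_hyd_sim)[0, 1], 2))
```

In the actual procedure this conditional draw would additionally honor previously simulated neighbors (sequential simulation) and the downscaled geophysical field; the sketch only shows the collocated Bayesian updating step.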
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
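The gradual-deformation principle behind such a perturbation scheme can be sketched in a few lines: two independent Gaussian realizations are combined as m(θ) = m₁ cos θ + m₂ sin θ, which for any θ yields a field with the same Gaussian statistics, while θ controls the perturbation strength. A minimal sketch of the principle, not the author's implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

def gradual_deformation(m1, m2, theta):
    """Combine two independent N(0,1) realizations; the result is again N(0,1)
    for any theta, since cos^2(theta) + sin^2(theta) = 1."""
    return np.cos(theta) * m1 + np.sin(theta) * m2

m1 = rng.standard_normal(100_000)  # current model realization
m2 = rng.standard_normal(100_000)  # independent proposal realization

small_step = gradual_deformation(m1, m2, theta=0.1)  # gentle perturbation of m1
print(round(small_step.std(), 2))                    # variance is preserved (~1.0)
print(np.corrcoef(m1, small_step)[0, 1])             # still highly correlated with m1
```

Small θ gives proposals close to the current model (high MCMC acceptance), while θ near π/2 replaces it with an independent realization, which is the flexibility in perturbation strength referred to above.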
Abstract:
Minister of State with responsibility for Primary Care, Alex White TD, today (4 June 2014) concluded a series of meetings with the Irish Medical Organisation (IMO) with the signing of the Framework Agreement between the Minister for Health, the HSE and the IMO, setting out a process for engagement concerning the GMS/GP contract and other publicly funded contracts involving General Practitioners (GPs).
Abstract:
Recent advances in signal analysis have endowed EEG with the status of a true brain mapping and brain imaging method capable of providing spatio-temporal information about brain (dys)function. Because of the increasing interest in the temporal dynamics of brain networks, and because of the straightforward compatibility of EEG with other brain imaging techniques, EEG is increasingly used in the neuroimaging community. However, its full capability is highly underestimated. Many combined EEG-fMRI studies use the EEG only as a spike counter or an oscilloscope. Many cognitive and clinical EEG studies still use EEG in its traditional way, analyzing grapho-elements at certain electrodes and latencies. We show here that this way of using the EEG is not only dangerous because it leads to misinterpretations, but also largely ignores the spatial aspects of the signals. In fact, EEG primarily measures the electric potential field at the scalp surface, in the same way that MEG measures the magnetic field. By properly sampling and correctly analyzing this electric field, EEG can provide reliable information about neuronal activity in the brain and the temporal dynamics of this activity in the millisecond range. This review explains some of these analysis methods and illustrates their potential in clinical and experimental applications.
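One simple, reference-free way to exploit the spatial nature of the scalp field, rather than single electrodes, is Global Field Power: the spatial standard deviation of the potential across all electrodes at each time point. A minimal sketch on synthetic data (electrode count and sampling are illustrative):

```python
import numpy as np

def global_field_power(eeg):
    """GFP: spatial standard deviation across electrodes at each time point.

    eeg: array of shape (n_electrodes, n_samples) of scalp potentials.
    """
    avg_ref = eeg - eeg.mean(axis=0, keepdims=True)  # re-reference to the common average
    return avg_ref.std(axis=0)

# Synthetic example: 32 electrodes, 1000 time samples
rng = np.random.default_rng(1)
eeg = rng.standard_normal((32, 1000))
gfp = global_field_power(eeg)
print(gfp.shape)  # one GFP value per time point: (1000,)
```

Because GFP summarizes the whole field, it is independent of the choice of a particular electrode, which is one concrete sense in which spatial analysis avoids the single-electrode pitfalls described above.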
Abstract:
Foodborne disease is a source of increasing morbidity and mortality on the island of Ireland, and it also has an economic impact. As a result of continuing concern about food safety and its implications on an all-island basis, the North/South Ministerial Council established the Food Safety Promotion Board (FSPB) on 2 December 1999. At its Board meeting in February 2000, the FSPB considered the issue of microbiological surveillance and, noting the complexity of the issues, recommended that the key players in foodborne disease surveillance in Northern Ireland and the Republic of Ireland consider ways to improve microbiological surveillance on an all-island basis. To assist in the development of a surveillance strategy for the FSPB, a Functional Meeting Group on Disease Surveillance was convened, which compiled this consultation paper. A series of recommendations are made in the consultation paper, and the key recommendations are summarised below.
Abstract:
In the course of its complex life cycle, the parasite Schistosoma mansoni needs to adapt to distinct environments and is consequently exposed to various DNA-damaging agents. The Schistosoma genome sequencing initiative has uncovered sequences from genes and transcripts related to the process of DNA damage tolerance, such as the enzymes UBC13, MMS2, and RAD6. In the present work, we evaluate the importance of this process in different stages of the life cycle of this parasite. This importance is evidenced by expression and phylogenetic profiles, which show the evolutionary conservation of this pathway from protozoa to mammals.
Abstract:
This paper provides a preliminary formulation of a new currency based on knowledge. Through a literature review of alternative currencies, various properties and benefits are selected that we hope will enable such a currency to be created. Nowadays not only money but also knowledge is necessary to do business. For instance, knowledge about markets and consumers is highly valuable but difficult to acquire, and even more difficult to store, transport or trade. The basic premise of this proposal is a knowledge measurement pattern that is formulated as a new alternative social currency. It is therefore an additional means of contributing to the worldwide evolution of a knowledge society. It is intended as a currency that facilitates the conservation and storage of knowledge, and its organization and categorization, but mainly its exploitation and transfer.
Abstract:
BACKGROUND. Bioinformatics is commonly presented as a well-assorted list of available web resources. Although diversity of services is positive in general, the proliferation of tools, together with their dispersion and heterogeneity, complicates the integrated exploitation of this data-processing capacity. RESULTS. To facilitate the construction of software clients and make integrated use of this variety of tools, we present a modular programmatic application interface (MAPI) that provides the necessary functionality for the uniform representation of Web Service metadata descriptors, including their management and the invocation protocols of the services they represent. This document describes the main functionality of the framework and how it can be used to facilitate the deployment of new software under a unified structure of bioinformatics Web Services. A notable feature of MAPI is the modular organization of functionality into different modules associated with specific tasks. This means that only the modules needed by a client have to be installed, and that module functionality can be extended without rewriting the software client. CONCLUSIONS. The potential utility and versatility of the software library have been demonstrated by the implementation of several currently available clients covering different aspects of integrated data processing, ranging from service discovery to service invocation, with advanced features such as workflow composition and asynchronous service calls to multiple types of Web Services, including those registered in repositories (e.g. GRID-based, SOAP, BioMOBY, R-bioconductor, and others).
Abstract:
To guarantee the success of a virtual library, it is essential that all users can access all library resources regardless of their location.