923 results for Search Engine Optimization Methods


Relevance: 30.00%

Abstract:

Long-term preservation of bioreporter bacteria is essential for the functioning of cell-based detection devices, particularly when field application, e.g., in developing countries, is intended. We varied the culture conditions (i.e., the NaCl content of the medium), the storage protection media, and the preservation methods (vacuum drying vs. encapsulation in gels that remain hydrated) in order to achieve optimal preservation of the activity of As(III) bioreporter bacteria during up to 12 weeks of storage at 4 °C. The presence of 2% sodium chloride during cultivation improved the response intensity of some bioreporters upon reconstitution, particularly of those that had been dried and stored in the presence of sucrose or trehalose and 10% gelatin. The most satisfactory, stable response to arsenite after 12 weeks of storage was obtained with cells that had been dried in the presence of 34% trehalose and 1.5% polyvinylpyrrolidone. Amendments of peptone, meat extract, sodium ascorbate, and sodium glutamate preserved the bioreporter activity only for the first 2 weeks, but not during long-term storage. Likewise, only short-term stability was achieved when bioreporter bacteria were encapsulated in gels that remained hydrated during storage.

Relevance: 30.00%

Abstract:

Debris accumulation on bridge piers is an ongoing national problem: debris can obstruct the waterway openings at bridges and result in significant erosion of stream banks and scour at abutments and piers. In some cases, the accumulation of debris can adversely affect the operation of the waterway opening or cause failure of the structure. In addition, removal of accumulated debris is difficult, time consuming, and expensive for maintenance programs. This research involved a literature search of publications, products, and pier design recommendations that provide cost-effective methods to mitigate debris accumulation at bridges. In addition, a nationwide survey was conducted to determine the state of the practice, and the results are presented herein.

Relevance: 30.00%

Abstract:

Abstract: This work concerns the development and application of novel unsupervised learning methods, with two target applications in mind: the analysis of forensic case data and the classification of remote sensing images. First, a method based on a symbolic optimization of the inter-sample distance measure is proposed to improve the flexibility of spectral clustering algorithms, and applied to the problem of forensic case data. This distance is optimized using a loss function related to the preservation of neighborhood structure between the input space and the space of principal components, and solutions are found using genetic programming. Results are compared to a variety of state-of-the-art clustering algorithms. Subsequently, a new large-scale clustering method based on a joint optimization of feature extraction and classification is proposed and applied to various databases, including two hyperspectral remote sensing images. The algorithm makes use of a functional model (e.g., a neural network) for clustering, which is trained by stochastic gradient descent. Results indicate that such a technique scales easily to huge databases, avoids the so-called out-of-sample problem, and can compete with or even outperform existing clustering algorithms on both artificial data and real remote sensing images. This is verified on small databases as well as very large problems.
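The second contribution above, a clustering model trained by stochastic gradient descent so that it scales to huge databases and can label unseen samples, can be sketched with a much simpler stand-in: online k-means, i.e., SGD on the k-means loss. This is only an illustration of the training style, not the thesis's neural-network model; the data and hyperparameters are invented.

```python
import numpy as np

def online_kmeans(X, k, lr=0.1, epochs=20, seed=0):
    """Online k-means: stochastic gradient descent on the k-means loss.

    Each sample nudges its nearest centroid, so the model streams over
    huge data sets and can label unseen samples afterwards.
    """
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=k, replace=False)].astype(float)  # init from data
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            j = int(np.argmin(((C - X[i]) ** 2).sum(axis=1)))  # nearest centroid
            C[j] += lr * (X[i] - C[j])                         # SGD step on the loss
    def predict(Z):
        return np.array([int(np.argmin(((C - z) ** 2).sum(axis=1))) for z in Z])
    return C, predict

# Two well-separated synthetic blobs; the model should label them consistently.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.2, (30, 2)), rng.normal(5.0, 0.2, (30, 2))])
centroids, predict = online_kmeans(X, k=2)
labels = predict(X)
```

Because `predict` works on arbitrary new points, an out-of-sample observation gets a label without re-running the clustering, which is the property the abstract emphasises.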

Relevance: 30.00%

Abstract:

BACKGROUND: Small RNAs (sRNAs) are widespread among bacteria and have diverse regulatory roles. Most of these sRNAs have been discovered by a combination of computational and experimental methods. In Pseudomonas aeruginosa, a ubiquitous Gram-negative bacterium and opportunistic human pathogen, the GacS/GacA two-component system positively controls the transcription of two sRNAs (RsmY, RsmZ), which are crucial for the expression of genes involved in virulence. In the biocontrol bacterium Pseudomonas fluorescens CHA0, three GacA-controlled sRNAs (RsmX, RsmY, RsmZ) regulate the response to oxidative stress and the expression of extracellular products including biocontrol factors. RsmX, RsmY and RsmZ contain multiple unpaired GGA motifs and control the expression of target mRNAs at the translational level, by sequestration of translational repressor proteins of the RsmA family. RESULTS: A combined computational and experimental approach enabled us to identify 14 intergenic regions encoding sRNAs in P. aeruginosa. Eight of these regions encode newly identified sRNAs. The intergenic region 1698 was found to specify a novel GacA-controlled sRNA termed RgsA. GacA regulation appeared to be indirect. In P. fluorescens CHA0, an RgsA homolog was also expressed under positive GacA control. This 120-nt sRNA contained a single GGA motif and, unlike RsmX, RsmY and RsmZ, was unable to derepress translation of the hcnA gene (involved in the biosynthesis of the biocontrol factor hydrogen cyanide), but contributed to the bacterium's resistance to hydrogen peroxide. In both P. aeruginosa and P. fluorescens the stress sigma factor RpoS was essential for RgsA expression. CONCLUSION: The discovery of an additional sRNA expressed under GacA control in two Pseudomonas species highlights the complexity of this global regulatory system and suggests that the mode of action of GacA control may be more elaborate than previously suspected. 
Our results also confirm that several GGA motifs are required in an sRNA for sequestration of the RsmA protein.

Relevance: 30.00%

Abstract:

One major methodological problem in analysis of sequence data is the determination of costs from which distances between sequences are derived. Although this problem is currently not optimally dealt with in the social sciences, it has some similarity with problems that have been solved in bioinformatics for three decades. In this article, the authors propose an optimization of substitution and deletion/insertion costs based on computational methods. The authors provide an empirical way of determining costs for cases, frequent in the social sciences, in which theory does not clearly promote one cost scheme over another. Using three distinct data sets, the authors tested the distances and cluster solutions produced by the new cost scheme in comparison with solutions based on cost schemes associated with other research strategies. The proposed method performs well compared with other cost-setting strategies, while it alleviates the justification problem of cost schemes.

Relevance: 30.00%

Abstract:

Purpose: Many retinal degenerations result from defective retina-specific gene expression. It is therefore important to understand how the expression of a photoreceptor-specific gene is regulated in vivo in order to achieve successful gene therapy. The present study aims to design an AAV2/8 vector that can regulate the transcript level in a physiological manner to replace the missing PDE6b in Rd1 and Rd10 mice. In a previous study (Ogieta et al., 2000), the short 5' flanking sequence of the human PDE6b gene (350 bp) was shown to be photoreceptor-specific in transgenic mice. However, the efficiency and specificity of the 5' flanking region of human PDE6b had not been investigated in the context of gene therapy during retinal degeneration. In this study, two different sequences of the 5' flanking region of the human PDE6b gene were studied as promoter elements, and their expression will be tested in wild-type and diseased (Rd10) retinas. Methods: Two 5' flanking fragments of the human PDE6b gene, -93 to +53 (150 bp) and -297 to +53 (350 bp), were cloned into different plasmids in order to check their expression in vitro and in vivo by constructing an AAV2/8 vector. These elements drove the activity of either luciferase (pGL3 plasmids) or EGFP. jetPEI transfection into Y-79 cells was used to evaluate gene expression through luciferase activity. Constructs encoding EGFP under the control of the two promoters were inserted into AAV2.1-93 (or -297)-EGFP plasmids to produce AAV2/8 vectors. Results: When pGL3-93 (150 bp) or pGL3-297 (350 bp) was transfected into Y-79 cells, the smaller fragment (150 bp) showed higher gene expression than the 350 bp element and the SV40 control, as previously reported. The 350 bp fragment drove expression levels similar to those of the SV40 promoter. In view of these results, the fragments (150 bp or 350 bp) were integrated into the AAV2.1-EGFP plasmid to produce AAV2/8 vectors, and we are currently evaluating the efficiency and specificity of the produced constructs in vivo in normal and diseased retinas. Conclusions: Comparisons of these vectors with vectors bearing ubiquitous promoters should reveal which construct is the most suitable to drive efficient and specific gene expression in diseased retinas in order to restore normal function in the long term.

Relevance: 30.00%

Abstract:

A headspace solid-phase microextraction (HS-SPME) procedure was developed for profiling the traces present in 3,4-methylenedioxymethamphetamine (MDMA). Traces were first extracted by HS-SPME and then analyzed by gas chromatography-mass spectrometry (GC-MS). The HS-SPME conditions were systematically optimized. Optimal results were obtained when 40 mg of crushed MDMA sample was heated at 80 °C for 15 min, followed by extraction at 80 °C for 15 min with a polydimethylsiloxane/divinylbenzene-coated fibre. A total of 31 compounds were identified as traces related to MDMA synthesis, namely precursors, intermediates, or by-products. In addition, some fatty acids used as tabletting materials, and caffeine used as an adulterant, were also detected. A restricted set of 10 target compounds was also proposed for developing a screening tool to cluster samples with similar profiles. A total of 114 seizures were analyzed using an SPME autosampler (MultiPurpose Sampler MPS2), purchased from Gerstel GmbH & Co. (Germany) and coupled to the GC-MS. The data were handled using various pre-treatment methods, followed by the study of similarities between sample pairs based on the Pearson correlation. The results show that HS-SPME, coupled with a suitable statistical method, is a powerful tool for distinguishing between specimens coming from the same seizure and specimens coming from different seizures. This information can be used by law enforcement personnel to map the ecstasy distribution network as well as clandestine tablet manufacturing.
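The comparison step (pre-treat each target-compound profile, then score sample pairs with the Pearson correlation) can be sketched as follows. The square-root pre-treatment is one of several options such studies compare, and the peak areas below are invented for illustration, not taken from the paper.

```python
import math

def pearson(x, y):
    """Pearson correlation between two impurity profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def pretreat(profile):
    """One possible pre-treatment: normalise to relative peak areas, then
    take square roots to damp the influence of dominant peaks."""
    total = sum(profile)
    return [math.sqrt(p / total) for p in profile]

# Invented peak areas for the 10 target compounds in three tablets;
# the first two mimic tablets from the same production batch.
a = [120, 40, 5, 300, 22, 0, 15, 80, 9, 60]
b = [118, 44, 6, 290, 20, 0, 14, 85, 10, 55]
c = [10, 200, 90, 5, 150, 40, 0, 3, 70, 2]

sim_ab = pearson(pretreat(a), pretreat(b))
sim_ac = pearson(pretreat(a), pretreat(c))
```

A high correlation (here for the pair a/b) supports a common origin, while a low or negative one (a/c) suggests different seizures; thresholding such scores yields the clusters used to map distribution networks.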

Relevance: 30.00%

Abstract:

The pharmaceutical industry has been facing several challenges in recent years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques contribute actively to this optimization, especially when complemented by computational approaches aimed at rationalizing the enormous amount of information they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented: EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and in contrast to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root-mean-square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed and led to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
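The evolutionary core of such a docking search can be caricatured in a few lines. The sketch below keeps only mutation and elitist selection on a synthetic quadratic "energy funnel"; EADock itself is a hybrid evolutionary algorithm with two fitness functions, diversity management, and CHARMM energies over real ligand poses, none of which is reproduced here.

```python
import random

def evolve(energy, bounds, pop_size=30, gens=60, sigma=0.3, seed=2):
    """Bare (mu + lambda) evolutionary search over a low-dimensional 'pose'.

    Mutate every individual, score parents and children together, and
    keep the best pop_size candidates (elitist selection).
    """
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        children = [[min(max(x + rng.gauss(0, sigma), lo), hi)   # Gaussian mutation
                     for x, (lo, hi) in zip(ind, bounds)]
                    for ind in pop]
        pop = sorted(pop + children, key=energy)[:pop_size]      # elitist selection
    best = min(pop, key=energy)
    return best, energy(best)

# Synthetic energy surface with its minimum at (1, 2, -1), a stand-in
# for scoring a ligand pose inside a binding pocket.
def funnel(p):
    return sum((x - t) ** 2 for x, t in zip(p, (1.0, 2.0, -1.0)))

best, e = evolve(funnel, bounds=[(-5.0, 5.0)] * 3)
```

In a real docking engine the individuals would encode translations, rotations, and torsions, and the scoring function would be a molecular-mechanics energy rather than this toy quadratic.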

Relevance: 30.00%

Abstract:

Mixture materials, mix design, and pavement construction are not isolated steps in the concrete paving process. Each affects the other in ways that determine overall pavement quality and long-term performance. However, equipment and procedures commonly used to test concrete materials and concrete pavements have not changed in decades, leaving gaps in our ability to understand and control the factors that determine concrete durability. The concrete paving community needs tests that will adequately characterize the materials, predict interactions, and monitor the properties of the concrete. The overall objectives of this study are (1) to evaluate conventional and new methods for testing concrete and concrete materials to prevent material and construction problems that could lead to premature concrete pavement distress and (2) to examine and refine a suite of tests that can accurately evaluate concrete pavement properties. The project included three phases. In Phase I, the research team contacted each of 16 participating states to gather information about concrete and concrete material tests. A preliminary suite of tests to ensure long-term pavement performance was developed. The tests were selected to provide useful and easy-to-interpret results that can be performed reasonably and routinely in terms of time, expertise, training, and cost. The tests examine concrete pavement properties in five focal areas critical to the long life and durability of concrete pavements: (1) workability, (2) strength development, (3) air system, (4) permeability, and (5) shrinkage. The tests were relevant at three stages in the concrete paving process: mix design, preconstruction verification, and construction quality control. In Phase II, the research team conducted field testing in each participating state to evaluate the preliminary suite of tests and demonstrate the testing technologies and procedures using local materials. 
A Mobile Concrete Research Lab was designed and equipped to facilitate the demonstrations. This report documents the results of the 16 state projects. Phase III refined and finalized lab and field tests based on state project test data. The results of the overall project are detailed herein. The final suite of tests is detailed in the accompanying testing guide.

Relevance: 30.00%

Abstract:

As modern molecular biology moves towards the analysis of biological systems, as opposed to their individual components, the need for appropriate mathematical and computational techniques for understanding the dynamics and structure of such systems is becoming more pressing. For example, the modeling of biochemical systems using ordinary differential equations (ODEs) based on high-throughput, time-dense profiles is becoming more commonplace, necessitating the development of improved techniques to estimate model parameters from such data. Due to the high dimensionality of this estimation problem, straightforward optimization strategies rarely produce correct parameter values, and hence current methods tend to rely on genetic or evolutionary algorithms to perform the non-linear parameter fitting. Here, we describe a completely deterministic approach based on interval analysis. This allows us to examine entire sets of parameters, and thus to exhaust the global search within a finite number of steps. In particular, we show how our method may be applied to a generic class of ODEs used for modeling biochemical systems, called Generalized Mass Action models (GMAs). In addition, we show that for GMAs our method is amenable to the interval-arithmetic technique called constraint propagation, which greatly improves its efficiency. To illustrate the applicability of our method, we apply it to several networks of biochemical reactions from the literature, showing in particular that, in addition to estimating system parameters in the absence of noise, our method may also be used to recover the topology of these networks.
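The interval idea (examine entire sets of parameters and exhaust the global search in finitely many steps) can be illustrated on the simplest possible model, dx/dt = k·x, with invented measurement brackets. Real GMA models involve products of power laws and many parameters, but the prune-and-bisect logic sketched here is the same.

```python
def branch_and_prune(k_box, data, tol=1e-3):
    """Deterministic interval search for the rate constant of dx/dt = k*x.

    A candidate interval [k_lo, k_hi] is kept only if, for every
    observation, the predicted derivative interval [k_lo*x, k_hi*x]
    intersects the measured bracket; consistent boxes are bisected until
    narrower than tol, exhausting the search space in finitely many steps.
    """
    stack, accepted = [k_box], []
    while stack:
        lo, hi = stack.pop()
        consistent = True
        for x, (dx_lo, dx_hi) in data:
            pred_lo, pred_hi = sorted((lo * x, hi * x))  # interval product (x >= 0 here)
            if pred_hi < dx_lo or pred_lo > dx_hi:       # empty intersection: prune
                consistent = False
                break
        if not consistent:
            continue
        if hi - lo < tol:
            accepted.append((lo, hi))
        else:
            mid = (lo + hi) / 2.0
            stack += [(lo, mid), (mid, hi)]
    return accepted

# Brackets on dx/dt generated from k = 0.5 with +/-0.02 measurement bounds.
data = [(1.0, (0.48, 0.52)), (2.0, (0.97, 1.03)), (4.0, (1.96, 2.04))]
boxes = branch_and_prune((0.0, 2.0), data)
k_lo = min(b[0] for b in boxes)
k_hi = max(b[1] for b in boxes)
```

Unlike a stochastic search, the returned boxes provably bracket every consistent value of k; constraint propagation would tighten each box before bisecting, cutting the number of boxes explored.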

Relevance: 30.00%

Abstract:

BACKGROUND: Health professionals and policymakers aspire to make healthcare decisions based on all relevant research evidence. This can rarely be achieved, however, because a considerable proportion of research findings are not published, especially in the case of 'negative' results - a phenomenon widely recognized as publication bias. Different methods of detecting, quantifying, and adjusting for publication bias in meta-analyses have been described in the literature, such as graphical approaches and formal statistical tests to detect publication bias, and statistical approaches to modify effect sizes to adjust a pooled estimate when the presence of publication bias is suspected. An up-to-date systematic review of the existing methods is lacking. METHODS/DESIGN: The objectives of this systematic review are as follows:

• To systematically review methodological articles which focus on non-publication of studies and to describe methods of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses.

• To appraise the strengths and weaknesses of the methods, the resources they require, and the conditions under which they could be used, based on the findings of the included studies.

We will systematically search Web of Science, Medline, and the Cochrane Library for methodological articles that describe at least one method of detecting and/or quantifying and/or adjusting for publication bias in meta-analyses. A dedicated data extraction form has been developed and pilot-tested. Working in teams of two, we will independently extract relevant information from each eligible article. As this will be a qualitative systematic review, data reporting will involve a descriptive summary. DISCUSSION: Results are expected to be publicly available in mid-2013.
This systematic review together with the results of other systematic reviews of the OPEN project (To Overcome Failure to Publish Negative Findings) will serve as a basis for the development of future policies and guidelines regarding the assessment and handling of publication bias in meta-analyses.
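As an example of the formal statistical tests such a review would cover, Egger's regression test for funnel-plot asymmetry can be sketched in a few lines: regress the standardised effect on precision, and read asymmetry off the intercept. The meta-analysis data below are invented, and the bias is injected as a simple small-study inflation.

```python
import numpy as np

def egger_intercept(effects, ses):
    """Egger's regression asymmetry test (sketch): regress the standardised
    effect (effect / SE) on precision (1 / SE). The intercept estimates
    funnel-plot asymmetry and is near 0 in the absence of small-study effects.
    """
    z = np.asarray(effects) / np.asarray(ses)   # standardised effects
    prec = 1.0 / np.asarray(ses)                # precisions
    X = np.column_stack([np.ones_like(prec), prec])
    (intercept, slope), *_ = np.linalg.lstsq(X, z, rcond=None)
    return intercept, slope

# Invented meta-analysis: 40 studies with a true effect of 0.3.
rng = np.random.default_rng(0)
ses = rng.uniform(0.05, 0.5, 40)
effects_sym = 0.3 + rng.normal(0.0, 1.0, 40) * ses   # no publication bias
effects_bias = effects_sym + 2.0 * ses               # small studies inflated
b0_sym, _ = egger_intercept(effects_sym, ses)
b0_bias, _ = egger_intercept(effects_bias, ses)
```

In practice the intercept is reported with a significance test; a clearly non-zero intercept (as in the biased version) is taken as evidence of small-study effects such as publication bias.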

Relevance: 30.00%

Abstract:

The purpose of this work was to test a measurement method for the water vapour tightness of packages that was already in use at the research centre, together with a method developed for the research centre in this work. The results obtained were compared with each other and with values measured from the packaging material. Food packages were also studied with humidity sensors, a shelf-life test, and transport simulation, and optimization was used to study the effect of package shape on water vapour tightness. The method developed for measuring the water vapour transmission of packages worked well and its repeatability was good. Compared with the existing method, the new method was faster and required less working time, but both methods gave good values for parallel samples. The humidity sensors made it possible to follow changes in the humidity inside an empty package during storage. The shelf-life test was carried out with breakfast cereals, and the best water vapour barrier was provided by packages with an aluminium-laminate or metallized-OPP layer. In the first transport test the packages were filled with cereals and in the second with noodles; the transport simulation had no effect on the integrity of the inner surfaces of the packages and thus none on their water vapour tightness. The optimization compared the volume-to-surface-area ratios of packages of different shapes and the dependence of water vapour tightness on surface area. The optimal package proved to be a sphere, which had the smallest surface area, tolerated the highest water vapour transmission through the material, and required the least water vapour barrier material.
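The shape-optimization result, that a sphere minimizes surface area (and thus barrier material) for a given volume, is easy to verify numerically. The one-litre volume and the shapes compared below are chosen for illustration only.

```python
import math

def surface_area(shape, volume):
    """Surface area (cm^2) of a package holding `volume` cm^3."""
    if shape == "sphere":
        r = (3.0 * volume / (4.0 * math.pi)) ** (1.0 / 3.0)
        return 4.0 * math.pi * r ** 2
    if shape == "cylinder":  # height = diameter, the best-case cylinder
        r = (volume / (2.0 * math.pi)) ** (1.0 / 3.0)
        return 6.0 * math.pi * r ** 2
    if shape == "cube":
        a = volume ** (1.0 / 3.0)
        return 6.0 * a ** 2
    raise ValueError(shape)

# A one-litre package: the sphere needs the least barrier material.
V = 1000.0
areas = {s: surface_area(s, V) for s in ("sphere", "cylinder", "cube")}
```

For 1000 cm³ the sphere needs about 484 cm², the optimal cylinder about 554 cm², and the cube 600 cm², which is why the sphere can tolerate the highest water vapour transmission per unit area for the same total ingress.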

Relevance: 30.00%

Abstract:

In this thesis, the cleaning of ceramic filter media was studied. The literature part examines mechanisms of fouling and dissolution of iron compounds, as well as methods for cleaning ceramic membranes fouled by iron deposits. Cleaning agents and different methods were examined more closely in the experimental part of the thesis. Pyrite is found in geologic strata. It is oxidized to form ferrous ions Fe(II) and ferric ions Fe(III); Fe(III) then hydrolyzes to form ferric hydroxide. Hematite and goethite, for instance, are naturally occurring iron oxides and hydroxides. In contact with filter media, they can cause severe fouling, which common cleaning techniques are not competent enough to remove. Mechanisms for the dissolution of iron oxides include the ligand-promoted pathway and the proton-promoted pathway. The dissolution can also be reductive or non-reductive. The most efficient mechanism is the ligand-promoted reductive mechanism, which comprises two stages: the induction period and the autocatalytic dissolution. Reducing agents (such as hydroquinone and hydroxylamine hydrochloride), chelating agents (such as EDTA), and organic acids are used for the removal of iron compounds. Oxalic acid is the most effective known cleaning agent for iron deposits. Since formulations are often more effective than organic acids, reducing agents, or chelating agents alone, the citrate-bicarbonate-dithionite system, among others, is well studied in the literature. Cleaning is also enhanced with ultrasound and backpulsing. In the experimental part, oxalic acid and nitric acid were studied alone and in combination. Citric acid and ascorbic acid, among other chemicals, were also tested. Soaking experiments, experiments with ultrasound, and experiments on alternative methods of applying the cleaning solution to the filter samples were carried out. Permeability and ISO brightness measurements were performed to examine the influence of the cleaning methods on the samples.
Inductively coupled plasma optical emission spectroscopy (ICP-OES) analysis of the solutions was carried out to determine the dissolved metals.

Relevance: 30.00%

Abstract:

Purpose: The purpose of our multidisciplinary study was to define a pragmatic and secure alternative to the creation of a national centralised medical record, which could gather together the different parts of the medical record of a patient scattered across the different hospitals where the patient was hospitalised, without any risk of breaching confidentiality. Methods: We first analyse the reasons for the failure and the dangers of centralisation (i.e., the difficulty of defining a European patient identifier, of reaching a common standard for the contents of the medical record, and of ensuring data protection) and then propose an alternative that uses the existing available data, on the basis that setting up a safe though imperfect system could be better than continuing the quest for a mythical perfect information system that has still not been found after a search lasting two decades. Results: We describe the functioning of Medical Record Search Engines (MRSEs), which use pseudonymisation of the patient's identity. The MRSE will be able to retrieve and provide, upon a physician's request, all the available information concerning a patient who has been hospitalised in different hospitals, without ever having access to the patient's identity. The drawback of this system is that the medical practitioner then has to read all of the information, create his or her own synthesis, and possibly discard extraneous data. Conclusions: Faced with the difficulties and risks of setting up a centralised medical record system, a system that gathers all of the available information concerning a patient could be of great interest. This low-cost, pragmatic alternative, which could be developed quickly, should be taken into consideration by the health authorities.
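The pseudonymisation step that such an MRSE relies on can be sketched with a keyed hash: the same patient always maps to the same pseudonym, so records from different hospitals can be linked, while the identity cannot be recovered without the key. The paper does not prescribe a specific construction; HMAC-SHA-256 and the identity format below are assumptions for illustration.

```python
import hashlib
import hmac

def pseudonymise(identity: str, secret_key: bytes) -> str:
    """Keyed-hash pseudonym: deterministic for a given identity and key,
    so an MRSE can link records across hospitals without ever seeing the
    identity itself.
    """
    return hmac.new(secret_key, identity.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# The key would be held by a trusted party, never by the search engine itself.
key = b"demo-secret-key"
p1 = pseudonymise("DOE^JOHN^1970-01-01", key)
p2 = pseudonymise("DOE^JOHN^1970-01-01", key)
p3 = pseudonymise("ROE^JANE^1985-06-15", key)
```

Keeping the key with a trusted third party means that even a compromised MRSE index exposes only opaque pseudonyms, which is the confidentiality property the abstract argues for.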