949 results for "datasets storage and regeneration"
Abstract:
Adrenocortical cell nuclei of the dormouse Muscardinus avellanarius were investigated by electron microscopic immunocytochemistry in hibernating, arousing and euthermic individuals. While the basic structural constituents of the cell nucleus did not change significantly among the three groups, novel structural components were found in nuclei of hibernating dormice. Lattice-like bodies (LBs), clustered granules (CGs), fibrogranular material (FGM) and granules associated with bundles of nucleoplasmic fibrils (NFs) all contained ribonucleoproteins (RNPs), as shown by labeling with anti-snRNP (small nuclear RNP), anti-m3G-capped RNA and anti-hnRNP (heterogeneous nuclear RNP) antibodies. Moreover, the FGM also showed immunoreactivity for the proliferation-associated nuclear antigen (PANA) and the non-snRNP splicing factor SC-35. All these nuclear structural components disappeared early during arousal and were not found in euthermic animals. These novel RNP-containing structures, which have not been observed in other tissues investigated so far in the same animal model, could represent storage and/or processing sites for pre-mRNA during the extreme metabolic condition of hibernation, to be quickly released upon arousal. NFs, which had sometimes been found devoid of associated granules in nuclei of brown adipose tissue from hibernating dormice, were present in much higher amounts in adrenocortical cell nuclei; they do not contain RNPs and their role remains to be elucidated. The possible roles of these structures are discussed in the framework of current knowledge of morpho-functional relationships in the cell nucleus.
Abstract:
Irrigation with treated domestic sewage wastewater (TSE) is an agricultural practice that reduces the water requirements of agroecosystems and the nutrient load on freshwaters, but adverse effects on soil chemical properties (salinization, sodification, etc.) and soil physical properties (alteration of soil porosity and hydraulic conductivity, etc.) have been reported. This study aimed to define relationships among these changes in an Oxisol using multivariate analysis. Corn (Zea mays L.) and sunflower (Helianthus annuus L.) were grown for two years under irrigation with TSE. The following soil properties were determined: Ca2+, Mg2+, Na+, K+ and H + Al contents, cation exchange capacity (CEC), sum of bases (SB), base saturation (V), texture (sand, silt and clay), macro-, micro- and cryptoporosity (V MA, V MI and V CRI), water content at soil saturation (θS) and at field capacity (θFC), residual water content (θR), soil bulk density (d s), water-dispersed clay (WDC) and saturated hydraulic conductivity (K SAT). Factor analysis revealed six principal factors: Fine Porosity (composed of Na+, K+, WDC, θR, θFC and V CRI); Large Porosity (θS, d s, V MA, Vs); Soil CEC (Ca2+, Mg2+, CEC, SB, V); Soil Acidity (H + Al); and Soil Texture (factors 5 and 6). A dual pore structure appears clearly in factors 1 and 2, with an apparent relationship between fine porosity and the monovalent cations Na+ and K+. Irrigation (with potable sodic tap water or sewage wastewater) had a significant effect only on the Fine Porosity and Large Porosity factors, while factors 3 and 4 (Soil CEC and Soil Acidity) were correlated with soil depth. The main conclusion was a shift in pore distribution (from large to fine pores) during irrigation with TSE, which increases water storage and reduces the capacity for drainage of salts.
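For readers unfamiliar with the approach, the sketch below shows how a factor analysis of this kind could be run in Python; the data file, column names, number of samples and the varimax rotation are illustrative assumptions, not details taken from the study.

```python
# Minimal sketch of a factor analysis on soil properties (hypothetical data file
# and column names; six factors chosen to mirror the study design).
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis

# Hypothetical table: one row per soil sample, one column per measured property.
cols = ["Ca", "Mg", "Na", "K", "H_Al", "CEC", "SB", "V",
        "sand", "silt", "clay", "V_MA", "V_MI", "V_CRI",
        "theta_S", "theta_FC", "theta_R", "bulk_density", "WDC", "K_SAT"]
soil = pd.read_csv("soil_properties.csv", usecols=cols)  # hypothetical file

# Standardize so each property contributes on the same scale.
X = StandardScaler().fit_transform(soil)

# Extract six factors with varimax rotation.
fa = FactorAnalysis(n_components=6, rotation="varimax").fit(X)

# Loadings show which properties group together under each factor
# (e.g., whether Na+, K+, WDC and fine porosity load on the same factor).
loadings = pd.DataFrame(fa.components_.T, index=cols,
                        columns=[f"Factor{i + 1}" for i in range(6)])
print(loadings.round(2))
```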
Abstract:
The current research aims to introduce layered double hydroxides (LDHs) as nanomaterials for use in agriculture, with particular reference to their use as a matrix for the storage and slow release of nutrients and agrochemicals for plant growth. The structural characteristics, main properties, synthesis methods and characterization of LDHs are covered in this study. Moreover, literature data are reported to demonstrate their potential for the storage and slow release of nitrate, phosphate and agrochemicals, as well as their use as adsorbents in wastewater treatment. This research aims to expand, in the near future, the field of investigation of these materials for applications in agriculture, strengthening the interface between chemistry and agronomy.
Abstract:
Following recent technological advances, digital image archives have grown qualitatively and quantitatively to an unprecedented extent. Despite the enormous possibilities they offer, these advances raise new questions about how to process the masses of acquired data. That question is at the heart of this Thesis: the problems of processing digital information at very high spatial and/or spectral resolution are addressed with statistical learning approaches, namely kernel methods. The Thesis studies image classification problems, that is, the categorization of pixels into a reduced number of classes reflecting the spectral and contextual properties of the objects they represent. Emphasis is placed on the efficiency of the algorithms as well as on their simplicity, so as to increase their potential for adoption by users. A further challenge of the Thesis is to remain close to the concrete problems of satellite image users without losing sight of the interest of the proposed methods for the machine learning community from which they originate. In this sense, the work is deliberately transdisciplinary, maintaining a strong link between the two fields in all the proposed developments. Four models are proposed: the first addresses the problem of high dimensionality and data redundancy with a model that optimizes classification performance by adapting to the particularities of the image. This is made possible by a ranking of the variables (the spectral bands) that is optimized jointly with the base model: in this way, only the variables important for solving the problem are used by the classifier. The scarcity of labeled information and the uncertainty about its relevance to the problem motivate the next two models, based respectively on active learning and semi-supervised methods: the former improves the quality of a training set through direct interaction between the user and the machine, while the latter uses unlabeled pixels to improve the description of the available data and the robustness of the model. Finally, the last model considers the more theoretical question of structure among the outputs: the integration of this source of information, never previously considered in remote sensing, opens new research challenges. Advanced kernel methods for remote sensing image classification. Devis Tuia, Institut de Géomatique et d'Analyse du Risque, September 2009. Abstract: The technical developments of recent years have brought the quantity and quality of digital information to an unprecedented level, as enormous archives of satellite images are now available to users. However, even if these advances open more and more possibilities in the use of digital imagery, they also raise several problems of storage and processing. The latter is considered in this Thesis: the processing of very high spatial and spectral resolution images is treated with data-driven approaches relying on kernel methods. In particular, the problem of image classification, i.e. the categorization of the image's pixels into a reduced number of classes reflecting spectral and contextual properties, is studied through the different models presented.
The emphasis is on algorithmic efficiency and on the simplicity of the proposed approaches, to avoid overly complex models that would not be adopted in practice. The major challenge of the Thesis is to remain close to concrete remote sensing problems without losing the methodological interest from the machine learning viewpoint: in this sense, the work aims at building a bridge between the machine learning and remote sensing communities, and all the proposed models have been developed with the need for such a synergy in mind. Four models are proposed: first, an adaptive model learning the relevant image features is proposed to solve the problem of high dimensionality and collinearity of the image features. This model automatically provides an accurate classifier and a ranking of the relevance of the individual features. The scarcity and unreliability of labeled information are the common root of the second and third models: when confronted with such problems, the user can either construct the labeled set iteratively by direct interaction with the machine or use the unlabeled data to increase the robustness and quality of the data description. Both solutions have been explored, resulting in two methodological contributions based respectively on active learning and semi-supervised learning. Finally, the more theoretical issue of structured outputs is considered in the last model, which, by integrating output similarity into the model, opens new challenges and opportunities for remote sensing image processing.
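As a concrete illustration of the kernel classification setting that underlies these models, the sketch below trains an RBF-kernel support vector machine on a synthetic hyperspectral cube. It is a minimal stand-in, not the thesis's own implementation; the image size, number of bands, classes and SVM parameters are all assumed.

```python
# Minimal sketch of kernel-based pixel classification (synthetic data stands in
# for a real hyperspectral image).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "image": 100 x 100 pixels, 50 spectral bands, 4 land-cover classes.
h, w, bands, n_classes = 100, 100, 50, 4
labels = rng.integers(0, n_classes, size=(h, w))
class_means = rng.normal(0, 1, size=(n_classes, bands))
cube = class_means[labels] + rng.normal(0, 0.5, size=(h, w, bands))

# Flatten pixels into a (n_samples, n_features) matrix.
X = cube.reshape(-1, bands)
y = labels.reshape(-1)

# Small labeled set, as is typical in remote sensing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=500, stratify=y, random_state=0)

# RBF-kernel SVM: the nonlinear decision function is built in the kernel space.
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train, y_train)
print(f"Overall accuracy on held-out pixels: {clf.score(X_test, y_test):.3f}")
```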
Abstract:
The Complete Arabidopsis Transcriptome Micro Array (CATMA) database contains gene sequence tag (GST) and gene model sequences for over 70% of the predicted genes in the Arabidopsis thaliana genome as well as primer sequences for GST amplification and a wide range of supplementary information. All CATMA GST sequences are specific to the gene for which they were designed, and all gene models were predicted from a complete reannotation of the genome using uniform parameters. The database is searchable by sequence name, sequence homology or direct SQL query, and is available through the CATMA website at http://www.catma.org/.
Abstract:
Thyroid hormones, which play an important role in the development and regeneration of the nervous system, require the presence of specific nuclear T3 receptors (NT3R). In this study we provide evidence that NT3R expression by Schwann cells was up-regulated in response to a loss of axonal contact in vitro and in vivo. In dorsal root ganglia explant cultures, Schwann cells that accompanied axons (nerve fibres) were devoid of NT3R. When Schwann cells were deprived of axonal contact by axon transection, all the nuclei of these cells displayed NT3R immunoreactivity. Similar results were obtained in situ: in adult rat sciatic nerve, Schwann cells that ensheathed healthy axons never expressed NT3R immunoreactivity. After sciatic nerve transection in vivo, the nuclei of Schwann cells deprived of axonal contact displayed a clear NT3R immunoreaction.
Abstract:
Background: In recent years, planaria have emerged as an important model system for research into stem cells and regeneration. Attention is focused on their unique stem cells, the neoblasts, which can differentiate into any cell type present in the adult organism. Sequencing of the Schmidtea mediterranea genome and several expressed sequence tag projects have generated extensive data on the genetic profile of these cells. However, little information is available on their protein dynamics. Results: We developed a proteomic strategy to identify neoblast-specific proteins. Here we describe the method and discuss the results in comparison to the genomic high-throughput analyses carried out in planaria and to proteomic studies using other stem cell systems. We also show functional data for some of the candidate genes selected in our proteomic approach. Conclusions: We have developed an accurate and reliable mass spectrometry-based proteomics approach that complements previous genomic studies and provides a more accurate understanding and description of the molecular and cellular processes related to the neoblasts.
Online teaching of inflammatory skin pathology by a French-speaking international university network
Abstract:
Introduction: Developments in technology, web-based teaching and whole slide imaging have broadened the teaching horizon in anatomic pathology. Creating online learning material that includes many types of media, such as radiologic images, videos, clinical and macroscopic photographs and whole slide images, is now accessible to almost every university. Unfortunately, a major limiting factor in maintaining and updating the learning material is the amount of work, time and resources needed. With this in mind, a French national university network was initiated in 2011 to build shared online pathology teaching modules with clinical cases and tests. This network was extended to the international level in 2012-2014 (Quebec, Switzerland and Ivory Coast). Method: One of the first steps of the international project was to build a learning module on inflammatory skin pathology intended for interns and residents in pathology and dermatology. A pathology resident from Quebec spent 6 weeks in France and Switzerland to develop the content and build the module on an e-learning Moodle platform (http://moodle.sorbonne-paris-cite.fr) under the supervision of two dermatopathologists (BV, MB). The learning module contains text, interactive clinical cases, tests with feedback, whole slide images (WSI), images and clinical photographs. For this module, the virtual slides are decentralized in two universities (Bordeaux and Paris 7). Each university is responsible for its own slide scanning, image storage and online display with virtual slide viewers. Results: The module on inflammatory skin pathology includes more than 50 web pages of original French content, tests and clinical cases, links to over 45 WSI, and more than 50 microscopic and clinical photographs. The whole learning module is currently being revised by four dermatopathologists and two senior pathologists. It will be accessible to interns and residents in spring 2014. The experience and knowledge gained from this work will be transferred to the next international fellowship intern, whose work will be aimed at creating lung and breast pathology learning modules. Conclusion: The challenges of sustaining a project of this scope are numerous. The technical aspects of whole slide imaging and storage need to be handled by each university or group. The content needs to be regularly updated and completed, and its use and existence need to be promoted by the different actors in pathology. Among the great benefits of this kind of project are the international partnerships and connections that have been established between numerous French-speaking universities and pathologists, with the common goals of promoting education in pathology and the use of technology, including whole slide imaging. * The Moodle website is hosted by PRES Sorbonne Paris Cité, and financial support for hardware has been obtained from UNF3S (http://www.unf3s.org/) and PRES Sorbonne Paris Cité. Financial support for international fellowships has been obtained from CFQCU (http://www.cfqcu.org/).
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measurements of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. In the last decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested and ranked, providing a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs offer the user the ability to add their own drug models. 10 computer programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender and weight. Among those applying Bayesian analysis, one uses a non-parametric approach. The top two software tools emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g., in terms of storage or report generation) or less user-friendly. Conclusion: Whereas two integrated programs are at the top of the ranked list, such complex tools would possibly not fit all institutions, and each software tool must be considered with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them over the last years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage and report generation.
Abstract:
Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Methods: The literature and Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested and ranked, providing a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available in some programs. Moreover, 8 programs offer the ability to add new drug models based on population PK data. 10 computer tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated based on population PK models). All of them are able to compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Whereas two software packages are ranked at the top of the list, such complex tools would possibly not fit all institutions, and each program must be considered with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and effort has been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage and automated report generation.
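To make the Bayesian a posteriori step described in these two abstracts concrete, the sketch below estimates individual pharmacokinetic parameters by maximum a posteriori (MAP) fitting of a one-compartment model to a single measured concentration and then proposes an adjusted dose. The priors, residual error model, observation and target are hypothetical values, not taken from any of the benchmarked programs.

```python
# Minimal sketch of Bayesian (MAP) a posteriori dosage adaptation:
# individual clearance and volume are estimated from population priors plus one
# measured concentration (one-compartment IV bolus model; all numbers assumed).
import numpy as np
from scipy.optimize import minimize

# Population priors (log-normal): typical values and between-subject variability.
CL_pop, V_pop = 4.0, 30.0          # L/h, L (assumed)
omega_CL, omega_V = 0.30, 0.20     # SD of log(parameter) (assumed)
sigma = 0.15                       # proportional residual error (assumed)

dose, t_obs, c_obs = 500.0, 8.0, 6.2   # mg, h, mg/L (hypothetical observation)

def neg_log_posterior(theta):
    log_cl, log_v = theta
    cl, v = np.exp(log_cl), np.exp(log_v)
    c_pred = dose / v * np.exp(-cl / v * t_obs)          # 1-compartment IV bolus
    # Weighted residual (proportional error) + prior penalties on log-parameters.
    resid = ((c_obs - c_pred) / (sigma * c_pred)) ** 2
    prior = ((log_cl - np.log(CL_pop)) / omega_CL) ** 2 \
          + ((log_v - np.log(V_pop)) / omega_V) ** 2
    return 0.5 * (resid + prior)

fit = minimize(neg_log_posterior, x0=[np.log(CL_pop), np.log(V_pop)])
cl_i, v_i = np.exp(fit.x)
print(f"Individual CL = {cl_i:.2f} L/h, V = {v_i:.1f} L")

# Dose of a single administration reaching a hypothetical target of 4 mg/L at 12 h.
target, tau = 4.0, 12.0
dose_adj = target * v_i * np.exp(cl_i / v_i * tau)
print(f"Suggested next dose: {dose_adj:.0f} mg")
```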
Abstract:
The first two parts of this article, published previously, presented the methodology as well as the initial findings of the review, covering the period 2009 to 2012, of the literature watch on biological monitoring of occupational exposure to chemicals (SBEPC MT) set up by a multidisciplinary French-speaking network.
Abstract:
Drilled shafts have been used in the US for more than 100 years in bridges and buildings as a deep foundation alternative. For many of these applications, the drilled shafts were designed using the Working Stress Design (WSD) approach. Even though WSD has been used successfully in the past, a move toward Load and Resistance Factor Design (LRFD) for foundation applications began when the Federal Highway Administration (FHWA) issued a policy memorandum on June 28, 2000. The policy memorandum requires all new bridges initiated after October 1, 2007, to be designed according to the LRFD approach. This ensures compatibility between the superstructure and substructure designs and provides a means of consistently incorporating sources of uncertainty into each load and resistance component. Regionally calibrated LRFD resistance factors are permitted by the American Association of State Highway and Transportation Officials (AASHTO) to improve the economy and competitiveness of drilled shafts. To achieve this goal, a database for Drilled SHAft Foundation Testing (DSHAFT) has been developed. DSHAFT is aimed at assimilating high-quality drilled shaft test data from Iowa and the surrounding regions and at identifying the need for further tests in suitable soil profiles. This report introduces DSHAFT and demonstrates its features and capabilities, such as an easy-to-use storage and sharing tool that provides access to key information (e.g., soil classification details and cross-hole sonic logging reports). DSHAFT embodies a model for effective, regional LRFD calibration procedures consistent with the PIle LOad Test (PILOT) database, which contains driven pile load tests accumulated from the state of Iowa. PILOT is now available for broader use at the project website: http://srg.cce.iastate.edu/lrfd/. DSHAFT, available in electronic form at http://srg.cce.iastate.edu/dshaft/, currently comprises 32 separate load tests provided by the Illinois, Iowa, Minnesota, Missouri and Nebraska state departments of transportation and/or departments of roads. In addition to serving as a manual for DSHAFT and providing a summary of the available data, this report provides a preliminary analysis of the load test data from Iowa and will open up opportunities for others to share their data through this quality-assured process, thereby providing a platform to improve the LRFD approach to drilled shafts, especially in the Midwest region.
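As an illustration of the kind of regional calibration such a load-test database enables, the sketch below runs a simple Monte Carlo search for the resistance factor that meets a target reliability index. All load and resistance statistics, load factors and the dead-to-live load ratio are placeholder assumptions, not values derived from DSHAFT.

```python
# Minimal sketch of reliability-based LRFD resistance factor calibration
# by Monte Carlo simulation (all statistics are assumed placeholder values).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 1_000_000

# Resistance bias (measured / predicted capacity), lognormal (assumed stats).
lam_R, cov_R = 1.10, 0.35
# Dead- and live-load biases (assumed stats) and load factors.
lam_D, cov_D, gam_D = 1.05, 0.10, 1.25
lam_L, cov_L, gam_L = 1.15, 0.20, 1.75
DL_ratio = 2.0                 # nominal dead-to-live load ratio (assumed)
beta_target = 3.0              # target reliability index (assumed)

def lognormal(mean, cov, size):
    s = np.sqrt(np.log(1 + cov**2))
    return rng.lognormal(np.log(mean) - 0.5 * s**2, s, size)

def reliability_index(phi):
    # Nominal design satisfies phi*Rn = gam_D*Dn + gam_L*Ln, with Ln = 1.
    Rn = (gam_D * DL_ratio + gam_L * 1.0) / phi
    R = Rn * lognormal(lam_R, cov_R, n)
    Q = DL_ratio * lognormal(lam_D, cov_D, n) + lognormal(lam_L, cov_L, n)
    pf = np.mean(R < Q)              # simulated probability of failure
    return -norm.ppf(pf)             # corresponding reliability index

# Simple search: pick the largest phi whose beta still meets the target.
for phi in np.arange(0.40, 0.76, 0.05):
    print(f"phi = {phi:.2f}  ->  beta ~ {reliability_index(phi):.2f}"
          f"  (target {beta_target})")
```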
Abstract:
PURPOSE: To improve tag persistence throughout the whole cardiac cycle by providing constant tag contrast in all cardiac phases when using balanced steady-state free precession (bSSFP) imaging. MATERIALS AND METHODS: The flip angles of the imaging radiofrequency pulses were optimized to compensate for the fading of the tagging contrast-to-noise ratio (Tag-CNR) at later cardiac phases in bSSFP imaging. Complementary spatial modulation of magnetization (CSPAMM) tagging was implemented to improve the Tag-CNR. Numerical simulations were performed to examine the behavior of the Tag-CNR with the proposed method and to compare the resulting Tag-CNR with that obtained from the more commonly used spoiled gradient echo (SPGR) imaging. A gel phantom and five healthy human volunteers were scanned on a 1.5T scanner using bSSFP imaging with and without the proposed technique. The phantom was also scanned with SPGR imaging. RESULTS: With the proposed technique, the Tag-CNR remained almost constant during the whole cardiac cycle. Using bSSFP imaging, the Tag-CNR was about double that obtained with SPGR. CONCLUSION: Tag persistence was significantly improved when the proposed method was applied, with better Tag-CNR during the diastolic cardiac phase. The improved Tag-CNR will support automated tagging analysis and quantification methods.
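The compensation principle can be illustrated with a much simpler signal model than the one used in the paper: the sketch below derives a ramped flip-angle train that keeps the tagged signal constant under a spoiled-gradient-echo-style approximation with CSPAMM subtraction (ignoring bSSFP steady-state effects). The T1, frame spacing, number of frames and final flip angle are assumed values, not the paper's protocol.

```python
# Simplified illustration of a ramped flip-angle train that keeps the tagged
# signal constant across cardiac phases (SPGR-style approximation with CSPAMM
# subtraction, not the full bSSFP treatment of the paper; parameters assumed).
import numpy as np

T1 = 850.0                      # myocardial T1 in ms (assumed)
dt = 35.0                       # time between cardiac phases in ms (assumed)
n_frames = 25
alpha_max = np.deg2rad(20.0)    # flip angle of the last frame (assumed)
E1 = np.exp(-dt / T1)

# Work backwards: tan(a_n) = E1 * sin(a_{n+1}) keeps Mz(n) * sin(a_n) constant.
alphas = np.zeros(n_frames)
alphas[-1] = alpha_max
for k in range(n_frames - 2, -1, -1):
    alphas[k] = np.arctan(E1 * np.sin(alphas[k + 1]))

# Forward-simulate the tagged magnetization to verify the signal is flat.
Mz, signal = 1.0, []
for a in alphas:
    signal.append(Mz * np.sin(a))       # transverse tagged signal at this frame
    Mz = Mz * np.cos(a) * E1            # remaining tagged longitudinal component
print(np.round(np.degrees(alphas), 1))              # increasing flip-angle ramp
print(np.round(np.array(signal) / signal[0], 3))    # ~1.0 at every frame
```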
Abstract:
The article describes the organization of Redinet (Red Estatal de Bases de Dades Educatives), the process of creating the Catalan node of the network, and the content and structure of its three databases (research, innovation and resources), detailing how they can be accessed and consulted.
Abstract:
This report is the culmination of the work of the Location Referencing System (LRS) team. The team was charged with defining a system that coordinates the collection and storage of, and access to, location referencing information by developing an LRS to be used throughout the Iowa DOT.