31 results for RDF, Named Graphs, Provenance, Semantic Web, Semantics
Abstract:
Most life science processes involve, at the atomic scale, recognition between two molecules. The prediction of such interactions at the molecular level, by so-called docking software, is a non-trivial task. Docking programs have a wide range of applications, from protein engineering to drug design. This article presents SwissDock, a web server dedicated to the docking of small molecules on target proteins. It is based on the EADock DSS engine, combined with setup scripts that curate common problems and prepare both the target protein and the ligand input files. An efficient Ajax/HTML interface was designed and implemented so that scientists can easily submit docking runs and retrieve the predicted complexes. For automated docking tasks, a programmatic SOAP interface has been set up, and template programs can be downloaded in Perl, Python and PHP. The website also provides access to a database of manually curated complexes based on the Ligand Protein Database. A wiki and a forum are available to promote interactions among users. The SwissDock website is available online at http://www.swissdock.ch. We believe it constitutes a step toward generalizing the use of docking tools beyond the traditional molecular modeling community.
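A programmatic SOAP interface like the one described above is driven by XML request envelopes. The following minimal sketch only builds a SOAP 1.1 envelope with the Python standard library; the operation name `submitDocking`, its parameters, and the `urn:swissdock` namespace are hypothetical illustrations, not taken from the actual SwissDock WSDL:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_envelope(operation, params, ns="urn:swissdock"):
    """Build a minimal SOAP 1.1 request envelope as a string."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    # The operation element lives in the (hypothetical) service namespace.
    op = ET.SubElement(body, f"{{{ns}}}{operation}")
    for name, value in params.items():
        child = ET.SubElement(op, name)
        child.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# Hypothetical docking submission: a target PDB id plus a ligand file name.
request = build_soap_envelope("submitDocking",
                              {"target": "1ABC", "ligand": "ligand.mol2"})
```

In practice one would POST such an envelope to the service endpoint (or, more simply, use the Perl/Python/PHP templates the server provides); this sketch stops at constructing the request.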
Abstract:
The Internet is increasingly used as a source of information on health issues and is probably a major driver of patient empowerment. This process is, however, limited by the frequently poor quality of web-based health information designed for consumers. Better diffusion of information about the criteria that define the quality of website content, and about useful methods for searching for such information, could be particularly valuable to patients and their relatives. A brief, six-item DISCERN version, characterized by a high specificity for detecting websites with good or very good content quality, was recently developed. This tool could facilitate the identification of high-quality information on the web by patients and may improve the empowerment process initiated by the development of the health-related web.
Abstract:
QUESTIONS UNDER STUDY: Our aim was to identify the barriers young men face in consulting a health professional when they encounter sexual dysfunctions, and where, if anywhere, they turn for answers. METHODS: We conducted exploratory qualitative research with 12 young men aged 16-20 years, seen in two focus groups. Discussions were triggered by vignettes about sexual dysfunction. RESULTS: Young men preferred not to talk about sexual dysfunction with anyone and to solve such problems alone, as the subject is considered intimate and embarrassing and can negatively impact their masculinity. Confidentiality appeared to be the most important criterion for disclosing an intimate subject to a health professional. Participants raised the problems of men's access to services and their lack of reasons to consult. Two criteria for deciding to address a problem were whether it was long-lasting or perceived as physical. The Internet was unanimously considered an initial way to solve a problem, which could guide them to a face-to-face consultation if necessary. CONCLUSIONS: The results suggest that Internet-based tools should be developed as an easily accessible door to sexual health services for young men. Wherever they consult and for whatever problem, sexual health must be on the agenda.
Abstract:
BACKGROUND: The Internet is increasingly used as a source of information for mental health issues. The burden of obsessive compulsive disorder (OCD) may lead persons with diagnosed or undiagnosed OCD, and their relatives, to search for good quality information on the Web. This study aimed to evaluate the quality of Web-based information on English-language sites dealing with OCD and to compare the quality of websites found through a general and a medically specialized search engine. METHODS: Keywords related to OCD were entered into Google and OmniMedicalSearch. Websites were assessed on the basis of accountability, interactivity, readability, and content quality. The "Health on the Net" (HON) quality label and the Brief DISCERN scale score were used as possible content quality indicators. Of the 235 links identified, 53 websites were analyzed. RESULTS: The content quality of the OCD websites examined was relatively good. The use of a specialized search engine did not offer an advantage in finding websites with better content quality. A score ≥16 on the Brief DISCERN scale is associated with better content quality. CONCLUSION: This study shows the acceptability of the content quality of OCD websites. There is no advantage in searching for information with a specialized search engine rather than a general one. PRACTICAL IMPLICATIONS: The Internet offers a number of high quality OCD websites. It remains critical, however, to have a provider-patient talk about the information found on the Web.
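Scoring against the Brief DISCERN cutoff used in this study can be illustrated with a small sketch. The ≥16 threshold is the one reported above; the assumption that each of the six items is rated 1-5 (giving totals of 6-30) is carried over from the full DISCERN instrument and should be checked against the scale's own documentation:

```python
def brief_discern_quality(item_ratings, cutoff=16):
    """Sum six Brief DISCERN item ratings and flag likely good content.

    Assumes each item is rated on a 1-5 scale (as in the full DISCERN
    instrument); the >=16 cutoff is the threshold reported for this scale.
    Returns (total, meets_cutoff).
    """
    if len(item_ratings) != 6:
        raise ValueError("Brief DISCERN has exactly six items")
    if not all(1 <= r <= 5 for r in item_ratings):
        raise ValueError("each item is rated on a 1-5 scale")
    total = sum(item_ratings)
    return total, total >= cutoff

# A website rated mostly 3s scores 17 and clears the cutoff.
total, good = brief_discern_quality([3, 3, 2, 3, 3, 3])
```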
Archiving the Web on migrations: what technical and scientific approaches? Progress report
Abstract:
A 57-year-old male with no family history was diagnosed with semantic dementia. He also showed some unusual cognitive features, such as episodic memory and executive dysfunction, spatial disorientation, and dyscalculia. Rapidly progressive cognitive and physical decline occurred. About 1.5 years later, he developed clinical features of corticobasal syndrome. He died at the age of 60. Brain autopsy revealed numerous 4R-tau-positive lesions in the frontal, parietal, and temporal lobes, basal ganglia, and brainstem. Neuronal loss was severe in the temporal cortex. Such an association of semantic dementia with tauopathy and corticobasal syndrome is highly unusual. These findings are discussed in the light of current knowledge about frontotemporal lobar degeneration.
Abstract:
The international Functional Annotation Of the Mammalian Genomes 4 (FANTOM4) research collaboration set out to better understand the transcriptional network that regulates macrophage differentiation and to uncover novel components of the transcriptome, employing a series of high-throughput experiments. The primary and distinctive technique is cap analysis of gene expression (CAGE), which sequences mRNA 5'-ends with a second-generation sequencer to quantify promoter activities even in the absence of gene annotation. Additional genome-wide experiments complement the setup, including short RNA sequencing, microarray gene expression profiling of large-scale perturbation experiments, and ChIP-chip for epigenetic marks and transcription factors. All experiments are performed over a differentiation time course of the THP-1 human leukemic cell line. Furthermore, we performed a large-scale mammalian two-hybrid (M2H) assay between transcription factors and monitored their expression profiles across human and mouse tissues with qRT-PCR to address combinatorial effects of regulation by transcription factors. These interdependent data have been analyzed individually and in combination, and are published in related but distinct papers. We provide all data, together with systematic annotation, in an integrated view as a resource for the scientific community (http://fantom.gsc.riken.jp/4/). Additionally, we assembled a rich set of derived analysis results, including published predicted and validated regulatory interactions. Here we introduce the resource and its update after the initial release.
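At its core, quantifying promoter activity from CAGE data means counting mapped mRNA 5'-end tag positions per promoter region. The following toy sketch shows the idea; the region names, coordinates, and tag positions are made up for illustration and are not FANTOM4 data:

```python
from collections import Counter

def promoter_activity(tag_positions, regions):
    """Count CAGE tag 5'-end positions falling in each promoter region.

    tag_positions: iterable of genomic coordinates of mapped 5' ends.
    regions: dict mapping promoter name -> (start, end) half-open interval.
    """
    counts = Counter()
    for pos in tag_positions:
        for name, (start, end) in regions.items():
            if start <= pos < end:
                counts[name] += 1
    return dict(counts)

# Toy data: two hypothetical promoters and a handful of tag positions.
regions = {"promA": (100, 150), "promB": (300, 360)}
tags = [101, 120, 149, 150, 305, 310, 310, 500]
activity = promoter_activity(tags, regions)
```

Position 150 falls outside `promA` because the intervals are half-open, and 500 maps to no promoter; real CAGE pipelines additionally handle strand, normalization to tags per million, and clustering of nearby 5' ends.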
Abstract:
Bioactive small molecules, such as drugs or metabolites, bind to proteins or other macromolecular targets to modulate their activity, which in turn results in the observed phenotypic effects. For this reason, mapping the targets of bioactive small molecules is a key step toward unraveling the molecular mechanisms underlying their bioactivity and predicting potential side effects or cross-reactivity. Recently, large datasets of protein-small molecule interactions have become available, providing a unique source of information for the development of knowledge-based approaches that computationally identify new targets for uncharacterized molecules or secondary targets for known molecules. Here, we introduce SwissTargetPrediction, a web server that accurately predicts the targets of bioactive molecules based on a combination of 2D and 3D similarity measures with known ligands. Predictions can be carried out in five different organisms, and mapping predictions by homology within and between species is enabled for close paralogs and orthologs. SwissTargetPrediction is accessible free of charge and without login at http://www.swisstargetprediction.ch.
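Knowledge-based target prediction of this kind rests on chemical similarity between a query molecule and the known ligands of each target. A minimal sketch of the 2D side of the idea, using Tanimoto similarity on binary fingerprints represented as bit-index sets, is shown below; the fingerprints and target names are purely illustrative, and the server's actual combination of 2D and 3D measures is not reproduced here:

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two fingerprints given as bit-index sets."""
    if not fp_a and not fp_b:
        return 0.0
    inter = len(fp_a & fp_b)
    return inter / (len(fp_a) + len(fp_b) - inter)

def rank_targets(query_fp, known_ligands):
    """Rank targets by the best similarity of the query to their known ligands.

    known_ligands: dict mapping target name -> list of fingerprint bit-sets.
    """
    scores = {t: max(tanimoto(query_fp, fp) for fp in fps)
              for t, fps in known_ligands.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy fingerprints: targetA's ligand shares most bits with the query.
ligands = {"targetA": [{1, 2, 3, 4}], "targetB": [{1, 9}, {8, 9}]}
ranking = rank_targets({1, 2, 3}, ligands)
```

Real pipelines compute such fingerprints from molecular structures with cheminformatics toolkits and combine several similarity measures rather than a single maximum.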
Abstract:
The rapid adoption of online media like Facebook, Twitter or WikiLeaks leaves us with little time to think. Where is information technology taking us, our society and our democratic institutions? Is the Web replicating social divides that already exist offline, or does collaborative technology pave the way for a more equal society? How do we find the right balance between openness and privacy? Can social media improve civic participation, or do they breed superficial exchange and the promotion of false information? These and many other questions arise when one starts to look at the Internet, society and politics. The first part of this paper gives an overview of the social changes that occur with the rise of the Web. The second part surveys how the Web is being used for political participation in Switzerland and abroad.