852 results for Web-based tool
Abstract:
Recent research trends in computer-aided drug design have shown an increasing interest in advanced approaches able to deal with large amounts of data. This demand arose from the awareness of the complexity of biological systems and from the availability of data provided by high-throughput technologies. As a consequence, drug research has embraced this paradigm shift by exploiting approaches such as those based on networks. Indeed, the process of drug discovery can benefit from the implementation of network-based methods at different steps, from target identification to drug repurposing. From this broad range of opportunities, this thesis focuses on three main topics: (i) chemical space networks (CSNs), which are designed to represent and characterize bioactive compound data sets; (ii) drug-target interaction (DTI) prediction through a network-based algorithm that predicts missing links; (iii) COVID-19 drug research, explored by implementing COVIDrugNet, a network-based tool for COVID-19 related drugs. The main conclusion emerging from this thesis is that network-based approaches are useful methodologies for tackling different issues in drug research. In detail, CSNs are valuable coordinate-free, graphically accessible representations of the structure-activity relationships of bioactive compound data sets, especially for medium-to-large libraries of molecules. DTI prediction through the random walk with restart algorithm on heterogeneous networks can be a helpful method for target identification. COVIDrugNet is an example of the usefulness of network-based approaches for studying drugs related to a specific condition, i.e., COVID-19, and the same ‘systems-based’ approaches can be used for other diseases. To conclude, network-based tools are proving to be suitable for many applications in drug research and provide the opportunity to model and analyze diverse drug-related data sets, even large ones, while also integrating multi-domain information.
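As an illustration of the link-prediction idea mentioned above, the following minimal Python sketch implements a generic random walk with restart on a toy adjacency matrix. It is not the thesis's actual pipeline; the graph, the restart probability of 0.7, and all other values are assumptions chosen only for illustration.

import numpy as np

def random_walk_with_restart(adjacency, seed_index, restart_prob=0.7,
                             tol=1e-8, max_iter=1000):
    """Generic RWR: iterate p = (1 - r) * W @ p + r * p0 until convergence.

    adjacency  : square matrix of a (possibly heterogeneous) drug-target graph
    seed_index : node from which the walk restarts (e.g. a drug of interest)
    """
    # Column-normalise the adjacency matrix into a transition matrix.
    col_sums = adjacency.sum(axis=0)
    col_sums[col_sums == 0] = 1.0            # avoid division by zero
    transition = adjacency / col_sums

    p0 = np.zeros(adjacency.shape[0])
    p0[seed_index] = 1.0
    p = p0.copy()
    for _ in range(max_iter):
        p_next = (1 - restart_prob) * transition @ p + restart_prob * p0
        if np.abs(p_next - p).sum() < tol:
            break
        p = p_next
    return p  # steady-state scores: high values suggest candidate links

# Toy usage: a 4-node graph (2 drugs, 2 targets) with one known interaction.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(random_walk_with_restart(A, seed_index=0))

Nodes with high steady-state scores that are not already linked to the seed would be the predicted missing links.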
Abstract:
This document records the process of migrating eprints.org data to a Fez repository. Fez is a Web-based digital repository and workflow management system based on Fedora (http://www.fedora.info/). At the time of migration, the University of Queensland Library was using EPrints 2.2.1 [pepper] for its ePrintsUQ repository. Once we began to develop Fez, we did not upgrade to later versions of the eprints.org software, since we knew we would be migrating data from ePrintsUQ to the Fez-based UQ eSpace. Because this document records our experience of migrating from an earlier version of eprints.org, anyone seeking to migrate eprints.org data into a Fez repository may encounter some small differences. Moving UQ publication data from an eprints.org repository into a Fez repository (hereafter called UQ eSpace, http://espace.uq.edu.au/) was part of a plan to integrate metadata (and, in some cases, full texts) about all UQ research outputs, including theses, images, multimedia and datasets, in a single repository. This tied in with the plan to identify and capture the research output of a single institution, the main task of the eScholarshipUQ testbed for the Australian Partnership for Sustainable Repositories project (http://www.apsr.edu.au/). The migration could not occur at UQ until the functionality in Fez was at least equal to that of the existing ePrintsUQ repository. Accordingly, as Fez development proceeded throughout 2006, a list of eprints.org functionality not yet supported in Fez was maintained so that its development could be planned and implemented.
Abstract:
In mapping the evolutionary process of online news and the socio-cultural factors determining this development, this paper has a dual purpose. First, in reworking the definition of “online communication”, it argues that despite its seemingly sudden emergence in the 1990s, the history of online news began in the early days of the telegraph and ran through the development of the telephone and the fax machine before becoming computer-based in the 1980s and Web-based in the 1990s. Second, merging macro-perspectives on the dynamics of media evolution by DeFleur and Ball-Rokeach (1989) and Winston (1998), the paper consolidates a critical point for thinking about new media development: that something being technically feasible does not always mean it will be socially accepted and/or demanded. From a producer-centric perspective, the birth and development of pre-Web online news forms were more or less generated by the traditional media’s sometimes excessive hype about the power of new technologies. However, placing such an emphasis on technological potential at the expense of its social conditions can be not only misleading but also detrimental to the development of new media, including the potential of today’s online news.
Abstract:
Trust is a vital feature of the Semantic Web: if users (humans and agents) are to use and integrate system answers, they must trust them. Systems should therefore be able to explain their actions, sources, and beliefs, and this issue is the topic of the proof layer in the design of the Semantic Web. This paper presents the design and implementation of a system for proof explanation on the Semantic Web based on defeasible reasoning. The basis of this work is the DR-DEVICE system, which is extended to handle proofs. A critical aspect is the representation of proofs in an XML language, which is achieved through an extension of the RuleML language.
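As a rough, generic illustration of defeasible reasoning (not the semantics of DR-DEVICE or its RuleML proof format), the toy Python sketch below evaluates defeasible rules with a superiority relation: a conclusion is defeasibly provable when some applicable rule supports it and every applicable attacking rule is beaten by a superior supporting rule. All rule names and facts are hypothetical, and the evaluation is deliberately simplified (no strict rules or defeaters, acyclic rule sets assumed).

from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    body: tuple        # antecedent literals, e.g. ("bird",)
    head: str          # conclusion literal, e.g. "flies" or "~flies"

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def provable(lit, facts, rules, superior, _seen=None):
    """Simplified defeasible provability; 'superior' holds (stronger, weaker) pairs."""
    _seen = set() if _seen is None else _seen
    if lit in facts:
        return True
    if lit in _seen:                      # crude cycle guard
        return False
    _seen = _seen | {lit}
    applicable = lambda r: all(provable(b, facts, rules, superior, _seen)
                               for b in r.body)
    supporters = [r for r in rules if r.head == lit and applicable(r)]
    attackers  = [r for r in rules if r.head == negate(lit) and applicable(r)]
    if not supporters:
        return False
    # Every applicable attacker must be beaten by some superior supporter.
    return all(any((s.name, a.name) in superior for s in supporters)
               for a in attackers)

# Classic toy example: birds fly, penguins do not, penguins are birds,
# and the penguin rule is superior to the bird rule.
rules = [Rule("r1", ("bird",), "flies"),
         Rule("r2", ("penguin",), "~flies"),
         Rule("r3", ("penguin",), "bird")]
superior = {("r2", "r1")}
print(provable("flies", {"penguin"}, rules, superior))   # False: r2 beats r1
print(provable("bird",  {"penguin"}, rules, superior))   # True

In a proof-explanation setting, the supporting rule, the defeated attackers, and the superiority facts used at each step are exactly the pieces that would be serialised into a proof document.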
Abstract:
Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes.
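As a minimal illustration of the quantitative-matrix approach mentioned above (not any specific published MHC model), the Python sketch below scores a short peptide by summing per-position coefficients from a hypothetical matrix; all coefficients and peptides are made up for illustration, and a real matrix would cover all 20 amino acids at every position of a 9-mer.

# Hypothetical position-specific coefficients for a 3-position toy motif.
TOY_MATRIX = [
    {"A": 0.2, "L": 1.1, "K": -0.5},   # position 1
    {"A": 0.0, "L": 0.3, "K": 0.9},    # position 2
    {"A": 0.4, "L": -0.2, "K": 0.1},   # position 3
]

def matrix_score(peptide, matrix, default=0.0):
    """Additive score: sum of the per-position coefficients of each residue."""
    if len(peptide) != len(matrix):
        raise ValueError("peptide length must match matrix length")
    return sum(pos.get(residue, default)
               for residue, pos in zip(peptide, matrix))

# Peptides scoring above a chosen threshold would be predicted binders.
for pep in ("LKA", "AAK", "KLA"):
    print(pep, round(matrix_score(pep, TOY_MATRIX), 2))

The same additive idea underlies many quantitative matrices; the methods listed in the abstract differ mainly in how such coefficients (or more complex scoring functions) are derived and validated.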
Abstract:
Telemedicine might increase the speed of diagnosis for leprosy and reduce the development of disabilities. We compared the accuracy of diagnoses made by telemedicine with those made by in-person examination. The cases were patients with suspected leprosy at eight public health clinics in outlying areas of the city of Sao Paulo. The case history and clinical examination data, and at least two clinical images for each patient, were stored in a web-based system developed for teledermatology. After the examination in the public clinic, patients attended a teaching hospital for an in-person examination. The benchmark was the clinical examination by two dermatologists at the university hospital. From August 2005 to April 2006, 142 suspected cases of leprosy were forwarded to the website by the doctors at the clinics. Of these, 36 cases were excluded. There was overall agreement in the diagnosis of leprosy in 74% of the 106 remaining cases. The sensitivity was 78% and the specificity was 31%. Although the specificity was low, the study suggests that telemedicine may be a useful low-cost method for obtaining second opinions in programmes to control leprosy.
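For reference, sensitivity and specificity are computed from the confusion matrix of telemedicine versus benchmark diagnoses. The short Python sketch below shows only the formulas; the counts are hypothetical (100 cases per group, not the study's raw data), chosen simply to reproduce the reported percentages.

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts only, to show how figures like 78% / 31% are obtained.
sens, spec = sensitivity_specificity(tp=78, fn=22, tn=31, fp=69)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")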
Abstract:
Management of rectal cancer has become increasingly complex, and a multidisciplinary approach is considered of key importance for improving outcomes. A national survey among specialists involved in this multidisciplinary setting was performed. A web-based survey containing 11 questions regarding rectal cancer management was sent to surgeons and medical oncologists registered as members by their corresponding societies. Statistical analysis was performed using the chi-square and Fisher's exact tests for all categorical variables according to the response to individual questions. Multivariate analysis was performed using Cox's logistic regression. Overall, 418 email recipients responded to the survey. Local staging was performed without either magnetic resonance imaging or endorectal ultrasound by 64% of responders. Seventy-two percent considered that the final management decision should be made after neoadjuvant chemoradiation therapy. Additionally, 46% considered that an alternative procedure (local excision or observation) was appropriate in a patient with a complete clinical response. Colorectal surgeons were more frequently in favor of longer intervals after completion of chemoradiation therapy (P = 0.001) and of alternative management procedures after a complete clinical response (P = 0.02). After multivariate analysis, the choice of a watch-and-wait approach after a complete clinical response following neoadjuvant chemoradiation therapy was significantly more frequent among surgeons (OR 3.5, 95% CI 1.8-7.1). Surgeons seem to be more in favor of tailoring the management of rectal cancer according to tumor response after neoadjuvant chemoradiation therapy: longer intervals after chemoradiation, treatment-strategy decisions made after chemoradiation instead of before, and the use of alternative surgical procedures after a complete clinical response following neoadjuvant therapy.
Abstract:
The fate of N-15-nitrogen-enriched formulated feed fed to shrimp was traced through the food web in shallow, outdoor tank systems (1000 l) stocked with shrimp. Triplicate tanks containing shrimp water with and without sediment were used to identify the role of the natural biota in the water column and sediment in processing dietary nitrogen (N). A preliminary experiment demonstrated that N-15-nitrogen-enriched feed products could be detected in the food web. Based on this, a 15-day experiment was conducted. The ammonium (NH4+) pool in the water column became rapidly enriched (within one day) with N-15-nitrogen after shrimp were fed N-15-enriched feed. By day 15, 6% of the added N-15-nitrogen was in this fraction in the 'sediment' tanks compared with 0.4% in the 'no sediment' tanks. The particulate fraction in the water column, principally autotrophic nanoflagellates, accounted for 4-5% of the N-15-nitrogen fed to shrimp after one day. This increased to 16% in the 'no sediment' treatment, and decreased to 2% in the 'sediment' treatment by day 15. It appears that dietary N was more accessible to the phytoplankton community in the absence of sediment. The difference is possibly because a proportion of the dietary N was buried in the sediment in the 'sediment' treatment, making it unavailable to the phytoplankton. Alternatively, the dietary N was retained in the NH4+ pool in the water column since phytoplankton growth, and hence N utilization, was lower in the 'sediment' treatment. The lower growth of phytoplankton in the 'sediment' treatment appeared to be related to higher turbidity, and hence lower light availability for growth. The percentage of N-15-nitrogen detected in the sediment was only 6%, despite the high capacity for sedimentation of the large biomass of plankton detritus and shrimp waste. This suggests rapid remineralization of organic waste by the microbial community in the sediment, resulting in diffusion of inorganic N sources into the water column. It is likely that most of the dietary N will ultimately be removed from the tank system by water discharges. Our study showed that N-15-nitrogen derived from aquaculture feed can be processed by the microbial community in outdoor aquaculture systems and provides a method for determining the effect of dietary N on ecosystems. However, a significant amount of the dietary N was not retained by the natural biota and is likely to be present in the soluble organic fraction.
Abstract:
Introduction: This paper deals with the increasing use of merger and acquisition strategies within the pharmaceutical industry. The aim is to identify the triggers of this business phenomenon and its immediate impact on the financial outcomes of two powerful biopharmaceutical corporations, Pfizer and GlaxoSmithKline, which were sampled because of their successful use of the tactics in question. Materials and Methods: In order to create an overview of their development through mergers and acquisitions, the historical data of the two corporations were consulted from their official websites. The most relevant events were then matched with the corresponding information from the financial reports and statements of the two corporations made available by web-based financial data providers. Results and Discussions: In the past few decades, Pfizer and GlaxoSmithKline have purchased or merged with various companies in order to monopolize new markets, diversify their product and service portfolios, and survive and surpass competitors. The consequences proved to be positive, although this approach requires considerable capital availability. Conclusions: The results reveal that, as far as the two sampled companies are concerned, acquisitions and mergers are reactions to the pressure of a highly competitive environment. Moreover, the continuous diversification of the market’s needs is also a consistent motive. However, the prevalence and prominence of merger and acquisition strategies are conditioned by the tender offer, the caliber of the announcing company, the status of research and development, and other factors determined by the internal and external actors of the market.
Abstract:
The complexity of computer systems has been increasing, and the use of computer systems and online services is now part of our daily work tools. In this context, the Internet plays a prominent role in universities by allowing students and teachers to interact more easily. The Internet and Web-based education offer remote access to any information regardless of location or time. As a consequence, anyone with an Internet connection gains significant advantages by being able to obtain information on a given topic from leading experts. Remote laboratories are a highly valued solution for connecting technology and human resources across environments that may be separated in time or space. The creation of this type of laboratory, and its real usefulness, is only possible because emerging communication technologies have contributed significantly to improving their availability at a distance. Remote laboratories become indispensable for engineering research involving scarce or large-scale resources. Building on this concept, a remote laboratory was developed for engineering students who need to test digital circuits on a configurable hardware development board, allowing this resource to be used more efficiently. The work consisted of creating a low-cost remote laboratory based on open-source programming languages, using an ASUS router running the OpenWrt firmware as the processing unit. This firmware is a Linux distribution for embedded systems. The remote laboratory allows digital circuits to be tested on a configurable hardware development board in real time, using the JTAG interface. A distinctive feature of the laboratory is that its processing unit is a router; using a router as the server is a very unusual solution in the implementation of remote laboratories. Compared with a regular computer, the router has much lower processing power and memory, although the tests carried out showed that its performance fully met expectations.
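To make the architecture concrete, the following minimal Python sketch shows the kind of web endpoint such a remote laboratory needs: it accepts an uploaded bitstream over HTTP and hands it to a JTAG programming command on the server. This is an assumption for illustration, not the dissertation's actual implementation (which runs on the OpenWrt router); in particular, 'jtag_program' is a placeholder name, since the abstract does not specify the programming toolchain.

import subprocess
import tempfile
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder for whatever JTAG programming tool the board actually uses.
PROGRAM_CMD = ["jtag_program", "--cable", "usb"]

class BitstreamHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        bitstream = self.rfile.read(length)
        # Save the uploaded design and forward it to the JTAG programmer.
        with tempfile.NamedTemporaryFile(suffix=".bit", delete=False) as f:
            f.write(bitstream)
            path = f.name
        result = subprocess.run(PROGRAM_CMD + [path], capture_output=True)
        self.send_response(200 if result.returncode == 0 else 500)
        self.end_headers()
        self.wfile.write(result.stdout or b"programmed")

if __name__ == "__main__":
    # On the actual system this server role is played by the OpenWrt router.
    HTTPServer(("0.0.0.0", 8080), BitstreamHandler).serve_forever()

Keeping the server to a thin upload-and-program role like this is what makes a device with modest CPU and memory, such as a router, a plausible host for the laboratory.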
Abstract:
MSc Dissertation in Computer Engineering