66 results for Group theoretical based techniques

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

100.00%

Publisher:

Abstract:

Accurate in silico models for the quantitative prediction of the activity of G protein-coupled receptor (GPCR) ligands would greatly facilitate the process of drug discovery and development. Several methodologies have been developed based on the properties of the ligands, the direct study of the receptor-ligand interactions, or a combination of both approaches. Ligand-based three-dimensional quantitative structure-activity relationship (3D-QSAR) techniques, which do not require knowledge of the receptor structure, were historically the first to be applied to the prediction of the activity of GPCR ligands. They are generally endowed with robustness and good ranking ability; however, they are highly dependent on training sets. Structure-based techniques generally do not provide the level of accuracy necessary to yield meaningful rankings when applied to GPCR homology models. However, they are essentially independent of training sets and have a sufficient level of accuracy to allow effective discrimination between binders and nonbinders, thus qualifying as viable lead discovery tools. The combination of ligand- and structure-based methodologies, in the form of receptor-based 3D-QSAR and ligand- and structure-based consensus models, results in robust and accurate quantitative predictions. The contribution of the structure-based component to these combined approaches is expected to become more substantial and effective in the future, as more sophisticated scoring functions are developed and more detailed structural information on GPCRs is gathered.
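
As a rough illustration of the consensus idea described above, the sketch below combines a ligand-based 3D-QSAR prediction with a structure-based docking score after normalizing both to comparable scales. The weighting scheme, function names and data are illustrative assumptions, not the authors' actual protocol.

```python
import numpy as np

def consensus_score(qsar_pred, docking_score, w_qsar=0.5):
    """Combine a ligand-based 3D-QSAR activity prediction with a
    structure-based docking score into a single consensus ranking value.

    Both inputs are z-score normalized so the two components are on
    comparable scales before weighting; a higher output means a better rank.
    """
    qsar_pred = np.asarray(qsar_pred, dtype=float)
    docking_score = np.asarray(docking_score, dtype=float)
    z_qsar = (qsar_pred - qsar_pred.mean()) / qsar_pred.std()
    # More negative docking scores usually indicate stronger predicted
    # binding, so the sign is flipped before normalization.
    flipped = -docking_score
    z_dock = (flipped - flipped.mean()) / flipped.std()
    return w_qsar * z_qsar + (1.0 - w_qsar) * z_dock

# Example: rank five hypothetical ligands (made-up numbers).
qsar = [6.2, 7.1, 5.8, 8.0, 6.9]        # predicted pIC50 from 3D-QSAR
dock = [-7.5, -9.1, -6.8, -8.4, -9.0]   # docking scores (kcal/mol)
print(np.argsort(-consensus_score(qsar, dock)))  # ligand indices, best first
```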

Relevance:

100.00%

Publisher:

Abstract:

Developing a desirable framework for handling inconsistencies in software requirements specifications is a challenging problem. It has been widely recognized that the relative priority of requirements can help developers make the necessary trade-off decisions for resolving conflicts. However, in most distributed development approaches, such as viewpoints-based approaches, different stakeholders may assign different levels of priority to the same shared requirements statement from their own perspectives. The disagreement in the local levels of priority assigned to the same shared requirements statement often puts developers in a dilemma during the inconsistency-handling process. The main contribution of this paper is a prioritized merging-based framework for handling inconsistency in distributed software requirements specifications. Given a set of distributed, inconsistent requirements collections with local prioritizations, we first construct a requirements specification with a prioritization from an overall perspective. We provide two approaches to constructing a requirements specification with a global prioritization: a merging-based construction and a priority vector-based construction. Following this, we derive proposals for handling inconsistencies from the globally prioritized requirements specification in terms of prioritized merging. Moreover, from the overall perspective, these proposals may be viewed as the most appropriate ways of modifying the given inconsistent requirements specification, in the sense of the ordering relation over all the consistent subsets of the requirements specification. Finally, we consider applying negotiation-based techniques to viewpoints so as to identify an acceptable common proposal from among these proposals.
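
To make the priority vector-based construction mentioned above concrete, the sketch below aggregates the local priority levels that different viewpoints assign to shared requirements into a single global ordering. The aggregation rule (lexicographic comparison of sorted priority vectors) and the data are illustrative assumptions, not the paper's actual definitions.

```python
from typing import Dict, List

# Local priorities per viewpoint: a smaller number means a higher priority.
viewpoints: Dict[str, Dict[str, int]] = {
    "customer":  {"R1": 1, "R2": 3, "R3": 2},
    "developer": {"R1": 2, "R2": 1, "R3": 2},
    "tester":    {"R1": 1, "R2": 2, "R3": 3},
}

def priority_vector(req: str) -> List[int]:
    """Collect the local priorities assigned to one requirement and sort
    them so that the vectors can be compared lexicographically."""
    return sorted(vp[req] for vp in viewpoints.values())

requirements = ["R1", "R2", "R3"]
# Global prioritization: a requirement whose sorted priority vector is
# lexicographically smaller is treated as globally more important.
global_order = sorted(requirements, key=priority_vector)
print(global_order)                                   # ['R1', 'R2', 'R3']
print({r: priority_vector(r) for r in requirements})  # the global vectors
```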

Relevance:

100.00%

Publisher:

Abstract:

Conflicting results have been reported on the detection of paramyxovirus transcripts in Paget's disease, and a possible explanation is differences in the sensitivity of RT-PCR methods for detecting virus. In a blinded study, we found no evidence to suggest that laboratories that failed to detect viral transcripts had less sensitive RT-PCR assays, and we did not detect measles or distemper transcripts in Paget's samples using the most sensitive assays evaluated.

Introduction: There is conflicting evidence on the possible role of persistent paramyxovirus infection in Paget's disease of bone (PDB). Some workers have detected measles virus (MV) or canine distemper virus (CDV) transcripts in cells and tissues from patients with PDB, but others have failed to confirm this finding. A possible explanation might be differences in the sensitivity of RT-PCR methods for detecting virus. Here we performed a blinded comparison of the sensitivity of different RT-PCR-based techniques for MV and CDV detection in different laboratories and used the most sensitive assays to screen for evidence of viral transcripts in bone and blood samples derived from patients with PDB.

Materials and Methods: Participating laboratories analyzed samples spiked with known amounts of MV and CDV transcripts and control samples that did not contain viral nucleic acids. All analyses were performed on a blinded basis.

Results: The limit of detection for CDV was 1000 viral transcripts in three laboratories (Aberdeen, Belfast, and Liverpool) and 10,000 transcripts in another laboratory (Manchester). The limit of detection for MV was 16 transcripts in one laboratory (NIBSC), 1000 transcripts in two laboratories (Aberdeen and Belfast), and 10,000 transcripts in two laboratories (Liverpool and Manchester). An assay previously used by a U.S.-based group to detect MV transcripts in PDB had a sensitivity of 1000 transcripts. One laboratory (Manchester) detected CDV transcripts in a negative control and in two samples that had been spiked with MV. None of the other laboratories had false-positive results for MV or CDV, and no evidence of viral transcripts was found on analysis of 12 PDB samples using the most sensitive RT-PCR assays for MV and CDV.

Conclusions: We found that RT-PCR assays used by different laboratories differed in their sensitivity to detect CDV and MV transcripts but found no evidence to suggest that laboratories that previously failed to detect viral transcripts had less sensitive RT-PCR assays than those that detected viral transcripts. False-positive results were observed in one laboratory, and we failed to detect paramyxovirus transcripts in PDB samples using the most sensitive assays evaluated. Our results show that the failure of some laboratories to detect viral transcripts is unlikely to be caused by problems with assay sensitivity and highlight the fact that contamination can be an issue when searching for pathogens by sensitive RT-PCR-based techniques.

Relevance:

100.00%

Publisher:

Abstract:

This research aims to use the multivariate geochemical dataset generated by the Tellus project to investigate the appropriate use of transformation methods to maintain the integrity of geochemical data and the inherently constrained behaviour of multivariate relationships. The widely used normal score transform is compared with a stepwise conditional transform technique. The Tellus Project, managed by GSNI and funded by the Department of Enterprise, Trade and Development and the EU's Building Sustainable Prosperity Fund, is the most comprehensive geological mapping project ever undertaken in Northern Ireland. Previous studies have demonstrated spatial variability in the Tellus data, but geostatistical analysis and interpretation of the datasets require an appropriate methodology that reproduces the inherently complex multivariate relations. Previous investigation of the Tellus geochemical data has included the use of Gaussian-based techniques. However, earth science variables are rarely Gaussian, hence transformation of the data is integral to the approach. The multivariate geochemical dataset generated by the Tellus project provides an opportunity to investigate the appropriate use of transformation methods, as required for Gaussian-based geostatistical analysis. In particular, the stepwise conditional transform is investigated and developed for the geochemical datasets obtained as part of the Tellus project. The transform is applied to four variables in a bivariate nested fashion owing to the limited availability of data. Simulation of these transformed variables is then carried out, along with a corresponding back transformation to original units. Results show that the stepwise transform is successful in reproducing both the univariate statistics and the complex bivariate relations exhibited by the data. Greater fidelity to multivariate relationships will improve uncertainty models, which are required for consequent geological, environmental and economic inferences.
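
For context, the widely used normal score transform referred to above maps data to standard Gaussian quantiles via their ranks. A minimal sketch with synthetic data is given below; the stepwise conditional transform itself is more involved and is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def normal_score_transform(x):
    """Map a 1-D sample to standard normal quantiles via its ranks.

    Uses the (rank - 0.5)/n plotting position so that the extreme
    ranks are not mapped to +/- infinity.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    ranks = np.argsort(np.argsort(x)) + 1   # ranks 1 .. n
    p = (ranks - 0.5) / n                   # empirical CDF positions
    return norm.ppf(p)                      # Gaussian quantiles

# Example with a skewed (non-Gaussian) geochemical-style variable.
rng = np.random.default_rng(0)
concentrations = rng.lognormal(mean=1.0, sigma=0.8, size=1000)
y = normal_score_transform(concentrations)
print(round(y.mean(), 3), round(y.std(), 3))  # approximately 0 and 1
```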

Relevance:

100.00%

Publisher:

Abstract:

Despite ethical and technical concerns, the in vivo method, more commonly referred to as the mouse bioassay (MBA), is employed globally as a reference method for phycotoxin analysis in shellfish. This is particularly the case for paralytic shellfish poisoning (PSP) and emerging toxin monitoring. A high-performance liquid chromatography method with fluorescence detection (HPLC-FLD) has been developed for PSP toxin analysis, but owing to difficulties and limitations in the method, this procedure has not been fully implemented as a replacement. Detection of the diarrhetic shellfish poisoning (DSP) toxins has moved towards LC-mass spectrometry (MS) analysis, whereas analysis of the amnesic shellfish poisoning (ASP) toxin domoic acid is performed by HPLC. Although alternative methods of detection to the MBA have been described, each procedure is specific for a particular toxin and its analogues, with each group of toxins requiring separate analysis utilising different extraction procedures and analytical equipment. In addition, consideration must be given to the detection of unregulated and emerging toxins when replacing the MBA. The ideal scenario for the monitoring of phycotoxins in shellfish and seafood would be to evolve towards multiple-toxin detection on a single bioanalytical sensing platform, i.e. 'an artificial mouse'. Immunologically based techniques, and in particular surface plasmon resonance (SPR) technology, have been shown to be a highly promising bioanalytical tool offering rapid, real-time detection and requiring minimal quantities of toxin standards. A Biacore Q and a prototype multiplex SPR biosensor were evaluated for their fitness for purpose for the simultaneous detection of key regulated phycotoxin groups and the emerging toxin palytoxin. The prototype, deemed more applicable because of its separate flow channels, achieved detection limits (IC20) of 4,000, 36, 144 and 46 μg/kg of mussel for the domoic acid, okadaic acid, saxitoxin and palytoxin calibration curves in shellfish, respectively. A one-step extraction procedure demonstrated recoveries greater than 80 % for all toxins. For validation of the method at the 95 % confidence limit, the decision limits (CCα) determined from an extracted matrix curve were calculated to be 450, 36 and 24 μg/kg, and the detection capability (CCβ) as a screening method was ≤10 mg/kg, ≤160 μg/kg and ≤400 μg/kg for domoic acid, okadaic acid and saxitoxin, respectively.
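
The IC20 detection limits quoted above come from inhibition-style SPR calibration curves. A minimal sketch of how such a value can be derived from a four-parameter logistic fit is shown below; the calibration data are made up for illustration and do not reproduce the study's actual curves.

```python
import numpy as np
from scipy.optimize import curve_fit, brentq

def four_pl(x, top, bottom, ic50, slope):
    """Four-parameter logistic (inhibition) curve: response vs concentration."""
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** slope)

# Hypothetical calibration data: normalized SPR response vs toxin (µg/kg).
conc = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)
resp = np.array([0.98, 0.95, 0.85, 0.62, 0.35, 0.15, 0.06])

params, _ = curve_fit(four_pl, conc, resp, p0=[1.0, 0.0, 50.0, 1.0])
top, bottom, ic50, slope = params

# IC20: concentration giving a 20% drop from the maximal response.
target = top - 0.20 * (top - bottom)
ic20 = brentq(lambda x: four_pl(x, *params) - target, conc.min(), conc.max())
print(f"IC50 = {ic50:.1f}, IC20 = {ic20:.1f} (same units as conc)")
```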

Relevance:

100.00%

Publisher:

Abstract:

The selective catalytic reduction (SCR) of NOx compounds with NH3 has been a topic of intense interest in recent years. Among the various catalysts, zeolites have proved to be efficient and promising for NH3-SCR, yet the overall process and intrinsic mechanism are still not well understood owing to the structural complexity of zeolites. With the improvement of theoretical chemistry techniques, quantum-chemical calculations are now capable of modeling the structure, acidity, adsorption and, ultimately, reaction pathways over zeolites to some extent. In this review, a brief summary of the relevant concepts of NH3-SCR is presented. Cluster approaches, embedded techniques and periodic treatments are described as the three main computational methods. Details of quantum-chemical investigations of key issues such as the structure of the active sites, the adsorption of small molecules and the reaction mechanism of NH3-SCR over zeolites are discussed. Finally, a perspective on future theoretical research is given.

Relevance:

100.00%

Publisher:

Abstract:

Reliability has emerged as a critical design constraint, especially in memories. Designers go to great lengths to guarantee fault-free operation of the underlying silicon by adopting redundancy-based techniques, which essentially try to detect and correct every single error. However, such techniques come at the cost of large area, power and performance overheads, which has led many researchers to doubt their efficiency, especially for error-resilient systems where 100% accuracy is not always required. In this paper, we present an alternative method that focuses on confining the output error induced by reliability issues. Focusing on memory faults, rather than correcting every single error, the proposed method exploits the statistical characteristics of the target application and replaces any erroneous data with the best available estimate of that data. To realize the proposed method, a RISC processor is augmented with custom instructions and special-purpose functional units. We apply the method to the enhanced processor by studying the statistical characteristics of the various algorithms involved in a popular multimedia application. Our experimental results show that, in contrast to state-of-the-art fault tolerance approaches, we are able to reduce runtime and area overhead by 71.3% and 83.3%, respectively.
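
A minimal software sketch of the error-confinement idea described above: instead of correcting every bit error, values read from a possibly faulty memory are checked against a statistical range learned for the application's data, and any implausible value is replaced by the best available estimate. The thresholding rule and estimator below are illustrative assumptions, not the paper's hardware implementation.

```python
import numpy as np

def confine_errors(block, mean, std, k=4.0):
    """Replace values that fall outside the statistically expected range
    (mean +/- k*std, learned offline for the target application) with the
    best available estimate -- here simply the learned mean.
    """
    block = np.asarray(block, dtype=float)
    plausible = np.abs(block - mean) <= k * std
    return np.where(plausible, block, mean)

# Example: 8-bit pixel-like data with a few bit-flip-style faults injected.
rng = np.random.default_rng(1)
pixels = rng.normal(128, 20, size=64).clip(0, 255)
faulty = pixels.copy()
faulty[[5, 17, 42]] = [255, 0, 251]          # corrupted memory reads
repaired = confine_errors(faulty, mean=pixels.mean(), std=pixels.std())
print(np.flatnonzero(repaired != faulty))    # indices that were confined
```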

Relevance:

100.00%

Publisher:

Abstract:

Accurate and efficient grid-based techniques for the solution of the time-dependent Schrödinger equation for few-electron diatomic molecules irradiated by intense, ultrashort laser pulses are described. These are based on hybrid finite-difference, Lagrange-mesh techniques. The methods are applied in three scenarios, namely H2+ with fixed internuclear separation, H2+ with vibrating nuclei, and H2 with fixed internuclear separation, and illustrative results are presented.
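
The methods above are hybrid finite-difference/Lagrange-mesh schemes for few-electron diatomics. As a much-simplified illustration of the grid-based idea, the sketch below propagates a one-dimensional wave packet in a soft-core potential under a short laser pulse using a Crank-Nicolson finite-difference step; it is a toy model, not the authors' code, and the pulse and potential parameters are arbitrary.

```python
import numpy as np

# 1-D grid and a soft-core Coulomb potential (toy one-electron model).
n, dx, dt = 400, 0.2, 0.02
x = (np.arange(n) - n // 2) * dx
v0 = -1.0 / np.sqrt(x**2 + 1.0)

# Kinetic part: second-order finite-difference Laplacian.
lap = (np.diag(np.full(n - 1, 1.0), -1) - 2.0 * np.eye(n)
       + np.diag(np.full(n - 1, 1.0), 1)) / dx**2

def step(psi, t):
    """One Crank-Nicolson step: (1 + i dt H/2) psi_new = (1 - i dt H/2) psi."""
    field = 0.05 * np.sin(0.057 * t)              # laser in length gauge, E(t)*x
    h = -0.5 * lap + np.diag(v0 + field * x)      # Hamiltonian at time t
    a = np.eye(n) + 0.5j * dt * h
    b = np.eye(n) - 0.5j * dt * h
    return np.linalg.solve(a, b @ psi)

# Gaussian initial wave packet, normalized on the grid.
psi = np.exp(-x**2).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

for i in range(200):
    psi = step(psi, i * dt)
print("norm after propagation:", np.sum(np.abs(psi)**2) * dx)  # ~1 (unitary)
```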

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we present a new approach to visual speech recognition which improves contextual modelling by combining Inter-Frame Dependent and Hidden Markov Models. This approach captures contextual information in visual speech that may be lost using a Hidden Markov Model alone. We apply contextual modelling to a large speaker-independent isolated-digit recognition task, and compare our approach to two commonly adopted feature-based techniques for incorporating speech dynamics. Results are presented for the baseline feature-based systems and the combined modelling technique. We illustrate that both of these techniques achieve similar levels of performance when used independently. However, significant improvements in performance can be achieved through a combination of the two. In particular, we report an improvement in excess of 17% relative Word Error Rate in comparison to our best baseline system.
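
For clarity on the headline figure, a relative Word Error Rate improvement expresses the error reduction as a fraction of the baseline error rate, rather than as an absolute difference. A small illustration with made-up numbers:

```python
def relative_wer_improvement(wer_baseline: float, wer_new: float) -> float:
    """Relative WER improvement: the fraction of baseline errors removed."""
    return (wer_baseline - wer_new) / wer_baseline

# Hypothetical example: a drop from 12% to 9.9% WER is a 17.5% relative gain.
print(f"{relative_wer_improvement(0.12, 0.099):.1%}")
```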

Relevance:

100.00%

Publisher:

Abstract:

Relatively little is known about the biology and ecology of the world's largest (heaviest) bony fish, the ocean sunfish Mola mola, despite its worldwide occurrence in temperate and tropical seas. Studies are now emerging that require many common perceptions about sunfish behaviour and ecology to be re-examined. Indeed, the long-held view that ocean sunfish are an inactive, passively drifting species seems to be entirely misplaced. Technological advances in marine telemetry are revealing distinct behavioural patterns and protracted seasonal movements. Extensive forays by ocean sunfish into the deep ocean have been documented, and broad-scale surveys, together with molecular and laboratory-based techniques, are addressing the connectivity and trophic role of these animals. These emerging molecular and movement studies suggest that locally distinct populations may be prone to depletion through bycatch in commercial fisheries. Rising interest in ocean sunfish, highlighted by the increase in recent publications, warrants a thorough review of the biology and ecology of this species. Here we review the taxonomy, morphology, geography, diet, locomotion, vision, movements, foraging ecology, reproduction and species interactions of M. mola. We present a summary of current conservation issues and suggest methods for addressing fundamental gaps in our knowledge.

Relevance:

100.00%

Publisher:

Abstract:

Within the marine environment, aerial surveys have historically centred on apex predators, such as pinnipeds, cetaceans and sea birds. However, it is becoming increasingly apparent that the utility of this technique may also extend to subsurface species such as pre-spawning fish stocks and aggregations of jellyfish that occur close to the surface. In light of this, we tested the utility of aerial surveys to provide baseline data for 3 poorly understood scyphozoan jellyfish found throughout British and Irish waters: Rhizostoma octopus, Cyanea capillata and Chrysaora hysoscella. Our principal objectives were to develop a simple sampling protocol to identify and quantify surface aggregations, assess their consistency in space and time, and consider the overall applicability of this technique to the study of gelatinous zooplankton. This approach provided a general understanding of range and relative abundance for each target species, with greatest suitability to the study of R. octopus. For this species it was possible to identify and monitor extensive, temporally consistent and previously undocumented aggregations throughout the Irish Sea, an area spanning thousands of square kilometres. This finding has pronounced implications for ecologists and fisheries managers alike and, moreover, draws attention to the broad utility of aerial surveys for the study of gelatinous aggregations beyond the range of conventional ship-based techniques.

Relevance:

100.00%

Publisher:

Abstract:

This focused review article discusses in detail all available high-resolution small-molecule ligand/G-quadruplex structural data derived from crystallographic and NMR-based techniques, in an attempt to understand the key factors in ligand binding and to highlight the biological importance of these complexes. In contrast to duplex DNA, G-quadruplexes are four-stranded nucleic acid structures folded from guanine-rich repeat sequences and stabilized by the stacking of guanine G-quartets and extensive Watson-Crick/Hoogsteen hydrogen bonding. Thermally stable, these topologies can play a role in telomere regulation and gene expression. The core structures of G-quadruplexes form stable scaffolds, while the loops have been shown, by the addition of small-molecule ligands, to be sufficiently adaptable to generate new and extended binding platforms for ligands to associate with, either by extending G-quartet surfaces or by forming additional planar dinucleotide pairings. Many of these structurally characterised loop rearrangements were totally unexpected, opening up new opportunities for the design of selective ligands. However, these rearrangements do significantly complicate attempts to rationally design ligands against well-defined but unbound topologies, as seen for the series of naphthalene diimide complexes. Drawing together previous findings, and with the introduction of two new crystallographic quadruplex/ligand structures, we aim to expand the understanding of the possible structural adaptations available to quadruplexes in the presence of ligands, thereby aiding the design of new selective entities. (C) 2011 Elsevier Masson SAS. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

Purpose: Theoretical modelling techniques are often used to simulate the action of ionizing radiation on cells at the nanometre level. By using monoenergetic vacuum-UV (VUV) radiation to irradiate DNA, either dry or humidified, the action spectra for the induction of DNA damage by low-energy photons and the role of water can be studied. These data provide inputs for the theoretical models.