38 results for Topology-based methods
Abstract:
Background: Selecting the highest quality 3D model of a protein structure from a number of alternatives remains an important challenge in the field of structural bioinformatics. Many Model Quality Assessment Programs (MQAPs) have been developed which adopt various strategies in order to tackle this problem, ranging from the so-called "true" MQAPs capable of producing a single energy score based on a single model, to methods which rely on structural comparisons of multiple models or additional information from meta-servers. However, it is clear that no current method can consistently separate the highest accuracy models from the lowest. In this paper, a number of the top performing MQAP methods are benchmarked in the context of the potential value that they add to protein fold recognition. Two novel methods are also described: ModSSEA, which is based on the alignment of predicted secondary structure elements, and ModFOLD, which combines several true MQAP methods using an artificial neural network. Results: The ModSSEA method is found to be an effective model quality assessment program for ranking multiple models from many servers; however, further accuracy can be gained by using the consensus approach of ModFOLD. The ModFOLD method is shown to significantly outperform the true MQAPs tested and is competitive with methods which make use of clustering or additional information from multiple servers. Several of the true MQAPs are also shown to add value to most individual fold recognition servers by improving model selection when applied as a post filter in order to re-rank models. Conclusion: MQAPs should be benchmarked appropriately for the practical context in which they are intended to be used. Clustering based methods are the top performing MQAPs where many models are available from many servers; however, they often do not add value to individual fold recognition servers when limited models are available. Conversely, the true MQAP methods tested can often be used as effective post filters for re-ranking the limited number of models available from individual fold recognition servers, and further improvements can be achieved using a consensus of these methods.
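As a hedged illustration of the consensus idea described above, the sketch below combines scores from several hypothetical per-model MQAPs with a small neural network. The input score names, network shape and training target are assumptions for illustration only, not the published ModFOLD implementation.

```python
# Hypothetical sketch: combining several per-model MQAP scores with a small
# neural network, in the spirit of the ModFOLD consensus described above.
# Score names, network shape and the synthetic training target are
# illustrative assumptions, not the published ModFOLD method.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Each row: scores from individual "true" MQAPs for one candidate model.
X_train = rng.random((200, 3))           # e.g. three hypothetical MQAP scores
y_train = X_train.mean(axis=1)           # synthetic stand-in for model quality

net = MLPRegressor(hidden_layer_sizes=(5,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

candidates = rng.random((10, 3))         # scores for 10 models of one target
consensus = net.predict(candidates)
best = int(np.argmax(consensus))         # select the top-ranked model
print(f"selected model {best} with consensus score {consensus[best]:.3f}")
```

In practice the network would be trained against an observed quality measure for models of known structures (e.g. a model-to-native similarity score) rather than the synthetic target used here.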
Recent developments in genetic data analysis: what can they tell us about human demographic history?
Abstract:
Over the last decade, a number of new methods of population genetic analysis based on likelihood have been introduced. This review describes and explains the general statistical techniques that have recently been used, and discusses the underlying population genetic models. Experimental papers that use these methods to infer human demographic and phylogeographic history are reviewed. It appears that the use of likelihood has hitherto had little impact in the field of human population genetics, which is still primarily driven by more traditional approaches. However, with the current uncertainty about the effects of natural selection, population structure and ascertainment of single-nucleotide polymorphism markers, it is suggested that likelihood-based methods may have a greater impact in the future.
Abstract:
DIGE is a protein labelling and separation technique allowing quantitative proteomics of two or more samples by optical fluorescence detection of differentially labelled proteins that are electrophoretically separated on the same gel. DIGE is an alternative to quantitation by MS-based methodologies and can circumvent their analytical limitations in areas such as intact protein analysis, (linear) detection over a wide range of protein abundances and, theoretically, applications where extreme sensitivity is needed. Thus, in quantitative proteomics DIGE is usually complementary to MS-based quantitation and has some distinct advantages. This review describes the basics of DIGE and its unique properties and compares it to MS-based methods in quantitative protein expression analysis.
Abstract:
Objective: To test the hypothesis that measles vaccination was involved in the pathogenesis of autism spectrum disorders (ASD) as evidenced by signs of a persistent measles infection or abnormally persistent immune response shown by circulating measles virus or raised antibody titres in children with ASD who had been vaccinated against measles, mumps and rubella (MMR) compared with controls. Design: Community-based case-control study. Methods: A community sample of vaccinated children aged 10-12 years in the UK with ASD (n = 98) and two control groups of similar age, one with special educational needs but no ASD (n = 52) and one typically developing group (n = 90), were tested for measles virus and antibody response to measles in the serum. Results: No difference was found between cases and controls for measles antibody response. There was no dose-response relationship between autism symptoms and antibody concentrations. Measles virus nucleic acid was amplified by reverse transcriptase-PCR in peripheral blood mononuclear cells from one patient with autism and two typically developing children. There was no evidence of a differential response to measles virus or the measles component of the MMR in children with ASD, with or without regression, and controls who had either one or two doses of MMR. Only one child from the control group had clinical symptoms of possible enterocolitis. Conclusion: No association between measles vaccination and ASD was shown.
Abstract:
In this work a hybrid technique that combines probabilistic and optimization-based methods is presented. The method is applied, both in simulation and in real-time experiments, to the heating unit of a Heating, Ventilation and Air Conditioning (HVAC) system. It is shown that the addition of the probabilistic approach improves the fault diagnosis accuracy.
Abstract:
In situ analysis has become increasingly important for contaminated land investigation and remediation. At present, portable techniques are used mainly as scanning tools to assess the spread and magnitude of the contamination, and are an adjunct to conventional laboratory analyses. A site in Cornwall, containing naturally occurring radioactive material (NORM), provided an opportunity for Reading University PhD student Anna Kutner to compare analytical data collected in situ with data generated by laboratory-based methods. The preliminary results in this paper extend the author's poster presentation at last September's GeoSpec2010 conference held in Lancaster.
Abstract:
This study analyzes the issue of American option valuation when the underlying exhibits a GARCH-type volatility process. We propose the use of Rubinstein's Edgeworth binomial tree (EBT), in contrast to the simulation-based methods considered in previous studies. The EBT-based valuation approach makes an implied calibration of the pricing model feasible. By empirically analyzing the pricing performance of American index and equity options, we illustrate the superiority of the proposed approach.
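For orientation, the sketch below prices an American put by backward induction on a standard Cox-Ross-Rubinstein binomial tree, the machinery that Rubinstein's Edgeworth tree extends by additionally fitting the skewness and kurtosis of the terminal distribution. This is not the EBT itself, and all parameter values are illustrative.

```python
# Minimal sketch of American put valuation by backward induction on a
# standard CRR binomial tree with an early-exercise check at every node.
# Not Rubinstein's Edgeworth tree; parameters are illustrative assumptions.
import numpy as np

def american_put_crr(s0, k, r, sigma, t, n):
    dt = t / n
    u = np.exp(sigma * np.sqrt(dt))        # up factor
    d = 1.0 / u                            # down factor
    p = (np.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    disc = np.exp(-r * dt)

    # terminal stock prices (highest first) and payoffs
    s = s0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
    v = np.maximum(k - s, 0.0)

    # roll back through the tree, allowing early exercise at each node
    for step in range(n - 1, -1, -1):
        s = s0 * u ** np.arange(step, -1, -1) * d ** np.arange(0, step + 1)
        v = np.maximum(disc * (p * v[:-1] + (1 - p) * v[1:]), k - s)
    return v[0]

print(american_put_crr(s0=100, k=100, r=0.05, sigma=0.2, t=1.0, n=500))
```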
Abstract:
An algorithm for tracking multiple feature positions in a dynamic image sequence is presented. This is achieved using a combination of two trajectory-based methods, with the resulting hybrid algorithm exhibiting the advantages of both. An optimizing exchange algorithm is described which enables short feature paths to be tracked without prior knowledge of the motion being studied. The resulting partial trajectories are then used to initialize a fast predictor algorithm which is capable of rapidly tracking multiple feature paths. As this predictor algorithm becomes tuned to the feature positions being tracked, it is shown how the location of occluded or poorly detected features can be predicted. The results of applying this tracking algorithm to data obtained from real-world scenes are then presented.
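A minimal sketch of the predictor stage follows, assuming a constant-velocity extrapolation from the last two points of each partial trajectory and a greedy nearest-neighbour match of detections to predictions; the paper's optimizing exchange algorithm and tuned predictor are more elaborate than this illustration.

```python
# Minimal sketch of the predictor stage: extrapolate each partial trajectory
# with a constant-velocity model and match new detections to the predictions
# by nearest neighbour. Illustrative only; the paper's exchange algorithm and
# tuned predictor are more sophisticated.
import numpy as np

def predict_next(trajectories):
    """Constant-velocity prediction from the last two points of each track."""
    return np.array([2 * t[-1] - t[-2] for t in trajectories])

def extend_tracks(trajectories, detections, max_dist=5.0):
    """Greedily assign each predicted position to its nearest free detection."""
    preds = predict_next(trajectories)
    free = list(range(len(detections)))
    for track, pred in zip(trajectories, preds):
        if not free:
            break
        dists = [np.linalg.norm(detections[j] - pred) for j in free]
        j = free[int(np.argmin(dists))]
        if np.linalg.norm(detections[j] - pred) <= max_dist:
            track.append(detections[j])
            free.remove(j)
        else:
            track.append(pred)   # occluded/missed feature: keep the prediction
    return trajectories

tracks = [[np.array([0.0, 0.0]), np.array([1.0, 0.5])]]
dets = np.array([[2.1, 0.9], [9.0, 9.0]])
print(extend_tracks(tracks, dets)[0][-1])   # matches the nearby detection
```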
Abstract:
The structure of the Arctic stratospheric polar vortex in three chemistry–climate models (CCMs) taken from the CCMVal-2 intercomparison is examined using zonal mean and geometric-based methods. The geometric methods are employed by taking 2D moments of potential vorticity fields that are representative of the polar vortices in each of the models. This allows the vortex area, centroid location and ellipticity to be determined, as well as a measure of vortex filamentation. The first part of the study uses these diagnostics to examine how well the mean state, variability and extreme variability of the polar vortices are represented in CCMs compared to ERA-40 reanalysis data, and in particular for the UMUKCA-METO, NIWA-SOCOL and CCSR/NIES models. The second part of the study assesses how the vortices are predicted to change in terms of the frequency of sudden stratospheric warmings and their general structure over the period 1960-2100. In general, it is found that the vortices are climatologically too far poleward in the CCMs and produce too few large-scale filamentation events. Only a small increase is observed in the frequency of sudden stratospheric warming events from the mean of the CCMVal-2 models, but the distribution of extreme variability throughout the winter period is shown to change towards the end of the twenty-first century.
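As an illustration of the geometric diagnostics, the sketch below computes area, centroid and ellipticity from the 2D moments of a thresholded field on a regular grid. The grid, threshold and mock field are assumptions; the study itself applies equivalent moment diagnostics to potential vorticity fields.

```python
# Minimal sketch of 2-D moment diagnostics: area, centroid and ellipticity of
# a thresholded field on a regular grid. Grid, threshold and mock field are
# illustrative assumptions, not the study's configuration.
import numpy as np

def vortex_moments(field, x, y, threshold):
    w = np.where(field >= threshold, field, 0.0)          # vortex "mass"
    xx, yy = np.meshgrid(x, y)
    m00 = w.sum()
    xc, yc = (w * xx).sum() / m00, (w * yy).sum() / m00   # centroid
    # central second moments
    mu20 = (w * (xx - xc) ** 2).sum() / m00
    mu02 = (w * (yy - yc) ** 2).sum() / m00
    mu11 = (w * (xx - xc) * (yy - yc)).sum() / m00
    # eigenvalues of the covariance matrix give the ellipse axes
    common = np.sqrt(4 * mu11 ** 2 + (mu20 - mu02) ** 2)
    lam1 = (mu20 + mu02 + common) / 2
    lam2 = (mu20 + mu02 - common) / 2
    area = float((w > 0).sum())                           # grid-cell count
    ellipticity = np.sqrt(lam1 / lam2)                    # aspect ratio >= 1
    return area, (xc, yc), ellipticity

x = y = np.linspace(-10, 10, 101)
xx, yy = np.meshgrid(x, y)
pv = np.exp(-(xx ** 2 / 40 + yy ** 2 / 10))               # elongated mock vortex
print(vortex_moments(pv, x, y, threshold=0.3))
```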
Abstract:
Diaminofluoresceins are widely used probes for detection and intracellular localization of NO formation in cultured/isolated cells and intact tissues. The fluorinated derivative, 4-amino-5-methylamino-2′,7′-difluorofluorescein (DAF-FM), has gained increasing popularity in recent years due to its improved NO-sensitivity, pH-stability, and resistance to photo-bleaching compared to the first-generation compound, DAF-2. Detection of NO production by either reagent relies on conversion of the parent compound into a fluorescent triazole, DAF-FM-T and DAF-2-T, respectively. While this reaction is specific for NO and/or reactive nitrosating species, it is also affected by the presence of oxidants/antioxidants. Moreover, the reaction with other molecules can lead to the formation of fluorescent products other than the expected triazole. Thus, additional controls and structural confirmation of the reaction products are essential. Using human red blood cells as an exemplary cellular system, we describe robust protocols for the analysis of intracellular DAF-FM-T formation using an array of fluorescence-based methods (laser-scanning fluorescence microscopy, flow cytometry and fluorimetry) and analytical separation techniques (reversed-phase HPLC and LC-MS/MS). When used in combination, these assays afford unequivocal identification of the fluorescent signal as being derived from NO and are applicable to most other cellular systems without or with only minor modifications.
Abstract:
An automated cloud band identification procedure is developed that captures the meteorology of such events over southern Africa. This “metbot” is built upon a connected component labelling method that enables blob detection in various atmospheric fields. Outgoing longwave radiation is used to flag candidate cloud band days by thresholding the data and requiring detected blobs to have sufficient latitudinal extent and exhibit positive tilt. The Laplacian operator is used on gridded reanalysis variables to highlight other features of meteorological interest. The ability of this methodology to capture the significant meteorology and rainfall of these synoptic systems is tested in a case study. The usefulness of the metbot in understanding event-to-event similarities of meteorological features is demonstrated, highlighting features previous studies have noted as key ingredients to cloud band development in the region. Moreover, this allows the presentation of a composite cloud band life cycle for southern Africa events. The potential of the metbot to study multiscale interactions is discussed, emphasising its key strength: the ability to retain details of extreme and infrequent events. It automatically builds a database that is ideal for research questions focused on the influence of intraseasonal to interannual variability processes on synoptic events. Application of the method to convergence zone studies and atmospheric river descriptions is suggested. In conclusion, a relation-building metbot can retain details that are often lost with object-based methods but are crucial in case studies. Capturing and summarising these details may be necessary to develop deeper process-level understanding of multiscale interactions.
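A minimal sketch of the blob-detection step follows, assuming scipy's connected-component labelling, an illustrative OLR threshold, and a covariance-sign test for tilt; none of these values are taken from the paper's exact configuration.

```python
# Minimal sketch of the blob-detection step: threshold an OLR-like field,
# label connected components, then keep blobs with sufficient latitudinal
# extent and positive tilt (here judged by the sign of the lon-lat covariance
# of blob pixels). Threshold and extent values are illustrative assumptions.
import numpy as np
from scipy import ndimage

def cloud_band_blobs(olr, lats, lons, thresh=220.0, min_lat_extent=10.0):
    mask = olr < thresh                          # deep convection: low OLR
    labels, n = ndimage.label(mask, structure=np.ones((3, 3)))
    keep = []
    for lbl in range(1, n + 1):
        iy, ix = np.nonzero(labels == lbl)
        lat_pts, lon_pts = lats[iy], lons[ix]
        extent = lat_pts.max() - lat_pts.min()   # latitudinal extent (deg)
        # tilt sign: longitude increasing with latitude across the blob
        tilt = np.cov(lon_pts, lat_pts)[0, 1] if iy.size > 1 else 0.0
        if extent >= min_lat_extent and tilt > 0:
            keep.append(lbl)
    return labels, keep

lats = np.linspace(-40.0, 0.0, 81)               # 0.5-degree grid
lons = np.linspace(0.0, 60.0, 121)
olr = np.full((81, 121), 260.0)
for k in range(40):                              # synthetic tilted cloud band
    olr[20 + k, 30 + k // 2] = 200.0
labels, keep = cloud_band_blobs(olr, lats, lons)
print(f"{len(keep)} candidate cloud band(s) detected")
```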
Abstract:
The group of haemosporidian parasites is of general interest to basic and applied science, since several species infect mammals, leading to malaria and associated disease symptoms. Although the great majority of haemosporidian parasites occur in bird hosts, as in the case of Leucocytozoon buteonis, there is little genomic information about genetic aspects of their co-evolution with hosts. Consequently, there is a pressing need for parasite-enrichment strategies enabling further analyses of the genomes, in particular without exposure to DNA-intercalating dyes. Here, we used flow cytometry without an additional labelling step to enrich L. buteonis from infected buzzard blood. A specific, defined area of two-dimensional scattergrams was sorted and the fraction was further analysed. The successful enrichment of L. buteonis in the sorted fraction was demonstrated by Giemsa staining and qPCR, revealing a clear increase of parasite-specific genes, while host-specific genes were significantly decreased. This is the first report describing a labelling-free enrichment approach for L. buteonis from infected buzzard blood. The enrichment of parasites presented here is free of nucleic acid-intercalating dyes, which may interfere with fluorescence-based methods or subsequent sequencing approaches.
Abstract:
This paper describes a study of the use of immersive virtual reality technologies in the design of a new hospital. It uses Schön's concept of reflective practice and video-based methods to analyse the ways design teams approach and employ a full-scale 3D immersive environment, a CAVE, in collaborative design work. The analysis describes four themes relating to reflective practice occurring in the setting: orienting to the CAVE technology itself; orienting to the representation of the specific design within the CAVE; activities accounting for, or exploring alternatives within, the design for the use and users of the space; and more strategic interactions around how best to represent the design and model to the client within the CAVE setting. The analysis also reveals some unique aspects of design work in this environment. Perhaps most significantly, rather than enhancing or adding to an existing understanding of design through paper-based or non-immersive digital representations, the CAVE often acts to challenge or surprise the participants as they experience the immersive, full-scale version of their own design.
Abstract:
Using the novel technique of topic modelling, this paper examines thematic patterns and their changes over time in a large corpus of corporate social responsibility (CSR) reports produced in the oil sector. Whereas previous research on corporate communications has been small-scale or interested in selected lexical aspects and thematic categories identified ex ante, our approach allows thematic patterns to emerge from the data. The analysis reveals a number of major trends and topic shifts pointing to changing practices of CSR. Nowadays 'people', 'communities' and 'rights' seem to be given more prominence, whereas 'environmental protection' appears to be less relevant. Using more established corpus-based methods, we subsequently explore two top phrases, 'human rights' and 'climate change', that were identified as representative of the shifting thematic patterns. Our approach strikes a balance between purely quantitative and qualitative methodologies and offers applied linguists new ways of exploring discourse in large collections of texts.
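As a hedged sketch of the topic-modelling step, the example below fits a two-topic LDA model to a toy corpus with scikit-learn; the study's actual corpus of oil-sector CSR reports, its preprocessing and its model settings are not reproduced here.

```python
# Minimal sketch of topic modelling with latent Dirichlet allocation (LDA)
# on a toy corpus. Documents, topic count and settings are illustrative
# assumptions, not the study's configuration.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "human rights communities local people engagement",
    "climate change emissions carbon reduction energy",
    "communities people rights development support",
    "emissions climate carbon energy change targets",
]

vec = CountVectorizer()
X = vec.fit_transform(docs)                      # document-term counts
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-4:][::-1]]
    print(f"topic {k}: {' '.join(top)}")         # top 4 terms per topic
```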
Abstract:
For the tracking of extrema associated with weather systems to be applied to a broad range of fields, it is necessary to remove a background field that represents the slowly varying, large spatial scales. The sensitivity of the tracking analysis to the form of background field removed is explored for the Northern Hemisphere winter storm tracks for three contrasting fields from an integration of the U.K. Met Office's (UKMO) Hadley Centre Climate Model (HadAM3). Several methods of removing a background field are explored, from the simple subtraction of the climatology to the more sophisticated removal of the planetary scales. Two temporal filters are also considered, in the form of a 2-6-day Lanczos filter and a 20-day high-pass Fourier filter. The analysis indicates that the simple subtraction of the climatology tends to change the nature of the systems to the extent that there is a redistribution of the systems relative to the climatological background, resulting in very similar statistical distributions for both positive and negative anomalies. The optimal planetary wave filter removes total wavenumbers less than or equal to a number in the range 5-7, resulting in distributions more easily related to particular types of weather system. For the temporal filters, the 2-6-day bandpass filter is found to have a detrimental impact on the individual weather systems, resulting in the storm tracks having a weak waveguide type of behavior. The 20-day high-pass temporal filter is less aggressive than the 2-6-day filter and produces results falling between those of the climatological and 2-6-day filters.
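To make the temporal filtering concrete, the sketch below constructs Lanczos low-pass weights (after Duchon, 1979), combines them into a 2-6-day band-pass, and applies the filter by convolution. The window length and daily sampling are illustrative assumptions, and the planetary-wave (spatial) filter is not shown.

```python
# Hedged sketch of the temporal filtering step: Lanczos low-pass weights
# (Duchon 1979), combined by differencing into a 2-6-day band-pass and
# applied to a daily series by convolution. Window length and sampling are
# illustrative assumptions.
import numpy as np

def lanczos_lowpass_weights(window, cutoff):
    """Weights for a low-pass Lanczos filter; cutoff in cycles per time step."""
    n = (window - 1) // 2
    k = np.arange(-n, n + 1)
    w = np.zeros(window)
    w[n] = 2.0 * cutoff
    kk = k[k != 0]
    sigma = np.sin(np.pi * kk / n) / (np.pi * kk / n)   # Lanczos sigma factor
    w[k != 0] = np.sin(2.0 * np.pi * cutoff * kk) / (np.pi * kk) * sigma
    return w

window = 31
# band-pass = difference of two low-passes (cutoff periods of 2 and 6 days)
w_band = lanczos_lowpass_weights(window, 1 / 2) - lanczos_lowpass_weights(window, 1 / 6)

rng = np.random.default_rng(1)
t = np.arange(360)
series = (np.sin(2 * np.pi * t / 4)          # 4-day synoptic signal (kept)
          + np.sin(2 * np.pi * t / 60)       # 60-day background (removed)
          + 0.1 * rng.standard_normal(t.size))
filtered = np.convolve(series, w_band, mode="same")
print(filtered[window:window + 5])
```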