46 results for Computer Science, Interdisciplinary Applications
Abstract:
Motivation: Conformational flexibility is essential to the function of many proteins, e.g. catalytic activity. To assist efforts in determining and exploring the functional properties of a protein, it is desirable to automatically identify regions that are prone to undergo conformational changes. It was recently shown that a probabilistic predictor of continuum secondary structure is more accurate than categorical predictors for structurally ambivalent sequence regions, suggesting that such models are well suited to characterizing protein flexibility. Results: We develop a computational method for identifying regions that are prone to conformational change directly from the amino acid sequence. The method uses the entropy of the probabilistic output of an 8-class continuum secondary structure predictor. Results for 171 unique amino acid sequences with well-characterized variable structure (identified in the 'Macromolecular movements database') indicate that the method is highly sensitive in identifying flexible protein regions, although false positives remain a problem. The method can be used to explore the conformational flexibility of proteins (including hypothetical or synthetic ones) whose structure is yet to be determined experimentally.
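A minimal sketch of the entropy score this abstract describes, in Python. The predictor itself is not reproduced; class_probs stands in for its per-residue probabilities over the 8 secondary structure classes, and the 1.5-bit threshold is illustrative, not the paper's calibrated cut-off.

    import numpy as np

    def flexibility_scores(class_probs):
        """Shannon entropy (bits) of per-residue probabilities over the
        8 continuum secondary structure classes; higher entropy marks
        structurally ambivalent, potentially flexible regions."""
        p = np.clip(class_probs, 1e-12, 1.0)   # guard against log2(0)
        return -(p * np.log2(p)).sum(axis=1)

    # Example: 5 residues x 8 classes (rows already sum to 1);
    # the maximum possible entropy is log2(8) = 3 bits.
    probs = np.random.dirichlet(np.ones(8), size=5)
    flexible = flexibility_scores(probs) > 1.5   # illustrative threshold
    print(flexible)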
Abstract:
n-Octyl-beta-D-glucopyranoside (OG) is a non-ionic glycolipid that is widely used in biotechnical and biochemical applications. All-atom molecular dynamics simulations in explicit solvent, started from two different sets of initial coordinates and velocities, have been performed to characterize the structural behaviour of an OG aggregate under equilibrium conditions. Geometric packing properties determined from the simulations and from small-angle neutron scattering experiments indicate that OG micelles are more likely to adopt a non-spherical shape, even in the concentration range near the critical micelle concentration (0.025 M). Despite a few large deviations in the principal moment of inertia ratios, the average micelle shape calculated from both simulations is a prolate ellipsoid. The deviations at these time scales presumably reflect temporary shape changes of the micelle. However, the size of the micelle and the accessible surface areas remained constant during the simulations, with the micelle surface being rough and partially elongated. Radial distribution functions computed for the hydroxyl oxygen atoms of OG show sharper peaks at the minimum van der Waals contact distance than those for the acetal oxygen, ring oxygen, and anomeric carbon atoms. This result indicates that the hydroxyl oxygen atoms point outwards at the hydrophilic/hydrophobic interface, form hydrogen bonds with the water molecules, and thus hydrate the micelle surface effectively. (c) 2005 Elsevier Inc. All rights reserved.
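The prolate/oblate/spherical distinction rests on ratios of the principal moments of inertia, which can be computed from a trajectory snapshot. A sketch, assuming coords (N x 3) and masses (length N) are NumPy arrays holding the aggregate's atomic coordinates and masses for one frame:

    import numpy as np

    def inertia_ratios(coords, masses):
        """Ratios of the sorted principal moments of inertia I1 <= I2 <= I3.
        Both ratios near 1: roughly spherical; I1 << I2 ~ I3: prolate;
        I1 ~ I2 << I3: oblate."""
        com = np.average(coords, axis=0, weights=masses)
        r = coords - com
        r2 = (r ** 2).sum(axis=1)
        # inertia tensor: sum_i m_i * (|r_i|^2 * I - r_i r_i^T)
        tensor = (masses[:, None, None] *
                  (r2[:, None, None] * np.eye(3)
                   - r[:, :, None] * r[:, None, :])).sum(axis=0)
        i1, i2, i3 = np.sort(np.linalg.eigvalsh(tensor))
        return i2 / i1, i3 / i1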
Abstract:
This article presents various novel and conventional planar electromagnetic bandgap (EBG)-assisted transmission lines. Both microstrip lines and coplanar waveguides (CPWs) are designed with circular, rectangular, annular, plus-sign-shaped, and fractal-patterned EBGs and with a dumbbell-shaped defected ground structure (DGS). The dispersion characteristics and the slow-wave factors of the designs are investigated. (c) 2006 Wiley Periodicals, Inc.
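For context, the slow-wave factor reported for such lines is the normalised phase constant beta/k0; a standard way to extract it from a simulated or measured transmission phase is sketched below (the line length and S21 phase are assumed inputs, not values from the article):

    import numpy as np

    C0 = 299_792_458.0  # speed of light, m/s

    def slow_wave_factor(freq_hz, s21_phase_rad, length_m):
        """beta/k0 from the unwrapped transmission phase of a uniform
        line section: beta = -phase/length, k0 = 2*pi*f/c."""
        beta = -np.unwrap(np.asarray(s21_phase_rad)) / length_m
        k0 = 2 * np.pi * np.asarray(freq_hz) / C0
        return beta / k0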
Abstract:
Eukaryotic genomes display segmental patterns of variation in various properties, including GC content and degree of evolutionary conservation. DNA segmentation algorithms are aimed at identifying statistically significant boundaries between such segments. Such algorithms may provide a means of discovering new classes of functional elements in eukaryotic genomes. This paper presents a model and an algorithm for Bayesian DNA segmentation and considers the feasibility of using it to segment whole eukaryotic genomes. The algorithm is tested on a range of simulated and real DNA sequences, and the following conclusions are drawn. Firstly, the algorithm correctly identifies non-segmented sequence, and can thus be used to reject the null hypothesis of uniformity in the property of interest. Secondly, estimates of the number and locations of change-points produced by the algorithm are robust to variations in algorithm parameters and initial starting conditions and correspond to real features in the data. Thirdly, the algorithm is successfully used to segment human chromosome 1 according to GC content, thus demonstrating the feasibility of Bayesian segmentation of eukaryotic genomes. The software described in this paper is available from the author's website (www.uq.edu.au/~uqjkeith/) or upon request to the author.
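The flavour of evidence such a segmentation weighs can be illustrated with a single maximum-likelihood change-point in GC content. This is only a frequentist stand-in for intuition, not the paper's Bayesian algorithm, which estimates the number and locations of multiple change-points under a prior:

    import numpy as np

    def best_gc_changepoint(sequence):
        """Position splitting the sequence into two segments of maximally
        different GC content, by Bernoulli log-likelihood gain."""
        x = np.array([c in "GC" for c in sequence.upper()], dtype=float)
        n, cs = len(x), np.cumsum(x)

        def loglik(k, s):                      # k sites, s of them G/C
            if s == 0 or s == k:
                return 0.0
            p = s / k
            return s * np.log(p) + (k - s) * np.log(1 - p)

        full = loglik(n, cs[-1])               # one-segment (null) model
        gains = [loglik(i, cs[i - 1])
                 + loglik(n - i, cs[-1] - cs[i - 1]) - full
                 for i in range(1, n)]
        return int(np.argmax(gains)) + 1, max(gains)

    print(best_gc_changepoint("ATATATATGCGCGCGCGG"))   # change-point at 8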
Abstract:
A set of techniques referred to as circular statistics has been developed for the analysis of directional and orientational data. The unit of measure for such data is angular (usually in either degrees or radians), and the statistical distributions underlying the techniques are characterised by their cyclic nature; for example, angles of 359.9 degrees are considered close to angles of 0 degrees. In this paper, we assert that such approaches can be easily adapted to analyse time-of-day and time-of-week data, and in particular daily cycles in the numbers of incidents reported to the police. We begin the paper by describing circular statistics. We then discuss how these may be modified, and demonstrate the approach with some examples for reported incidents in the Cardiff area of Wales. (c) 2005 Elsevier Ltd. All rights reserved.
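The adaptation is straightforward because a day maps onto a circle: 24 hours correspond to 2*pi radians. A minimal example of the circular mean of incident times, showing why it behaves correctly around midnight:

    import numpy as np

    def circular_mean_hour(hours):
        """Mean time of day with hours treated as angles on a 24 h circle."""
        ang = np.asarray(hours) * 2 * np.pi / 24
        mean_ang = np.arctan2(np.sin(ang).mean(), np.cos(ang).mean())
        return (mean_ang * 24 / (2 * np.pi)) % 24

    # Incidents at 23:00 and 01:00: the arithmetic mean says 12:00,
    # the circular mean correctly says midnight.
    print(circular_mean_hour([23.0, 1.0]))   # 0.0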
Abstract:
This paper presents a scientific and technical description of the modelling framework and the main results of modelling the long-term average sediment delivery at hillslope to medium-scale catchments over the entire Murray Darling Basin (MDB). A theoretical development that relates long-term averaged sediment delivery to the statistics of rainfall and catchment parameters is presented. The derived flood frequency approach was adapted to investigate the problem of regionalization of the sediment delivery ratio (SDR) across the Basin. SDR, a measure of catchment response to the upland erosion rate, was modelled by two lumped linear stores arranged in series: hillslope transport to the nearest streams and flow routing in the channel network. The theory shows that the ratio of catchment sediment residence time (SRT) to average effective rainfall duration is the most important control on the sediment delivery processes. In this study, catchment SRTs were estimated using the travel time for overland flow multiplied by an enlargement factor that is a function of particle size. Rainfall intensity and effective duration statistics were regionalized using long-term measurements from 195 pluviograph sites within and around the Basin. Finally, the model was implemented across the MDB using spatially distributed soil, vegetation, topographical and land use properties within a Geographic Information System (GIS) environment. The results predict strong variations in SDR, from close to 0 in floodplains to 70% in the eastern uplands of the Basin. (c) 2005 Elsevier Ltd. All rights reserved.
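As a worked illustration of the controlling ratio, a plausible first-order form for two linear stores in series attenuates delivery by a factor of 1/(1 + residence time / rain duration) per store. This is a hedged sketch of the general idea only, not the Basin-scale model's exact derivation:

    def sdr_two_stores(t_hillslope, t_channel, t_rain):
        """Illustrative sediment delivery ratio for hillslope and channel
        stores in series; all arguments share the same time units.
        SDR approaches 1 when residence times are short relative to the
        average effective rainfall duration."""
        return 1.0 / ((1.0 + t_hillslope / t_rain)
                      * (1.0 + t_channel / t_rain))

    print(sdr_two_stores(t_hillslope=2.0, t_channel=10.0, t_rain=6.0))  # ~0.28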
Abstract:
Irrigation practices that are profligate in their use of water have come under closer scrutiny by water managers and the public. Trickle irrigation has the potential to increase water use efficiency, but only if the system is designed to meet the soil and plant conditions. Recently we have provided a software tool, WetUp (http://www.clw.csiro.au/products/wetup/), to calculate the wetting patterns produced by trickle irrigation emitters. WetUp uses an analytical solution to calculate the wetted perimeter for both buried and surface emitters. This analytical solution makes a number of assumptions, two of which are that the wetting front is defined by the water content at which the hydraulic conductivity (K) is 1 mm day(-1) and that the flow occurs from a point source. Here we compare the wetting patterns calculated with the analytical solution against those from HYDRUS2D, a two-dimensional numerical model of water flow, for typical soils. The results show that the wetting patterns are similar, except when the soil properties mean that the assumption of a point source is no longer a good description of the flow regime. Difficulties were also experienced in obtaining stable HYDRUS2D solutions for soils with low hydraulic conductivities. (c) 2005 Elsevier Ltd. All rights reserved.
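For orientation, a crude volume-balance estimate of the wetting-front radius around a point source can be written down directly. Unlike WetUp's analytical solution it ignores capillarity, gravity and the K = 1 mm/day front definition, so it is only a rough sanity check:

    import math

    def wetted_radius_m(flow_l_per_h, hours, delta_theta, buried=True):
        """Radius reached if the applied volume wets a sphere (buried
        emitter) or hemisphere (surface emitter) of soil whose volumetric
        water content rises by delta_theta."""
        volume_m3 = flow_l_per_h * hours / 1000.0
        coef = 4.0 / 3.0 if buried else 2.0 / 3.0   # sphere vs hemisphere
        return (volume_m3 / (coef * math.pi * delta_theta)) ** (1.0 / 3.0)

    print(wetted_radius_m(2.0, 24.0, 0.2))   # ~0.39 m for a buried emitter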
Abstract:
For second-hand products sold with warranty, the expected warranty cost of an item to the manufacturer depends on (i) the age and/or usage as well as the maintenance history of the item and (ii) the terms of the warranty policy. The paper develops probabilistic models to compute the expected warranty cost to the manufacturer when the items are sold with free replacement or pro rata warranties. (C) 2000 Elsevier Science Ltd. All rights reserved.
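A textbook special case shows the structure of such a model: under a free-replacement warranty with minimal repair and a power-law (Weibull) failure intensity, the expected cost is the repair cost times the expected number of failures over the warranty interval, which grows with the item's age at sale. The parameters below are illustrative, not the paper's specific models, which also account for usage and maintenance history:

    def expected_fr_cost(age, warranty, c_repair, beta, eta):
        """Expected free-replacement warranty cost for a second-hand item
        of age `age`, with cumulative failure intensity L(t) = (t/eta)**beta
        (minimal repair), so expected failures over (age, age + warranty]
        equal L(age + warranty) - L(age)."""
        failures = ((age + warranty) / eta) ** beta - (age / eta) ** beta
        return c_repair * failures

    # An older item accrues more expected failures over the same warranty:
    print(expected_fr_cost(age=2.0, warranty=1.0, c_repair=100.0,
                           beta=2.0, eta=5.0))   # 20.0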
Abstract:
The particle-based lattice solid model developed to study the physics of rocks and the nonlinear dynamics of earthquakes is refined by incorporating intrinsic friction between particles. The model provides a means for studying the causes of seismic wave attenuation, as well as frictional heat generation, fault zone evolution, and localisation phenomena. A modified velocity-Verlet scheme that allows friction to be precisely modelled is developed. This is a difficult computational problem given that a discontinuity must be accurately simulated by the numerical approach (i.e., the transition from static to dynamic frictional behaviour). This is achieved using a half time step integration scheme. At each half time step, a nonlinear system is solved to compute the static frictional forces and states of touching particle pairs. Improved efficiency is achieved by adaptively adjusting the time step increment, depending on the particle velocities in the system. The total energy is calculated and verified to remain constant to a high precision during simulations. Numerical experiments show that the model can be applied to the study of earthquake dynamics, the stick-slip instability, heat generation, and fault zone evolution. Such experiments may lead to a conclusive resolution of the heat flow paradox and improved understanding of earthquake precursory phenomena and dynamics. (C) 1999 Academic Press.
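The half-time-step structure can be sketched generically: a velocity-Verlet step naturally exposes a half step at which the static frictional forces of touching particle pairs can be solved before the velocities are completed. The acceleration function and the adaptive rule below are schematic placeholders, not the lattice solid model's actual force laws:

    import numpy as np

    def velocity_verlet_step(x, v, accel, dt):
        """One velocity-Verlet step written with explicit half steps."""
        v_half = v + 0.5 * dt * accel(x)        # first half step
        x_new = x + dt * v_half
        # (in the lattice solid model, the nonlinear solve for static
        #  frictional forces of touching pairs would sit at this half step)
        v_new = v_half + 0.5 * dt * accel(x_new)  # second half step
        return x_new, v_new

    def adaptive_dt(v, dt_max, v_ref):
        """Shrink the step when the fastest particle exceeds a reference
        speed, mirroring the velocity-dependent step adjustment."""
        v_max = float(np.max(np.abs(v)))
        return dt_max if v_max <= v_ref else dt_max * v_ref / v_max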
Abstract:
Objective: To evaluate the reliability and validity of a brief physical activity assessment tool suitable for doctors to use to identify inactive patients in the primary care setting. Methods: Volunteer family doctors (n = 8) screened consenting patients (n = 75) for physical activity participation using a brief physical activity assessment tool. Inter-rater reliability was assessed within one week (n = 71). Validity was assessed against an objective physical activity monitor (the Computer Science and Applications (CSA) accelerometer; n = 42). Results: The brief physical activity assessment tool produced repeatable estimates of sufficient total physical activity, correctly classifying over 76% of cases (kappa 0.53, 95% confidence interval (CI) 0.33 to 0.72). The validity coefficient was reasonable (kappa 0.40, 95% CI 0.12 to 0.69), with good percentage agreement (71%). Conclusions: The brief physical activity assessment tool is a reliable instrument, with validity similar to that of more detailed self report measures of physical activity. It is a tool that can be used efficiently in routine primary healthcare services to identify insufficiently active patients who may need physical activity advice.
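The agreement statistic quoted above is Cohen's kappa, which corrects raw percentage agreement for chance. For two binary classifications ('sufficiently active' = 1, 'inactive' = 0) it can be computed as follows:

    def cohens_kappa(a, b):
        """Chance-corrected agreement between two binary ratings."""
        n = len(a)
        p_obs = sum(x == y for x, y in zip(a, b)) / n
        pa, pb = sum(a) / n, sum(b) / n
        p_exp = pa * pb + (1 - pa) * (1 - pb)   # agreement expected by chance
        return (p_obs - p_exp) / (1 - p_exp)

    print(cohens_kappa([1, 1, 0, 1], [1, 0, 0, 1]))   # 0.5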
Abstract:
Online geographic information systems provide the means to extract a subset of desired spatial information from a larger remote repository. Data retrieved representing real-world geographic phenomena are then manipulated to suit the specific needs of an end-user. Often this extraction requires the derivation of representations of objects specific to a particular resolution or scale from a single original stored version. Currently, standard spatial data handling techniques cannot support the multi-resolution representation of such features in a database. In this paper a methodology to store and retrieve versions of spatial objects at different resolutions with respect to scale, using standard database primitives and SQL, is presented. The technique involves heavy fragmentation of spatial features, which allows dynamic simplification into scale-specific object representations customised to the display resolution of the end-user's device. Experimental results comparing the new approach to traditional R-Tree indexing and external object simplification reveal that the new approach performs notably better for mobile and WWW applications, where client-side resources are limited and retrieved data loads are kept relatively small.
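A hypothetical relational sketch of the idea, using SQLite from Python: each fragment carries the scale band in which it is shown, so a single query returns exactly the pieces needed to assemble a representation at the client's display scale. Table and column names are illustrative, not the paper's schema:

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""
        CREATE TABLE fragment (
            feature_id INTEGER,   -- original spatial feature
            seq        INTEGER,   -- reassembly order within the feature
            min_scale  REAL,      -- scale band in which the fragment appears
            max_scale  REAL,
            wkt        TEXT       -- fragment geometry, well-known text
        )""")

    def fragments_at_scale(con, scale):
        """Fetch only the fragments visible at the requested display scale."""
        return con.execute(
            "SELECT feature_id, seq, wkt FROM fragment "
            "WHERE min_scale <= ? AND ? < max_scale "
            "ORDER BY feature_id, seq", (scale, scale)).fetchall()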
Abstract:
Genetic algorithms (GAs) are known to locate the global optimal solution provided a sufficient population size and/or number of generations is used. In practice, a near-optimal, satisfactory result can be found by GAs within a limited number of generations. In wireless communications, the exhaustive searching approach is widely applied in many techniques, such as maximum likelihood decoding (MLD) and distance spectrum (DS) techniques. The complexity of the exhaustive searching approach in the MLD or DS technique is exponential in the number of transmit antennas and the size of the signal constellation for multiple-input multiple-output (MIMO) communication systems. If a large number of antennas and large signal constellations, e.g. PSK and QAM, are employed in the MIMO systems, the exhaustive searching approach becomes impractical and time consuming. In this paper, GAs are applied to the MLD and DS techniques to provide near-optimal performance with reduced computational complexity for MIMO systems. Two different GA-based efficient searching approaches are proposed, for the MLD and DS techniques respectively. The first approach is based on a GA with a sharing function method, which is employed to locate the multiple solutions of the distance spectrum for Space-time Trellis Coded Orthogonal Frequency Division Multiplexing (STTC-OFDM) systems. The second approach is a GA-based MLD that attempts to find the closest point to the transmitted signal; it can return a satisfactory result provided a good initial signal vector is supplied to the GA. Simulation results show that the proposed GA-based efficient searching approaches achieve near-optimal performance with a lower searching complexity compared with the original MLD and DS techniques for MIMO systems.
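A compact sketch of the GA-based MLD idea: candidate transmit vectors are index strings into the constellation, and fitness is the ML metric ||y - Hx||^2. The operators below (truncation selection, uniform crossover, random-symbol mutation) are generic stand-ins for illustration; the paper's DS search additionally uses a sharing function to retain multiple solutions:

    import numpy as np

    rng = np.random.default_rng(0)

    def ga_mld(y, H, constellation, pop=50, gens=100, p_mut=0.1):
        """Approximate argmin over constellation symbol vectors of
        ||y - H x||^2, i.e. the ML detection metric."""
        n_tx = H.shape[1]
        # population of index vectors into the constellation
        P = rng.integers(0, len(constellation), size=(pop, n_tx))

        def cost(idx):
            r = y - H @ constellation[idx]
            return float(np.real(np.vdot(r, r)))

        for _ in range(gens):
            P = P[np.argsort([cost(ind) for ind in P])]
            elite = P[: pop // 2]                       # truncation selection
            k = pop - len(elite)
            pa = elite[rng.integers(0, len(elite), k)]
            pb = elite[rng.integers(0, len(elite), k)]
            children = np.where(rng.random((k, n_tx)) < 0.5, pa, pb)
            mut = rng.random(children.shape) < p_mut    # symbol mutation
            children[mut] = rng.integers(0, len(constellation), int(mut.sum()))
            P = np.vstack([elite, children])
        return constellation[min(P, key=cost)]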