884 results for 080403 Data Structures


Relevance:

30.00%

Publisher:

Abstract:

Background: The structure of proteins may change as a result of the inherent flexibility of some protein regions. We develop and explore probabilistic machine learning methods for predicting a continuum secondary structure, i.e. assigning probabilities to the conformational states of a residue. We train our methods using data derived from high-quality NMR models. Results: Several probabilistic models not only successfully estimate the continuum secondary structure, but also provide a categorical output on par with models directly trained on categorical data. Importantly, models trained on the continuum secondary structure are also better than their categorical counterparts at identifying the conformational state for structurally ambivalent residues. Conclusion: Cascaded probabilistic neural networks trained on the continuum secondary structure exhibit better accuracy in structurally ambivalent regions of proteins, while sustaining an overall classification accuracy on par with standard, categorical prediction methods.
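The contrast the abstract draws between a continuum (probabilistic) and a categorical secondary-structure assignment can be illustrated with a minimal sketch; the state set, scoring, and numbers below are hypothetical illustrations, not the paper's model:

```python
import math

STATES = ("H", "E", "C")  # helix, strand, coil (a common three-state alphabet)

def softmax(scores):
    """Convert raw per-state scores into a probability distribution."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def continuum_to_categorical(probs):
    """Collapse a continuum assignment to a single categorical state."""
    return STATES[max(range(len(probs)), key=probs.__getitem__)]

# Hypothetical network scores for one residue
probs = softmax([2.0, 0.5, 1.0])
print(probs, continuum_to_categorical(probs))
```

A structurally ambivalent residue would show a flat distribution (e.g. probabilities near 1/3 each), information that the categorical label alone discards.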

Relevance:

30.00%

Publisher:

Abstract:

Objective. To compare mortality burden estimates based on direct measurement of levels and causes in communities with indirect estimates based on combining health facility cause-specific mortality structures with community measurement of mortality levels. Methods. Data from sentinel vital registration (SVR) with verbal autopsy (VA) were used to determine the cause-specific mortality burden at the community level in two areas of the United Republic of Tanzania. Proportional cause-specific mortality structures from health facilities were applied to counts of deaths obtained by SVR to produce modelled estimates. The burden was expressed in years of life lost. Findings. A total of 2884 deaths were recorded from health facilities and 2167 from SVR/VAs. In the perinatal and neonatal age group, cause-specific mortality rates were dominated by perinatal conditions and stillbirths in both the community and the facility data. The modelled estimates for chronic causes were very similar to those from SVR/VA. Acute febrile illnesses were coded more specifically in the facility data than in the VA. Injuries were more prevalent in the SVR/VA data than in the facility data. Conclusion. In this setting, improved International Classification of Diseases and Related Health Problems, tenth revision (ICD-10) coding practices and the application of facility-based cause structures to community death counts derived from SVR appear to produce estimates of the cause-specific mortality burden in those aged 5 years and older that agree reasonably well with estimates determined directly from VA. For the perinatal and neonatal age group, VA appears to be required. Use of this approach in a nationally representative sample of facilities may produce reliable national estimates of the cause-specific mortality burden for leading causes of death in adults.
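The modelling step described, applying facility cause-specific proportions to SVR death counts, amounts to simple proportional allocation. A sketch with invented counts (not the study's data):

```python
# Hypothetical facility death counts by cause, and a hypothetical
# community death count from sentinel vital registration (SVR).
facility_deaths = {"malaria": 120, "injuries": 15, "chronic": 65}
svr_total_deaths = 400

# Facility cause structure: proportion of facility deaths per cause.
total = sum(facility_deaths.values())
proportions = {cause: n / total for cause, n in facility_deaths.items()}

# Modelled community estimate: scale the proportions by the SVR count.
modelled = {cause: round(p * svr_total_deaths, 1) for cause, p in proportions.items()}
print(modelled)
```

The same scaling could be repeated per age group; the study's finding is that this works for those aged 5 and older but not for perinatal/neonatal deaths, where facility and community cause structures diverge.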

Relevance:

30.00%

Publisher:

Abstract:

Our studies of the teeth and faces of Australian twins commenced at the School of Dentistry, The University of Adelaide in the early 1980s. There are now over 900 pairs of twins enrolled in our continuing investigations, together with 1200 relatives. There are 3 main cohorts of participants. The first cohort comprises around 300 pairs of teenage twins for whom various records have been collected, including dental casts, facial photographs, finger and palm prints and information on laterality, including handedness. The second cohort comprises around 300 pairs of twins who have been examined at 3 stages of dental development from approximately 4 years of age to about 14 years: at primary, mixed, and permanent dentition (excluding 3rd molars) stages. The most recent study of tooth emergence and oral health, for which we are currently recruiting twins, will provide a third cohort of around 500 twin pairs aged from around birth to 3 to 4 years of age. Our broad aim in these studies has been to improve our understanding of how genetic and environmental factors contribute to variation in dental and facial features, and to oral health. We have also used our data to investigate aspects of the determination of laterality, particularly the fascinating phenomenon of mirror imaging. We plan to maximize the use of the longitudinal data and DNA we have collected, and continue to collect, by performing genome-wide scans for putative genetic linkage peaks for a range of dental features, and then to test for association between a series of likely candidate genes and our phenotypes.

Relevance:

30.00%

Publisher:

Abstract:

We use molecular dynamics simulations to compare the conformational structure and dynamics of a 21-base-pair RNA sequence initially constructed according to the canonical A-RNA and A'-RNA forms, in the presence of counterions and explicit water. Our study aims to add a dynamical perspective to the solid-state structural information that has been derived from X-ray data for these two characteristic forms of RNA. Analysis of the three main structural descriptors commonly used to differentiate between the two forms of RNA, namely major groove width, inclination, and the number of base pairs in a helical twist, over a 30 ns simulation period reveals a flexible structure in aqueous solution, with fluctuations in the values of these structural parameters encompassing, and exceeding, the range between the two crystal forms. This provides evidence to suggest that the identification of distinct A-RNA and A'-RNA structures, while relevant in the crystalline form, may not be generally relevant in the context of RNA in the aqueous phase. The apparent structural flexibility observed in our simulations is likely to bear ramifications for the interactions of RNA with biological molecules (e.g. proteins) and non-biological molecules (e.g. non-viral gene delivery vectors). © CSIRO 2009.
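The central observation, that solution-phase fluctuations in each structural descriptor span the gap between the two crystal forms, reduces to a range check per descriptor. The trajectory values and reference values below are invented for illustration, not taken from the simulations:

```python
def encompasses(trajectory_values, form_a, form_b):
    """True if the simulated fluctuation range of a structural descriptor
    covers both crystal-form reference values."""
    lo, hi = min(trajectory_values), max(trajectory_values)
    return lo <= min(form_a, form_b) and hi >= max(form_a, form_b)

# Hypothetical major-groove widths sampled along a trajectory, with
# illustrative A-RNA and A'-RNA reference values for the same descriptor.
widths = [2.5, 4.1, 9.3, 11.0, 7.8, 3.2]
print(encompasses(widths, form_a=2.7, form_b=10.0))
```

If every descriptor passes this test, the two crystal forms are not separable states in solution but points inside one broad conformational ensemble, which is the paper's conclusion.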

Relevance:

30.00%

Publisher:

Abstract:

The modelling of mechanical structures using finite element analysis has become an indispensable stage in the design of new components and products. Once the theoretical design has been optimised, a prototype may be constructed and tested. What can the engineer do if the measured and theoretically predicted vibration characteristics of the structure are significantly different? This thesis considers the problems of changing the parameters of the finite element model to improve the correlation between a physical structure and its mathematical model. Two new methods are introduced to perform the systematic parameter updating. The first uses the measured modal model to derive the parameter values with minimum variance. The user must provide estimates for the variance of the theoretical parameter values and the measured data. Previous authors using similar methods have assumed that the estimated parameters and measured modal properties are statistically independent. This will generally be the case during the first iteration, but not subsequently. The second method updates the parameters directly from the frequency response functions. The order of the finite element model of the structure is reduced as a function of the unknown parameters. A method related to a weighted equation error algorithm is used to update the parameters. After each iteration the weighting changes, so that on convergence the output error is minimised. The suggested methods are extensively tested using simulated data. An H-frame is then used to demonstrate the algorithms on a physical structure.
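The underlying idea of iterative parameter updating, correcting model parameters until the predicted response matches the measured one, can be sketched on a one-degree-of-freedom system whose natural frequency is f = √(k/m)/(2π). This Newton-style update using the analytic sensitivity df/dk is a toy illustration, not the thesis's algorithm:

```python
import math

def natural_freq(k, m):
    """Natural frequency (Hz) of a 1-DOF mass-spring system."""
    return math.sqrt(k / m) / (2 * math.pi)

def update_stiffness(k0, m, f_measured, iters=20):
    """Iteratively correct the stiffness so the model frequency matches
    the measurement, using the sensitivity df/dk = f / (2k)."""
    k = k0
    for _ in range(iters):
        f = natural_freq(k, m)
        dfdk = f / (2 * k)
        k += (f_measured - f) / dfdk
    return k

# Simulated "measurement" from a system with known true stiffness.
m, k_true = 2.0, 800.0
f_meas = natural_freq(k_true, m)
print(round(update_stiffness(k0=500.0, m=m, f_measured=f_meas), 3))
```

The thesis works with many parameters and full modal or frequency-response data, where the same idea becomes a weighted least-squares problem at each iteration.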

Relevance:

30.00%

Publisher:

Abstract:

A periodic density functional theory method using the B3LYP hybrid exchange-correlation potential is applied to the Prussian blue analogue RbMn[Fe(CN)6] to evaluate the suitability of the method for studying, and predicting, the photomagnetic behavior of Prussian blue analogues and related materials. The method allows a correct description of the equilibrium structures of the different electronic configurations with regard to the cell parameters and bond distances. In agreement with the experimental data, the calculations show that the low-temperature phase (LT; Fe2+ (t2g^6, S = 0)-CN-Mn3+ (t2g^3 eg^1, S = 2)) is the stable phase at low temperature, rather than the high-temperature phase (HT; Fe3+ (t2g^5, S = 1/2)-CN-Mn2+ (t2g^3 eg^2, S = 5/2)). Additionally, the method gives an estimate of the entropy difference (HT − LT) of 143 J mol^-1 K^-1. The comparison of our calculations with experimental data from the literature and from our calorimetric and X-ray photoelectron spectroscopy measurements on the Rb0.97Mn[Fe(CN)6]0.98·1.03H2O compound is analyzed, and in general a satisfactory agreement is obtained. The method also predicts the metastable nature of the electronic configuration of the high-temperature phase, a necessary condition for photoinducing that phase at low temperatures. It gives a photoactivation energy of 2.36 eV, which is consistent with the photoinduced demagnetization produced by a green laser.

Relevance:

30.00%

Publisher:

Abstract:

Analyzing geographical patterns by collocating events, objects or their attributes has a long history in surveillance and monitoring, and is particularly applied in environmental contexts, such as ecology or epidemiology. The identification of patterns or structures at some scales can be addressed using spatial statistics, particularly marked point process methodologies. Classification and regression trees are also related to this goal of finding "patterns" by deducing the hierarchy of influence of variables on a dependent outcome. Such variable selection methods have been applied to spatial data, but often without explicitly acknowledging the spatial dependence. Many methods routinely used in exploratory point pattern analysis are second-order statistics used in a univariate context, though there is also a wide literature on modelling methods for multivariate point pattern processes. This paper proposes an exploratory approach for multivariate spatial data using higher-order statistics built from co-occurrences of events or marks given by the point processes. A spatial entropy measure, derived from these multinomial distributions of co-occurrences at a given order, constitutes the basis of the proposed exploratory methods. © 2010 Elsevier Ltd.
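A minimal sketch of the measure's ingredients, using order-2 co-occurrences (each event paired with its nearest neighbour's mark) and the Shannon entropy of the resulting multinomial counts; the point pattern and the nearest-neighbour definition of co-occurrence are our illustrative choices, not necessarily the paper's construction:

```python
import math
from collections import Counter

def nearest_neighbour_pairs(points, marks):
    """Order-2 co-occurrences: each event paired with the mark of its
    nearest neighbour (Euclidean distance), as an unordered pair."""
    pairs = []
    for i, p in enumerate(points):
        j = min((j for j in range(len(points)) if j != i),
                key=lambda j: (points[j][0] - p[0]) ** 2 + (points[j][1] - p[1]) ** 2)
        pairs.append(tuple(sorted((marks[i], marks[j]))))
    return pairs

def spatial_entropy(pairs):
    """Shannon entropy of the multinomial distribution of co-occurring marks."""
    counts = Counter(pairs)
    n = len(pairs)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

points = [(0, 0), (0, 1), (5, 5), (5, 6), (10, 0)]
marks = ["A", "A", "B", "A", "B"]
print(spatial_entropy(nearest_neighbour_pairs(points, marks)))
```

Low entropy indicates that the same mark combinations dominate locally (spatial segregation of marks); high entropy indicates well-mixed marks.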

Relevance:

30.00%

Publisher:

Abstract:

The principled statistical application of Gaussian random field models used in geostatistics has historically been limited to data sets of a small size. This limitation is imposed by the requirement to store and invert the covariance matrix of all the samples to obtain a predictive distribution at unsampled locations, or to use likelihood-based covariance estimation. Various ad hoc approaches to solve this problem have been adopted, such as selecting a neighborhood region and/or a small number of observations to use in the kriging process, but these have no sound theoretical basis and it is unclear what information is being lost. In this article, we present a Bayesian method for estimating the posterior mean and covariance structures of a Gaussian random field using a sequential estimation algorithm. By imposing sparsity in a well-defined framework, the algorithm retains a subset of “basis vectors” that best represent the “true” posterior Gaussian random field model in the relative entropy sense. This allows a principled treatment of Gaussian random field models on very large data sets. The method is particularly appropriate when the Gaussian random field model is regarded as a latent variable model, which may be nonlinearly related to the observations. We show the application of the sequential, sparse Bayesian estimation in Gaussian random field models and discuss its merits and drawbacks.

Relevance:

30.00%

Publisher:

Abstract:

This thesis is concerned with the investigation, by nuclear magnetic resonance spectroscopy, of the molecular interactions occurring in mixtures of benzene and cyclohexane to which either chloroform or deutero-chloroform has been added. The effect of the added polar molecule on the liquid structure has been studied using spin-lattice relaxation time, 1H chemical shift, and nuclear Overhauser effect measurements. The main purpose of the work has been to validate a model for molecular interaction involving local ordering of benzene around chloroform. A chemical method for removing dissolved oxygen from samples has been developed to encompass a number of types of sample, including quantitative mixtures, and its superiority over conventional deoxygenation techniques is demonstrated. A set of spectrometer conditions, the use of which produces the minimal variation in peak height in the steady state, is presented. To separate the general diluting effects of deutero-chloroform from its effects due to the production of local order, a series of mixtures involving carbon tetrachloride, instead of deutero-chloroform, has been used as non-interacting references. The effect of molecular interaction is shown to be explainable using a solvation model, whilst an approach involving 1:1 complex formation is shown not to account for the observations. It is calculated that each solvation shell, based on deutero-chloroform, contains about twelve molecules of benzene or cyclohexane. The equations produced to account for the T1 variations have been adapted to account for the 1H chemical shift variations in the same system. The shift measurements are shown to substantiate the solvent cage model with a cage capacity of twelve molecules around each chloroform molecule. Nuclear Overhauser effect data have been analysed quantitatively in a manner consistent with the solvation model. The results show that discrete shells only exist when the mole fraction of deutero-chloroform is below about 0.08.
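The twelve-molecule shell capacity and the ~0.08 mole-fraction threshold are mutually consistent: if each deutero-chloroform molecule needs twelve solvent molecules for a discrete shell, complete shells cannot all coexist once the chloroform mole fraction exceeds 1/13 ≈ 0.077. This back-of-envelope consistency check is our own illustration, not an argument made in the abstract:

```python
shell_capacity = 12  # benzene/cyclohexane molecules per CDCl3 shell (from the thesis)

# Mole fraction at which every CDCl3 molecule can just barely claim a
# full, non-overlapping shell: 1 chloroform per (1 + 12) molecules total.
x_max = 1 / (1 + shell_capacity)
print(round(x_max, 3))  # → 0.077
```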

Relevance:

30.00%

Publisher:

Abstract:

A series of ethylene propylene terpolymer vulcanizates, prepared by varying termonomer type, cure system, cure time and cure temperature, are characterized by determining the number and type of cross-links present. The termonomers used represent the types currently available in commercial quantities. Characterization is carried out by measuring the C1 constant of the Mooney-Rivlin-Saunders equation before and after treatment with the chemical probes propane-2-thiol/piperidine and n-hexane thiol/piperidine, thus making it possible to calculate the relative proportions of mono-sulphidic, di-sulphidic and poly-sulphidic cross-links. The cure systems used include both sulphur and peroxide formulations. Specific physical properties are determined for each network and an attempt is made to correlate observed changes in these with variations in network structure. A survey of the economics of each formulation, based on a calculated efficiency parameter for each cure system, is included. Values of C1 are calculated from compression modulus data after the reliability of the technique when used with ethylene propylene terpolymers had been established. This is done by comparing values from both compression and extension stress-strain measurements for natural rubber vulcanizates and by assessing the effects of sample dimensions and the degree of swelling. The compression modulus technique is shown to be much more widely applicable than previously thought. The basic structure of an ethylene propylene terpolymer network appears to be independent of the type of cure system used (sulphur-based systems only), the proportions of constituent cross-links being nearly constant.
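The probe arithmetic reduces to differencing C1 values, under the usual assumptions that C1 scales with cross-link density, that propane-2-thiol/piperidine cleaves only poly-sulphidic links, and that n-hexane thiol/piperidine cleaves both poly- and di-sulphidic links. The numerical C1 values below are invented for illustration:

```python
def crosslink_proportions(c1_initial, c1_after_propane_thiol, c1_after_hexane_thiol):
    """Estimate (mono, di, poly) cross-link proportions from C1 constants
    measured before and after each chemical probe treatment, assuming C1
    is proportional to cross-link density and cleavage is complete."""
    poly = (c1_initial - c1_after_propane_thiol) / c1_initial
    di = (c1_after_propane_thiol - c1_after_hexane_thiol) / c1_initial
    mono = c1_after_hexane_thiol / c1_initial
    return mono, di, poly

print(crosslink_proportions(0.40, 0.28, 0.22))
```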

Relevance:

30.00%

Publisher:

Abstract:

Much research is currently centred on the detection of damage in structures using vibrational data. The work presented here examined several areas of interest in support of a practical technique for identifying and locating damage within bridge structures using apparent changes in their vibrational response to known excitation. The proposed goals of such a technique included the need for the measurement system to be operated on site by a minimum number of staff and for the procedure to be as non-invasive to the bridge traffic-flow as possible. Initially the research investigated changes in the vibrational bending characteristics of two series of large-scale model bridge-beams in the laboratory; these included ordinary-reinforced and post-tensioned, prestressed designs. Each beam was progressively damaged at predetermined positions and its vibrational response to impact excitation was analysed. For the load-regime utilised, the results suggested that the induced damage manifested itself as a function of the span of a beam rather than as a localised area. A power-law relating apparent damage to the applied loading and prestress levels was then proposed, together with a qualitative vibrational measure of structural damage. In parallel with the laboratory experiments, a series of tests was undertaken at the sites of a number of highway bridges. The bridges selected had differing types of construction and geometric design, including composite-concrete, concrete slab-and-beam, and concrete-slab with supporting steel-troughing constructions, together with regular-rectangular, skewed and heavily-skewed geometries. Initial investigations were made of the feasibility and reliability of various methods of structure excitation, including traffic and impulse methods. It was found that localised impact using a sledge-hammer was ideal for the purposes of this work and that a cartridge 'bolt-gun' could be used in some specific cases.

Relevance:

30.00%

Publisher:

Abstract:

Methods of dynamic modelling and analysis of structures, for example the finite element method, are well developed. However, it is generally agreed that accurate modelling of complex structures is difficult, and for critical applications it is necessary to validate or update the theoretical models using data measured from actual structures. Techniques for identifying the parameters of linear dynamic models from vibration test data have attracted considerable interest recently. However, no method has received general acceptance, owing to a number of difficulties, mainly: (i) the incomplete number of vibration modes that can be excited and measured; (ii) the incomplete number of coordinates that can be measured; (iii) inaccuracy in the experimental data; and (iv) inaccuracy in the model structure. This thesis reports on a new approach to updating the parameters of a finite element model, as well as a lumped parameter model with a diagonal mass matrix. The structure and its theoretical model are equally perturbed by adding mass or stiffness, and the incomplete set of eigen-data is measured. The parameters are then identified by iterative updating of the initial estimates, by sensitivity analysis, using eigenvalues or both eigenvalues and eigenvectors of the structure before and after perturbation. It is shown that, with a suitable choice of the perturbing coordinates, exact parameters can be identified if the data and the model structure are exact. The theoretical basis of the technique is presented. To cope with measurement errors and possible inaccuracies in the model structure, a well-known Bayesian approach is used to minimize the least-squares difference between the updated and the initial parameters. The eigen-data of the structure with added mass or stiffness can also be determined from the frequency response data of the unmodified structure by a structural modification technique; thus, mass or stiffness need not be added physically.
The mass-stiffness addition technique is demonstrated by simulation examples and laboratory experiments on beams and an H-frame.
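The identifiability claim, that exact parameters follow from eigen-data measured before and after a known perturbation, can be verified on a single-degree-of-freedom system: the eigenvalues ω1² = k/m and ω2² = k/(m + Δm), measured before and after adding a known mass Δm, determine both m and k in closed form. This toy case illustrates the principle, not the thesis's finite element procedure:

```python
def identify_from_mass_addition(w1_sq, w2_sq, dm):
    """Recover mass and stiffness of a 1-DOF system from eigenvalues
    measured before (w1_sq) and after (w2_sq) adding a known mass dm.
    Solves w1_sq = k/m and w2_sq = k/(m + dm) simultaneously."""
    m = dm * w2_sq / (w1_sq - w2_sq)
    k = m * w1_sq
    return m, k

# Simulated "measurements" from a system with known true parameters.
m_true, k_true, dm = 3.0, 1200.0, 0.5
w1_sq = k_true / m_true
w2_sq = k_true / (m_true + dm)
print(identify_from_mass_addition(w1_sq, w2_sq, dm))
```

With many degrees of freedom the relationship is no longer closed-form, which is why the thesis resorts to iterative sensitivity-based updating.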

Relevance:

30.00%

Publisher:

Abstract:

This thesis explores the interrelationships between the labour process, the development of technology and patterns of gender differentiation. The introduction of front office terminals into building society branches forms the focus of the research. Case studies were carried out in nine branches, three each from three building societies. Statistical data for the whole movement and a survey of ten of the top thirty societies provided the context for the studies. In the process of the research it became clear that it was not technology itself, but the way it was used, that was the main factor in determining outcomes. The introduction of new technologies is occurring at a rapid pace, facilitated by continuing high growth rates, although front office technology could seldom be cost-justified. There was great variety between societies in their operating philosophies and in their reasons for and approach to computerisation, but all societies foresaw an ultimate saving in staff. Computerisation has resulted in the deskilling of the cashiering role and increased control over work at all stages. Some branch managers experienced a decrease in autonomy and an increase in control over their work. Subsequent to this deskilling there has been a greatly increased use of part-time staff, which has enabled costs to be reduced. There has also been a polarisation between career and non-career staff which, like the use of part-time staff, has occurred along gender lines. There is considerable evidence that societies' policies, structures and managerial attitudes continue to discriminate, directly and indirectly, against women. It is these practices which confine women to lower grades and ensure their dependence on the family, and which create the pool of cheap skilled labour that societies so willingly exploit by increasing part-time work. Gender strategies enter management strategies throughout the operations of the organisation.

Relevance:

30.00%

Publisher:

Abstract:

Exploratory analysis of data seeks to find common patterns to gain insights into the structure and distribution of the data. In geochemistry it is a valuable means of gaining insight into the complicated processes making up a petroleum system. Typically, linear visualisation methods like principal components analysis, linked plots, or brushing are used. These methods cannot be employed directly when dealing with missing data, and they struggle to capture global non-linear structures in the data, although they can do so locally. This thesis discusses a complementary approach based on a non-linear probabilistic model. The generative topographic mapping (GTM) enables the visualisation of the effects of very many variables on a single plot, which is able to incorporate more structure than a two-dimensional principal components plot. The model can deal with uncertainty and missing data, and allows for the exploration of the non-linear structure in the data. In this thesis a novel approach to initialising the GTM with arbitrary projections is developed. This makes it possible to combine GTM with algorithms like Isomap and to fit complex non-linear structures like the Swiss roll. Another novel extension is the incorporation of prior knowledge about the structure of the covariance matrix. This extension greatly enhances the modelling capabilities of the algorithm, resulting in a better fit to the data and better imputation capabilities for missing data. Additionally, an extensive benchmark study of the missing-data imputation capabilities of GTM is performed. Further, a novel approach based on missing data is introduced to benchmark the fit of probabilistic visualisation algorithms on unlabelled data. Finally, the work is complemented by evaluating the algorithms on real-life datasets from geochemical projects.
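The missing-data benchmarking idea, hide known entries, impute them, and score against the hidden truth, applies to any imputer, GTM included. A sketch with a trivial column-mean baseline standing in for the model; the function names and data are invented for illustration:

```python
import random

def holdout_benchmark(data, impute, frac=0.2, seed=0):
    """Hide a fraction of entries, impute them, and return the mean
    absolute error against the hidden ground truth -- a fit benchmark
    that needs no labels."""
    rng = random.Random(seed)
    cells = [(i, j) for i, row in enumerate(data) for j in range(len(row))]
    hidden = rng.sample(cells, int(frac * len(cells)))
    masked = [row[:] for row in data]
    truth = {}
    for i, j in hidden:
        truth[(i, j)] = masked[i][j]
        masked[i][j] = None  # mark as missing
    filled = impute(masked)
    return sum(abs(filled[i][j] - truth[(i, j)]) for i, j in hidden) / len(hidden)

def column_mean_impute(rows):
    """Baseline imputer: replace each missing entry with its column mean."""
    ncols = len(rows[0])
    means = []
    for j in range(ncols):
        vals = [r[j] for r in rows if r[j] is not None]
        means.append(sum(vals) / len(vals))
    return [[means[j] if r[j] is None else r[j] for j in range(ncols)] for r in rows]

data = [[1.0, 10.0], [2.0, 20.0], [3.0, 30.0], [4.0, 40.0], [5.0, 50.0]]
print(holdout_benchmark(data, column_mean_impute))
```

A better probabilistic model such as GTM would plug in as `impute` and should achieve a lower error than the column-mean baseline on structured data.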