973 results for verifiable random function


Relevance:

30.00%

Publisher:

Abstract:

Expression of Thermus aquaticus (Taq) DNA polymerase I (pol I) in Escherichia coli complements the growth defect caused by a temperature-sensitive mutation in the host pol I. We replaced the nucleotide sequence encoding amino acids 659-671 of the O-helix of Taq DNA pol I, corresponding to the substrate binding site, with an oligonucleotide containing random nucleotides. Functional Taq pol I mutants were selected based on colony formation at the nonpermissive temperature. By using a library with 9% random substitutions at each of 39 positions, we identified 61 active Taq pol I mutants, each of which contained from one to four amino acid substitutions. Some amino acids, such as alanine-661 and threonine-664, were tolerant of several or even many diverse replacements. In contrast, no replacements or only conservative replacements were identified at arginine-659, lysine-663, and tyrosine-671. By using a library with totally random nucleotides at five different codons (arginine-659, arginine-660, lysine-663, phenylalanine-667, and glycine-668), we confirmed that arginine-659 and lysine-663 were immutable, and observed that only tyrosine substituted for phenylalanine-667. The two immutable residues and the two residues that tolerate only highly conservative replacements lie on the side of the O-helix facing the incoming deoxynucleoside triphosphate, as determined by x-ray analysis. Thus, we offer a new approach to assess concordance of the active conformation of an enzyme, as interpreted from the crystal structure, with the active conformation inferred from in vivo function.
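
As a rough illustration (not part of the original study), the doping scheme can be simulated: with 9% random substitutions at each of 39 nucleotide positions (13 codons), most clones carry one to four altered codons, matching the substitution counts reported for the recovered mutants. This sketch treats 9% as the per-position chance of a non-wild-type base; the actual doping chemistry may differ.

```python
import random

random.seed(0)

N_CODONS = 13          # residues 659-671 of the O-helix (39 nucleotides)
P_SUB = 0.09           # assumed per-nucleotide substitution rate of the library

def sample_clone():
    """Return the number of codons carrying at least one nucleotide change."""
    changed = 0
    for _ in range(N_CODONS):
        # a codon is altered if any of its 3 nucleotides is substituted
        if any(random.random() < P_SUB for _ in range(3)):
            changed += 1
    return changed

counts = [sample_clone() for _ in range(100_000)]
mean_changes = sum(counts) / len(counts)
frac_1_to_4 = sum(1 for c in counts if 1 <= c <= 4) / len(counts)

print(f"mean altered codons per clone: {mean_changes:.2f}")   # ~3.2
print(f"fraction of clones with 1-4 altered codons: {frac_1_to_4:.2f}")
```

The expected number of altered codons is 13 × (1 − 0.91³) ≈ 3.2, so the one-to-four range reported in the abstract covers the bulk of the library even before selection.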

Relevance:

30.00%

Publisher:

Abstract:

DNA repair alkyltransferases protect organisms against the cytotoxic, mutagenic, and carcinogenic effects of alkylating agents by transferring alkyl adducts from DNA to an active cysteine on the protein, thereby restoring the native DNA structure. We used random sequence substitutions to gain structure-function information about the human O6-methylguanine-DNA methyltransferase (EC 2.1.1.63), as well as to create active mutants. Twelve codons surrounding but not including the active cysteine were replaced by a random nucleotide sequence, and the resulting random library was selected for the ability to provide alkyltransferase-deficient Escherichia coli with resistance to the methylating agent N-methyl-N'-nitro-N-nitrosoguanidine. Few amino acid changes were tolerated in this evolutionarily conserved region of the protein. One mutation, a valine to phenylalanine change at codon 139 (V139F), was found in 70% of the selected mutants; in fact, this mutant was selected much more frequently than the wild type. V139F provided alkyltransferase-deficient bacteria with greater protection than the wild-type protein against both the cytotoxic and mutagenic effects of N-methyl-N'-nitro-N-nitrosoguanidine, increasing the D37 over 4-fold and reducing the mutagenesis rate 2.7-5.5-fold. This mutant human alkyltransferase, or others similarly created and selected, could be used to protect bone marrow cells from the cytotoxic side effects of alkylation-based chemotherapeutic regimens.

Relevance:

30.00%

Publisher:

Abstract:

The central structural feature of natural proteins is a tightly packed and highly ordered hydrophobic core. If some measure of exquisite, native-like core packing is necessary for enzymatic function, this would constitute a significant obstacle to the development of novel enzymes, either by design or by natural or experimental evolution. To test the minimum requirements for a core to provide sufficient structural integrity for enzymatic activity, we have produced mutants of the ribonuclease barnase in which 12 of the 13 core residues have together been randomly replaced by hydrophobic alternatives. Using a sensitive biological screen, we find that a strikingly high proportion of these mutants (23%) retain enzymatic activity in vivo. Further substitution at the 13th core position shows that a similar proportion of completely random hydrophobic cores supports enzyme function. Of the active mutants produced, several have no wild-type core residues. These results imply that hydrophobicity is nearly a sufficient criterion for the construction of a functional core and, in conjunction with previous studies, that refinement of a crudely functional core entails more stringent sequence constraints than does the initial attainment of crude core function. Since attainment of crude function is the critical initial step in evolutionary innovation, the relatively scant requirements contributed by the hydrophobic core would greatly reduce the initial hurdle on the evolutionary pathway to novel enzymes. Similarly, experimental development of novel functional proteins might be simplified by limiting core design to mere specification of hydrophobicity and using iterative mutation-selection to optimize core structure.

Relevance:

30.00%

Publisher:

Abstract:

The FANOVA (or “Sobol’-Hoeffding”) decomposition of multivariate functions has been used for high-dimensional model representation and global sensitivity analysis. When the objective function f has no simple analytic form and is costly to evaluate, computing FANOVA terms may be unaffordable due to numerical integration costs. Several approximate approaches relying on Gaussian random field (GRF) models have been proposed to alleviate these costs, where f is substituted by a (kriging) predictor or by conditional simulations. Here we focus on FANOVA decompositions of GRF sample paths, and we notably introduce an associated kernel decomposition into 4^d terms called KANOVA. An interpretation in terms of tensor product projections is obtained, and it is shown that projected kernels control both the sparsity of GRF sample paths and the dependence structure between FANOVA effects. Applications on simulated data show the relevance of the approach for designing new classes of covariance kernels dedicated to high-dimensional kriging.
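
A minimal numerical illustration of the Sobol’-Hoeffding (FANOVA) decomposition itself, applied to a toy polynomial on the unit square rather than a GRF sample path (the function and grid are invented): the terms are mutually orthogonal, so the variance splits exactly into main effects plus an interaction.

```python
import numpy as np

# f(x1, x2) = x1 + x2**2 + x1*x2 on [0,1]^2 with the uniform measure
n = 1000
x = (np.arange(n) + 0.5) / n               # midpoint quadrature grid
X1, X2 = np.meshgrid(x, x, indexing="ij")
F = X1 + X2**2 + X1 * X2

f0 = F.mean()                               # constant term
f1 = F.mean(axis=1) - f0                    # main effect of x1
f2 = F.mean(axis=0) - f0                    # main effect of x2
F12 = F - f0 - f1[:, None] - f2[None, :]    # interaction term

# variance decomposition: Var(f) = V1 + V2 + V12 (terms are orthogonal)
V_total = ((F - f0) ** 2).mean()
V1, V2 = (f1**2).mean(), (f2**2).mean()
V12 = (F12**2).mean()
print(V_total, V1 + V2 + V12)   # the two agree to quadrature precision
```

For this f the exact values are V1 = 0.1875, V2 ≈ 0.1931, V12 ≈ 0.0069; on a GRF model these integrals are exactly what becomes expensive and motivates the kernel-level (KANOVA) treatment.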

Relevance:

30.00%

Publisher:

Abstract:

It is unclear whether a random plasma cortisol measurement and the corticotropin (ACTH) test adequately reflect glucocorticoid secretory capacity in critical illness. This study aimed to determine whether these tests provide information representative of the 24 hour period. Plasma cortisol was measured hourly for 24 hours in 21 critically ill septic patients, followed by a corticotropin test with a 1 μg dose administered intravenously. Serum and urine were analysed for ACTH and free cortisol respectively. Marked hourly variability in plasma cortisol was evident (coefficient of variation 8-30%) with no demonstrable circadian rhythm. The individual mean plasma cortisol concentrations ranged from 286 ± 59 nmol/l to 796 ± 83 nmol/l. The 24 hour mean plasma cortisol was strongly correlated with both random plasma cortisol (r² = 0.9, P < 0.0001) and the cortisol response to corticotropin (r² = 0.72, P < 0.001). Only nine percent of patients increased their plasma cortisol by 250 nmol/l after corticotropin (euadrenal response). However, 35% of non-responders had spontaneous hourly rises > 250 nmol/l, thus highlighting the limitations of a single point corticotropin test. Urinary free cortisol was elevated (865 ± 937 nmol) in both corticotropin responders and non-responders, suggesting elevated plasma free cortisol. No significant relationship was demonstrable between plasma cortisol and ACTH. We conclude that although random cortisol measurements and the low dose corticotropin test reliably reflect the 24 hour mean cortisol in critical illness, they do not take into account the pulsatile nature of cortisol secretion. Consequently, there is the potential for erroneous conclusions about adrenal function based on a single measurement. We suggest that caution be exercised when drawing conclusions on the adequacy of adrenal function based on a single random plasma cortisol or the corticotropin test.

Relevance:

30.00%

Publisher:

Abstract:

Motivation: The clustering of gene profiles across some experimental conditions of interest contributes significantly to the elucidation of unknown gene function, the validation of gene discoveries and the interpretation of biological processes. However, this clustering problem is not straightforward as the profiles of the genes are not all independently distributed and the expression levels may have been obtained from an experimental design involving replicated arrays. Ignoring the dependence between the gene profiles and the structure of the replicated data can result in important sources of variability in the experiments being overlooked in the analysis, with the consequent possibility of misleading inferences being made. We propose a random-effects model that provides a unified approach to the clustering of genes with correlated expression levels measured in a wide variety of experimental situations. Our model is an extension of the normal mixture model to account for the correlations between the gene profiles and to enable covariate information to be incorporated into the clustering process. Hence the model is applicable to longitudinal studies with or without replication, for example, time-course experiments by using time as a covariate, and to cross-sectional experiments by using categorical covariates to represent the different experimental classes. Results: We show that our random-effects model can be fitted by maximum likelihood via the EM algorithm, for which the E (expectation) and M (maximization) steps can be implemented in closed form. Hence our model can be fitted deterministically without the need for time-consuming Monte Carlo approximations. The effectiveness of our model-based procedure for the clustering of correlated gene profiles is demonstrated on three real datasets, representing typical microarray experimental designs, covering time-course, repeated-measurement and cross-sectional data.
In these examples, relevant clusters of the genes are obtained, which are supported by existing gene-function annotation. A synthetic dataset is also considered.
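
The closed-form E- and M-steps can be illustrated on a toy one-dimensional, two-component normal mixture (a drastic simplification of the paper's random-effects model; the data and starting values below are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic "expression" data: two clusters of gene profiles collapsed to 1-D
data = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(4.0, 1.0, 300)])

# EM for a two-component normal mixture; both steps are in closed form
pi, mu, sigma = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(200):
    # E-step: posterior probability (responsibility) of component 2
    d1 = np.exp(-0.5 * ((data - mu[0]) / sigma[0]) ** 2) / sigma[0]
    d2 = np.exp(-0.5 * ((data - mu[1]) / sigma[1]) ** 2) / sigma[1]
    tau = pi * d2 / ((1 - pi) * d1 + pi * d2)
    # M-step: weighted maximum-likelihood updates
    pi = tau.mean()
    mu = np.array([np.average(data, weights=1 - tau),
                   np.average(data, weights=tau)])
    sigma = np.array([
        np.sqrt(np.average((data - mu[0]) ** 2, weights=1 - tau)),
        np.sqrt(np.average((data - mu[1]) ** 2, weights=tau)),
    ])

print(sorted(mu.round(2)))   # close to the true means 0 and 4
```

Because every update is deterministic given the data, no Monte Carlo approximation is needed, which is the computational point the abstract emphasizes.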

Relevance:

30.00%

Publisher:

Abstract:

This work deals with the random free vibration of functionally graded laminates with general boundary conditions and subjected to a temperature change, taking into account the randomness in a number of independent input variables such as Young's modulus, Poisson's ratio and thermal expansion coefficient of each constituent material. Based on third-order shear deformation theory, the mixed-type formulation and a semi-analytical approach are employed to derive the standard eigenvalue problem in terms of deflection, mid-plane rotations and stress function. A mean-centered first-order perturbation technique is adopted to obtain the second-order statistics of vibration frequencies. A detailed parametric study is conducted, and extensive numerical results are presented in both tabular and graphical forms for laminated plates that contain functionally graded material which is made of aluminum and zirconia, showing the effects of scattering in thermo-elastic material constants, temperature change, edge support condition, side-to-thickness ratio, and plate aspect ratio on the stochastic characteristics of natural frequencies. (c) 2005 Elsevier B.V. All rights reserved.
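
The mean-centered first-order perturbation idea can be sketched on a single-degree-of-freedom surrogate (not the laminate model of the paper; all numbers are illustrative): the frequency standard deviation is approximated by the sensitivity at the mean input times the input standard deviation, and checked against Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)

# single-DOF surrogate: omega = sqrt(k/m), with random stiffness k
m = 1.0
k_mean, k_cov = 100.0, 0.05                 # mean and coefficient of variation
k_sd = k_cov * k_mean

omega0 = np.sqrt(k_mean / m)                # zeroth-order (mean-centered) frequency
domega_dk = 1.0 / (2.0 * np.sqrt(k_mean * m))   # sensitivity at the mean
omega_sd_pert = abs(domega_dk) * k_sd       # first-order standard deviation

# Monte Carlo check of the perturbation estimate
k = rng.normal(k_mean, k_sd, 200_000)
omega = np.sqrt(k / m)
print(omega_sd_pert, omega.std())           # close for small input scatter
```

The agreement degrades as the input coefficient of variation grows, which is the usual caveat for first-order perturbation methods.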

Relevance:

30.00%

Publisher:

Abstract:

Computer models, or simulators, are widely used in a range of scientific fields to aid understanding of the processes involved and make predictions. Such simulators are often computationally demanding and are thus not amenable to statistical analysis. Emulators provide a statistical approximation, or surrogate, for the simulators, accounting for the additional approximation uncertainty. This thesis develops a novel sequential screening method to reduce the set of simulator variables considered during emulation. This screening method is shown to require fewer simulator evaluations than existing approaches. Utilising the lower dimensional active variable set simplifies subsequent emulation analysis. For random output, or stochastic, simulators the output dispersion, and thus variance, is typically a function of the inputs. This work extends the emulator framework to account for such heteroscedasticity by constructing two new heteroscedastic Gaussian process representations and proposes an experimental design technique to optimally learn the model parameters. The design criterion is an extension of Fisher information to heteroscedastic variance models. Replicated observations are efficiently handled in both the design and model inference stages. Through a series of simulation experiments on both synthetic and real world simulators, the emulators inferred on optimal designs with replicated observations are shown to outperform equivalent models inferred on space-filling replicate-free designs in terms of both model parameter uncertainty and predictive variance.
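
A minimal sketch of Gaussian process regression with heteroscedastic (input-dependent) noise, assuming for simplicity that the noise variance is known rather than learned as in the thesis; the kernel, length-scale and data are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf(a, b, ls=0.5, var=1.0):
    """Squared-exponential covariance between 1-D input arrays a and b."""
    return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

# training inputs; the noise standard deviation grows with x
x = np.linspace(0.0, 1.0, 40)
noise_sd = 0.05 + 0.5 * x                  # heteroscedastic, assumed known here
y = np.sin(2 * np.pi * x) + rng.normal(0.0, noise_sd)

# GP posterior with the input-dependent noise term on the diagonal
K = rbf(x, x) + np.diag(noise_sd**2)
xs = np.array([0.1, 0.9])                  # probe a quiet and a noisy region
Ks, Kss = rbf(xs, x), rbf(xs, xs)
alpha = np.linalg.solve(K, y)
mean = Ks @ alpha
cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
print(np.sqrt(np.diag(cov)))  # predictive sd is larger where the noise is larger
```

Replacing the constant noise diagonal with `noise_sd**2` is the entire mechanical change; the hard part addressed by the thesis is inferring that variance function and designing experiments (with replication) to learn it efficiently.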

Relevance:

30.00%

Publisher:

Abstract:

Background & aims It has been suggested that retinal lutein may improve visual acuity for images that are illuminated by white light. Our aim was to determine the effect of a lutein and antioxidant dietary supplement on visual function. Methods A prospective, 9- and 18-month, double-masked randomised controlled trial. For the 9-month trial, 46 healthy participants were randomised (using a random number generator) to placebo (n=25) or active (n=21) groups. Twenty-nine of these subjects went on to complete 18 months of supplementation, 15 from the placebo group, and 14 from the active group. The active group supplemented daily with 6 mg lutein combined with vitamins and minerals. Outcome measures were distance and near visual acuity, contrast sensitivity, and photostress recovery time. The study had 80% power at the 5% significance level for each outcome measure. Data were collected at baseline, 9, and 18 months. Results There were no statistically significant differences between groups for any of the outcome measures over 9 or 18 months. Conclusion There was no evidence of effect of 9 or 18 months of daily supplementation with a lutein-based nutritional supplement on visual function in this group of people with healthy eyes. ISRCTN78467674.

Relevance:

30.00%

Publisher:

Abstract:

Magnetoencephalography (MEG) is a non-invasive brain imaging technique with the potential for very high temporal and spatial resolution of neuronal activity. The main stumbling block for the technique has been that the estimation of a neuronal current distribution, based on sensor data outside the head, is an inverse problem with an infinity of possible solutions. Many inversion techniques exist, all using different a priori assumptions in order to reduce the number of possible solutions. Although all techniques can be thoroughly tested in simulation, implicit in the simulations are the experimenter's own assumptions about realistic brain function. To date, the only way to test the validity of inversions based on real MEG data has been through direct surgical validation, or through comparison with invasive primate data. In this work, we constructed a null hypothesis that the reconstruction of neuronal activity contains no information on the distribution of the cortical grey matter. To test this, we repeatedly compared rotated sections of grey matter with a beamformer estimate of neuronal activity to generate a distribution of mutual information values. The significance of the comparison between the un-rotated anatomical information and the electrical estimate was subsequently assessed against this distribution. We found that there was significant (P < 0.05) anatomical information contained in the beamformer images across a number of frequency bands. Based on the limited data presented here, we can say that the assumptions behind the beamformer algorithm are not unreasonable for the visual-motor task investigated.
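
The rotation-based null distribution can be caricatured in a few lines: compute the mutual information between a synthetic "anatomical" map and a correlated "functional" map, then compare it against rotated copies of the anatomy. This uses only 90° rotations of invented data, not grey-matter sections against beamformer images as in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

def mutual_info(a, b, bins=8):
    """Histogram-based mutual information (in nats) between two 2-D maps."""
    p, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = p / p.sum()
    px, py = p.sum(axis=1), p.sum(axis=0)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / np.outer(px, py)[nz]))

# "anatomical" map and a "functional" estimate correlated with it plus noise
anatomy = rng.normal(size=(64, 64))
anatomy = anatomy + np.roll(anatomy, 1, axis=0)      # add spatial structure
functional = anatomy + 0.5 * rng.normal(size=(64, 64))

mi_obs = mutual_info(anatomy, functional)
# null distribution: MI between the functional map and rotated anatomy
null = [mutual_info(np.rot90(anatomy, k), functional) for k in (1, 2, 3)]
print(mi_obs, max(null))   # observed MI should exceed every rotated null
```

The study's version rotates over many angles to build a proper null distribution and then reads off a P value for the un-rotated comparison.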

Relevance:

30.00%

Publisher:

Abstract:

Properties of computing Boolean circuits composed of noisy logical gates are studied using the statistical physics methodology. A formula-growth model that gives rise to random Boolean functions is mapped onto a spin system, which facilitates the study of their typical behavior in the presence of noise. Bounds on their performance, derived in the information theory literature for specific gates, are straightforwardly retrieved, generalized and identified as the corresponding macroscopic phase transitions. The framework is employed for deriving results on error-rates at various function-depths and function sensitivity, and their dependence on the gate-type and noise model used. These are difficult to obtain via the traditional methods used in this field.
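
A brute-force analogue of the setting (not the statistical-physics mapping used in the paper): simulate a complete binary tree of NAND gates whose outputs flip with probability ε, and measure the output error rate against the noiseless formula as the depth grows. Gate type, noise level and trial counts are illustrative choices.

```python
import random

random.seed(4)

def eval_tree(depth, inputs, eps, noisy):
    """Evaluate a complete binary NAND tree; flip each gate output w.p. eps."""
    if depth == 0:
        return inputs.pop()
    a = eval_tree(depth - 1, inputs, eps, noisy)
    b = eval_tree(depth - 1, inputs, eps, noisy)
    out = 1 - (a & b)                       # NAND
    if noisy and random.random() < eps:
        out ^= 1                            # gate noise: flip the output bit
    return out

def error_rate(depth, eps, trials=2000):
    errs = 0
    for _ in range(trials):
        leaves = [random.randint(0, 1) for _ in range(2**depth)]
        clean = eval_tree(depth, list(leaves), eps, noisy=False)
        corrupt = eval_tree(depth, list(leaves), eps, noisy=True)
        errs += clean != corrupt
    return errs / trials

for d in (1, 3, 5):
    print(d, error_rate(d, eps=0.05))
```

At depth 1 the error rate is just ε; deeper formulae accumulate error toward a noise-dependent plateau, which is the kind of depth dependence the spin-system mapping characterizes analytically.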

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE. To investigate objectively and noninvasively the role of cognitive demand on autonomic control of systemic cardiovascular and ocular accommodative responses in emmetropes and myopes of late-onset. METHODS. Sixteen subjects (10 men, 6 women) aged between 18 and 34 years (mean ± SD: 22.6 ± 4.4 years), eight emmetropes (EMMs; mean spherical equivalent [MSE] refractive error ± SD: 0.05 ± 0.24 D) and eight with late-onset myopia (LOMs; MSE ± SD: -3.66 ± 2.31 D) participated in the study. Subjects viewed stationary numerical digits monocularly within a Badal optical system (at both 0.0 and -3.0 D) while performing a two-alternative, forced-choice paradigm that matched cognitive loading across subjects. Five individually matched cognitive levels of increasing difficulty were used in random order for each subject. Five 20-second, continuous-objective recordings of the accommodative response measured with an open-view infrared autorefractor were obtained for each cognitive level, whereas simultaneous measurement of heart rate was continuously recorded with a finger-mounted piezoelectric pulse transducer for 5 minutes. Fast Fourier transformation of cardiovascular function allowed the relative power of the autonomic components to be assessed in the frequency domain, whereas heart period gave an indication of the time-domain response. RESULTS. Increasing the cognitive demand led to a significant reduction in the accommodative response in all subjects (0.0 D: by -0.35 ± 0.33 D; -3.0 D: by -0.31 ± 0.40 D, P < 0.001). The greater lag of LOMs compared with EMMs was not significant (P = 0.07) at both distance (0.38 ± 0.35 D) and near (0.14 ± 0.42 D). Mean heart period reduced with increasing levels of workload (P < 0.0005). LOMs exhibited a relative elevation in sympathetic system activity compared to EMMs. 
Within refractive groups, however, accommodative shifts with increasing cognition correlated with parasympathetic activity (r = 0.99, P < 0.001), more than with sympathetic activity (r = 0.62, P > 0.05). CONCLUSIONS. In an equivalent workload paradigm, increasing cognitive demand caused a reduction in accommodative response that was attributable principally to a concurrent reduction in the relative power of the parasympathetic component of the autonomic nervous system (ANS). The disparity in accommodative response between EMMs and LOMs, however, appears to be augmented by changes in the sympathetic nervous component of the systemic ANS. Copyright © Association for Research in Vision and Ophthalmology.
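
The frequency-domain step described above can be sketched on a synthetic heart-period series; the band edges follow the conventional LF/HF ranges, but the signal, sampling rate and amplitudes are invented:

```python
import numpy as np

fs = 4.0                                   # resampled heart-period signal, Hz
t = np.arange(0, 300, 1 / fs)              # 5-minute recording
# synthetic heart-period series with LF (0.1 Hz) and HF (0.25 Hz) oscillations
rr = 0.8 + 0.02 * np.sin(2 * np.pi * 0.1 * t) + 0.04 * np.sin(2 * np.pi * 0.25 * t)

spec = np.abs(np.fft.rfft(rr - rr.mean())) ** 2
freqs = np.fft.rfftfreq(rr.size, d=1 / fs)

lf = spec[(freqs >= 0.04) & (freqs < 0.15)].sum()   # sympathetic + vagal band
hf = spec[(freqs >= 0.15) & (freqs <= 0.40)].sum()  # parasympathetic (vagal) band
print(lf / hf)   # here the HF component dominates, so the ratio is below 1
```

A relative rise in the LF/HF ratio under cognitive load would be read as the sympathetic shift described for the late-onset myopes.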

Relevance:

30.00%

Publisher:

Abstract:

Random Boolean formulae, generated by a growth process of noisy logical gates are analyzed using the generating functional methodology of statistical physics. We study the type of functions generated for different input distributions, their robustness for a given level of gate error and its dependence on the formulae depth and complexity and the gates used. Bounds on their performance, derived in the information theory literature for specific gates, are straightforwardly retrieved, generalized and identified as the corresponding typical-case phase transitions. Results for error-rates, function-depth and sensitivity of the generated functions are obtained for various gate-type and noise models. © 2010 IOP Publishing Ltd.

Relevance:

30.00%

Publisher:

Abstract:

Background Pharmacy has experienced both incomplete professionalization and deprofessionalization. Since the late 1970s, a concerted attempt has been made to re-professionalize pharmacy in the United Kingdom (UK) through role extension—a key feature of which has been a drive for greater pharmacy involvement in public health. However, the continual corporatization of the UK community pharmacy sector may reduce the professional autonomy of pharmacists and may threaten to constrain attempts at reprofessionalization. Objectives The objectives of the research: to examine the public health activities of community pharmacists in the UK; to explore the attitudes of community pharmacists toward recent relevant UK policy and barriers to the development of their public health function; and, to investigate associations between activity, attitudes, and the type of community pharmacy worked in (eg, supermarket, chain, independent). Methods A self-completion postal questionnaire was sent to a random sample of practicing community pharmacists, stratified for country and sex, within Great Britain (n = 1998), with a follow-up to nonresponders 4 weeks later. Data were analyzed using SPSS (SPSS Inc., Chicago, IL, USA) (v12.0). A final response rate of 51% (n = 1023/1998) was achieved. Results The level of provision of emergency hormonal contraception on a patient group direction, supervised administration of medicines, and needle-exchange schemes was lower in supermarket pharmacies than in the other types of pharmacy. Respondents believed that supermarkets and the major multiple pharmacy chains held an advantageous position in terms of attracting financing for service development despite suggesting that the premises of such pharmacies may not be the most suitable for the provision of such services. Conclusions A mixed market in community pharmacy may be required to maintain a comprehensive range of pharmacy-based public health services and provide maximum benefit to all patients. 
Longitudinal monitoring is recommended to ensure that service provision is adequate across the pharmacy network.

Relevance:

30.00%

Publisher:

Abstract:

This thesis studied the effect of (i) the number of grating components and (ii) parameter randomisation on root-mean-square (r.m.s.) contrast sensitivity and spatial integration. The effectiveness of spatial integration without external spatial noise depended on the number of equally spaced orientation components in the sum of gratings. The critical area marking the saturation of spatial integration was found to decrease when the number of components increased from 1 to 5-6 but increased again at 8-16 components. The critical area behaved similarly as a function of the number of grating components when stimuli consisted of 3, 6 or 16 components with different orientations and/or phases embedded in spatial noise. Spatial integration seemed to depend on the global Fourier structure of the stimulus. Spatial integration was similar for sums of two vertical cosine or sine gratings with various Michelson contrasts in noise. The critical area for a grating sum was found to be a sum of logarithmic critical areas for the component gratings weighted by their relative Michelson contrasts. The human visual system was modelled as a simple image processor where the visual stimulus is first low-pass filtered by the optical modulation transfer function of the human eye and secondly high-pass filtered, up to the spatial cut-off frequency determined by the lowest neural sampling density, by the neural modulation transfer function of the visual pathways. The internal noise is then added before signal interpretation occurs in the brain. The detection is mediated by a local spatially windowed matched filter. The model was extended to include complex stimuli and its applicability to the data was found to be successful. The shape of the spatial integration function was similar for non-randomised and randomised simple and complex gratings. However, orientation and/or phase randomisation reduced r.m.s. contrast sensitivity by a factor of 2.
The effect of parameter randomisation on spatial integration was modelled under the assumption that human observers change the observer strategy from cross-correlation (i.e., a matched filter) to auto-correlation detection when uncertainty is introduced to the task. The model described the data accurately.
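
A one-dimensional caricature of the cross-correlation (matched-filter) observer: the decision variable is the inner product of the stimulus with a unit-norm template of the known grating, giving a detectability d′ set by the signal energy. All stimulus parameters here are illustrative, and no spatial windowing is included.

```python
import numpy as np

rng = np.random.default_rng(5)

n = 64
xx = np.arange(n) / n
template = np.sin(2 * np.pi * 4 * xx)      # known grating profile
template /= np.linalg.norm(template)       # unit-norm matched-filter template

def trial(signal_present, contrast=0.5, noise_sd=1.0):
    stim = noise_sd * rng.normal(size=n)
    if signal_present:
        stim = stim + contrast * template * np.sqrt(n)
    # cross-correlation (matched-filter) decision variable
    return template @ stim

# distribution of the decision variable with and without the signal
present = np.array([trial(True) for _ in range(4000)])
absent = np.array([trial(False) for _ in range(4000)])
d_prime = (present.mean() - absent.mean()) / absent.std()
print(d_prime)   # detectability for the ideal cross-correlation observer
```

Under parameter randomisation the observer no longer knows the template, so cross-correlation is unavailable; switching the decision variable to an auto-correlation-style statistic (e.g., stimulus energy) lowers sensitivity, which is the strategy change the model invokes to explain the factor-of-2 loss.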