936 results for germs of holomorphic generalized functions


Relevance:

100.00%

Publisher:

Abstract:

Eukaryotic DNA m5C methyltransferases (MTases) play a major role in many epigenetic regulatory processes such as genomic imprinting, X-chromosome inactivation, silencing of transposons and regulation of gene expression. Members of the two DNA m5C MTase families, Dnmt1 and Dnmt3, are relatively well studied, and many details of their biological functions, biochemical properties and interaction partners are known. In contrast, the biological functions of the highly conserved Dnmt2 family, which appears to have a non-canonical dual substrate specificity, remain enigmatic despite the efforts of many researchers. The genome of the social amoeba Dictyostelium encodes a Dnmt2 homolog, DnmA, as its only DNA m5C MTase, which allowed us to study Dnmt2 function in this organism without interference from other enzymes. The dnmA gene can be easily disrupted, but the knockout clones did not show obvious phenotypes under normal lab conditions, suggesting that DnmA function is not vital for the organism. The dnmA gene has a low expression profile during vegetative growth and is only 5-fold upregulated during development. Fluorescence microscopy indicated that DnmA-GFP fusions were distributed between the nucleus and the cytoplasm, with some enrichment in nuclei. Interestingly, the experiments revealed specific dynamics of DnmA-GFP distribution during the cell cycle: the proteins colocalized with DNA during interphase and were largely excluded from nuclei during mitosis. DnmA functions as an active DNA m5C MTase in vivo and is responsible for weak but detectable DNA methylation of several regions in the Dictyostelium genome. Nevertheless, gel retardation assays showed only slightly higher affinity of the enzyme for dsDNA than for ssDNA and no specificity for particular sequence contexts, apart from a weak but detectable preference for AT-rich sequences, possibly due to the intrinsic curvature of such sequences.
Furthermore, DnmA did not form denaturant-resistant covalent complexes with dsDNA in vitro, although it could form covalent adducts with ssDNA. The low binding and methyltransfer activity in vitro suggest that an additional factor is required for DnmA function. However, no candidates could be identified in affinity purification experiments with different tagged DnmA fusions. In this respect, it should be noted that tagged DnmA fusion preparations from Dictyostelium showed somewhat higher activity in both covalent adduct formation and methylation assays than DnmA expressed in E. coli; thus, the presence of co-purified factors cannot be excluded. The low efficiency of complex formation by the recombinant enzyme and the failure to identify interacting proteins that could be required for DNA methylation in vivo led to the hypothesis that post-translational modifications influence target recognition and enzymatic activity. Indeed, sites of phosphorylation, methylation and acetylation were identified within the target recognition domain (TRD) of DnmA by mass spectrometry. For phosphorylation, the combination of MS data and bioinformatic analysis revealed that some of the sites could well be targets for specific kinases in vivo. Preliminary 3D modeling of the DnmA protein, based on homology with hDNMT2, showed that several of the identified phosphorylation sites are located on the surface of the molecule, where they would be accessible to kinases. The presence of modifications almost exclusively within the TRD could potentially modulate the mode of interaction with the target nucleic acids. DnmA was able to form denaturant-resistant covalent intermediates with several Dictyostelium tRNAs, targeting C38 in the anticodon loop. Complex formation did not always correlate with the methylation assay data and appeared to depend on both the sequence and the structure of the tRNA substrate.
The pattern previously suggested by the Helm group for optimal methyltransferase activity of hDNMT2 appeared to contribute significantly to the formation of covalent adducts, but it was not the only substrate feature required for DnmA and hDNMT2 function. Both enzymes required Mg2+ to form covalent complexes, indicating that the specific structure of the target tRNA is indispensable. The dynamics of covalent adduct accumulation by DnmA differed among tRNAs; interestingly, for a given tRNA the accumulation profiles were somewhat similar for the DnmA and hDNMT2 enzymes. According to the proposed catalytic mechanism for DNA m5C MTases, the observed denaturant-resistant complexes correspond to covalent enamine intermediates. The apparent discrepancies between the covalent complex formation and methylation assay data may reflect alternative pathways of the catalytic mechanism, leading not to methylation but to exchange or demethylation reactions; the reversibility of enamine intermediate formation should also be considered. Curiously, native gel retardation assays showed little or no difference in the binding affinity of DnmA for different RNA substrates, and thus an absence of specificity in initial enzyme binding. The biological significance of tRNA methylation, as well as the identification of novel RNA substrates in vivo, should be the aim of further experiments.

Relevance:

100.00%

Publisher:

Abstract:

This paper describes a trainable system capable of tracking faces and facial features such as eyes and nostrils, and of estimating basic mouth features such as degree of openness and smile, in real time. In developing this system, we have addressed the twin issues of image representation and algorithms for learning. We have used the invariance properties of image representations based on Haar wavelets to robustly capture various facial features. Unlike previous approaches, this system is entirely trained from examples and does not rely on a priori (hand-crafted) models of facial features based on optical flow or facial musculature. The system works in several stages that begin with face detection, followed by localization of facial features and estimation of mouth parameters. Each of these stages is formulated as a problem in supervised learning from examples. We apply the new and robust technique of support vector machines (SVMs) for classification in the skin segmentation, face detection and eye detection stages. Estimation of mouth parameters is modeled as a regression from a sparse subset of coefficients (basis functions) of an overcomplete dictionary of Haar wavelets.
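The SVM classification stage can be illustrated with a minimal sketch: a linear SVM trained by subgradient descent on the regularized hinge loss, applied to toy pixel features. Everything here is an assumption for illustration (the two-dimensional chromaticity features, the invented data, the hyperparameters); the paper's actual system trains SVMs on Haar-wavelet representations.

```python
def train_linear_svm(xs, ys, lam=0.01, lr=0.02, epochs=5000):
    """Minimal linear SVM: subgradient descent on hinge loss + L2 penalty.
    xs: list of feature vectors, ys: labels in {-1, +1}."""
    d = len(xs[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1.0:
                # hinge active: step along y*x - lam*w
                w = [wi + lr * (y * xi - lam * wi) for wi, xi in zip(w, x)]
                b += lr * y
            else:
                # only the regularizer contributes
                w = [wi - lr * lam * wi for wi in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0.0 else -1

# Toy "skin" vs "non-skin" pixels in a normalized (r, g) chromaticity space;
# both the feature choice and the data are invented for this sketch.
skin = [(0.55, 0.30), (0.60, 0.28), (0.58, 0.31)]
non_skin = [(0.33, 0.33), (0.30, 0.40), (0.25, 0.35)]
xs = skin + non_skin
ys = [1, 1, 1, -1, -1, -1]
w, b = train_linear_svm(xs, ys)
```

In the full system such a classifier would be one stage of a cascade, with the face and eye detectors trained the same way on their own feature sets.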

Relevance:

100.00%

Publisher:

Abstract:

A new method for the automated selection of colour features is described. The algorithm consists of two stages of processing. In the first, a complete set of colour features is calculated for every object of interest in an image. In the second stage, each object is mapped into several n-dimensional feature spaces in order to select the smallest feature set able to discriminate the remaining objects. The discrimination power of each candidate subset of features is evaluated by means of decision trees composed of linear discriminant functions. This method can provide valuable help in outdoor scene analysis, where no colour space has been demonstrated to be the most suitable. Experimental results on recognizing objects in outdoor scenes are reported.
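The second stage can be sketched as a search over feature subsets by increasing size, accepting the smallest subset that still discriminates every object. A nearest-centroid rule stands in here for the paper's decision trees of linear discriminant functions, and the colour features and objects are invented for the example.

```python
from itertools import combinations

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def discriminates(objects, labels, subset):
    """True if nearest-centroid classification restricted to `subset`
    assigns every object to its own class."""
    classes = sorted(set(labels))
    cents = {c: centroid([o for o, l in zip(objects, labels) if l == c])
             for c in classes}
    proj = lambda v: [v[i] for i in subset]
    for o, l in zip(objects, labels):
        d = {c: sum((a - b) ** 2 for a, b in zip(proj(o), proj(cents[c])))
             for c in classes}
        if min(d, key=d.get) != l:
            return False
    return True

def smallest_discriminating_subset(objects, labels):
    """Enumerate subsets by size; return the first (smallest) that works."""
    n_feat = len(objects[0])
    for k in range(1, n_feat + 1):
        for subset in combinations(range(n_feat), k):
            if discriminates(objects, labels, subset):
                return subset
    return None

# Hypothetical colour features per object: (mean R, mean G, mean B, saturation)
objects = [(200, 40, 40, 0.8), (190, 50, 45, 0.7),   # class "red"
           (40, 190, 60, 0.7), (50, 200, 55, 0.8)]   # class "green"
labels = ["red", "red", "green", "green"]
best = smallest_discriminating_subset(objects, labels)
```

On this toy data a single feature (mean R, index 0) already separates the classes, so the search stops at a one-element subset.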

Relevance:

100.00%

Publisher:

Abstract:

The State-building process must be understood through the study of the agencies in charge of each of its regulatory functions. One such function is the regulation of property rights. During the Liberal Republic, as a reaction to massive mobilization, new tools to better regulate property rights were promoted: colonization, parceling, the award of public lands and, finally, a new legal framework. In spite of these purposes, the agencies faced, and failed to solve, the challenges every organization experiences when growing: resource scarcity, controlling its agents, and maintaining technical simplicity.

Relevance:

100.00%

Publisher:

Abstract:

Lately, the study of prefrontal executive functions in grade-school children has noticeably increased. The aim of this study is to investigate the influence of age and socioeconomic status (SES) on executive task performance and to analyze which socioeconomic variables predict better execution. A sample of 254 children aged between 7 and 12 years from the city of Santa Fe, Argentina, and belonging to different socioeconomic strata was tested. A battery of executive tasks sensitive to prefrontal function was used to obtain the results. These indicate a significant influence of age and SES on executive functions. The cognitive patterns follow different paths according to the effects of development and SES. Moreover, a pattern of low cognitive functioning in low-SES children was revealed across all executive functions. Finally, among the variables included in this study, only the educational level of the mother and the housing conditions were found to be associated with the children's executive function. The results are discussed in terms of the influence of cerebral maturation and environmental variables on executive functioning.

Relevance:

100.00%

Publisher:

Abstract:

This paper implements a methodology for including higher-order moments in portfolio selection, making use of the Generalized Hyperbolic Distribution, and then carries out a comparative analysis against the Markowitz model.
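The Markowitz baseline against which the comparison is made has a closed form in the two-asset case; the sketch below computes the minimum-variance weights. The variance and covariance figures are illustrative, not taken from the paper (which works with the Generalized Hyperbolic Distribution and higher moments).

```python
def min_variance_weights(var1, var2, cov12):
    """Closed-form minimum-variance weights for a two-asset Markowitz portfolio:
    w1 = (var2 - cov12) / (var1 + var2 - 2*cov12)."""
    denom = var1 + var2 - 2.0 * cov12
    w1 = (var2 - cov12) / denom
    return w1, 1.0 - w1

def portfolio_variance(w1, var1, var2, cov12):
    """Variance of the portfolio w1 * asset1 + (1 - w1) * asset2."""
    w2 = 1.0 - w1
    return w1 ** 2 * var1 + w2 ** 2 * var2 + 2.0 * w1 * w2 * cov12

# Illustrative annualized figures (not from the paper):
# volatilities of 20% and 30%, correlation about 0.167.
var1, var2, cov12 = 0.04, 0.09, 0.01
w1, w2 = min_variance_weights(var1, var2, cov12)
```

The paper's point is precisely that this mean-variance summary ignores skewness and kurtosis, which the Generalized Hyperbolic Distribution captures.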

Relevance:

100.00%

Publisher:

Abstract:

A new dynamic model of water quality, Q(2), has recently been developed, capable of simulating large branched river systems. This paper describes the application of a generalized sensitivity analysis (GSA) to Q(2) for single reaches of the River Thames in southern England. Focusing on the simulation of dissolved oxygen (DO), since this may be regarded as a proxy for the overall health of a river, the GSA is used to identify key parameters controlling model behavior and to provide a probabilistic procedure for model calibration. It is shown that, in the River Thames at least, once approximate parameter values have been estimated it is more important to obtain high-quality forcing functions than improved parameter estimates. Furthermore, there is a need to ensure reasonable simulation of a range of water quality determinands, since a focus on DO alone increases predictive uncertainty in the DO simulations. The Q(2) model has been applied here to the River Thames, but it has broad utility for evaluating other systems in Europe and around the world.
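The flavour of a generalized (regional) sensitivity analysis can be sketched as follows: sample parameter sets, classify each run as behavioral or not against a DO criterion, and rank parameters by how strongly their behavioral and non-behavioral distributions separate (a Kolmogorov-Smirnov-type statistic). The toy steady-state DO model, the parameter ranges and the acceptance band below are all assumptions for illustration, not the Q(2) equations.

```python
import bisect
import random

def simulate_do(kd, kr, load=8.0, do_sat=10.0):
    """Toy steady-state DO balance: saturation minus a decay/reaeration term.
    A stand-in for the Q(2) reach model, invented for this sketch."""
    return do_sat - kd * load / kr

def ks_statistic(a, b):
    """Maximum vertical distance between the empirical CDFs of a and b."""
    sa, sb = sorted(a), sorted(b)
    return max(abs(bisect.bisect_right(sa, x) / len(sa) -
                   bisect.bisect_right(sb, x) / len(sb)) for x in sa + sb)

random.seed(42)
names = ["kd", "kr", "dummy"]
behavioral = {n: [] for n in names}
non_behavioral = {n: [] for n in names}
for _ in range(2000):
    p = {"kd": random.uniform(0.1, 1.0),     # BOD decay rate
         "kr": random.uniform(0.5, 2.0),     # reaeration rate
         "dummy": random.uniform(0.0, 1.0)}  # deliberately inert parameter
    do = simulate_do(p["kd"], p["kr"])
    bucket = behavioral if 6.0 <= do <= 9.0 else non_behavioral
    for n in names:
        bucket[n].append(p[n])

# Parameters whose two distributions separate most strongly are the
# "key parameters controlling model behavior".
sensitivity = {n: ks_statistic(behavioral[n], non_behavioral[n]) for n in names}
```

The inert `dummy` parameter gives a baseline separation level; anything well above it is flagged as influential, and the behavioral parameter sets double as a calibration ensemble.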

Relevance:

100.00%

Publisher:

Abstract:

In this article, we use the no-response test idea, introduced in Luke and Potthast (2003) and Potthast (Preprint) for the inverse obstacle problem, to identify the interface of discontinuity of the coefficient γ of the operator ∇·γ(x)∇ + c(x), with piecewise regular γ and bounded function c(x). We use infinitely many Cauchy data as measurements and give a reconstructive method to localize the interface. We base this multiwave version of the no-response test on two different proofs. The first contains a pointwise estimate as used by the singular sources method. The second is built on an energy (or integral) estimate, which is the basis of the probe method. As a consequence, the probe and singular sources methods are equivalent with regard to their convergence, and the no-response test can be seen as a unified framework for these methods. As a further contribution, we provide a formula to reconstruct the values of the jump of γ(x) at the boundary, for x ∈ ∂D. A second consequence of this formula is that the blow-up rate of the indicator functions of the probe and singular sources methods at the interface is given by the order of the singularity of the fundamental solution.

Relevance:

100.00%

Publisher:

Abstract:

We apply a new X-ray scattering approach to the study of melt-spun filaments of tri-block and random terpolymers prepared from lactide, caprolactone and glycolide. Both terpolymers contain random sequences; in both cases the overall fraction of lactide units is approximately 0.7, and ¹³C and ¹H NMR show the lactide sequence length to be approximately 9-10. A novel representation of the X-ray fibre pattern as a series of spherical harmonic functions considerably facilitates the comparison of the scattering from the minority crystalline phase with that of hot-drawn fibres prepared from the poly(L-lactide) homopolymer. Although the fibres exhibit rather disordered structures, we show that the crystal structure is equivalent to that displayed by poly(L-lactide) for both the block and random terpolymers. There are variations in the development of a two-phase structure which reflect the differences in the chain architectures. There is evidence that the random terpolymer incorporates non-lactide units into the crystal interfaces to achieve a well-defined two-phase structure.

Relevance:

100.00%

Publisher:

Abstract:

How do organizations previously dominated by the state develop dynamic capabilities that would support their growth in a competitive market economy? We develop a theoretical framework of organizational transformation that explains the processes by which organizations learn and develop dynamic capabilities in transition economies. Specifically, the framework theorizes about the importance of, and inter-relationships between, leadership, organizational learning, dynamic capabilities, and performance over three stages of transformation. Propositions derived from this framework explain the pre-conditions enabling organizational learning, the linkages between types of learning and functions of dynamic capabilities, and the feedback from dynamic capabilities to organizational learning that allows firms in transition economies to regain their footing and build long-term competitive advantage. We focus on transition contexts, where these processes have been magnified and thus offer new insights into strategizing in radically altered environments.

Relevance:

100.00%

Publisher:

Abstract:

We use proper orthogonal decomposition (POD) to study a transient teleconnection event at the onset of the 2001 planet-encircling dust storm on Mars, in terms of empirical orthogonal functions (EOFs). There are several differences between this and previous studies of atmospheric events using EOFs. First, instead of using a single variable such as surface pressure or geopotential height on a given pressure surface, we use a dataset describing the evolution in time of global and fully three-dimensional atmospheric fields such as horizontal velocity and temperature. These fields are produced by assimilating Thermal Emission Spectrometer observations from NASA's Mars Global Surveyor spacecraft into a Mars general circulation model. We use total atmospheric energy (TE) as a physically meaningful quantity which weights the state variables. Second, instead of adopting the EOFs to define teleconnection patterns as planetary-scale correlations that explain a large portion of long time-scale variability, we use EOFs to understand transient processes due to localised heating perturbations that have implications for the atmospheric circulation over distant regions. The localised perturbation is given by anomalous heating due to the enhanced presence of dust around the northern edge of the Hellas Planitia basin on Mars. We show that the localised disturbance is seemingly restricted to a small number (a few tens) of EOFs. These can be classified as low-order, transitional, or high-order EOFs according to the TE amount they explain throughout the event. Despite the global character of the EOFs, they show the capability of accounting for the localised effects of the perturbation via the presence of specific centres of action. We finally discuss possible applications for the study of terrestrial phenomena with similar characteristics.
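The EOF machinery itself can be sketched minimally: compute the leading EOF of a small synthetic (time × space) anomaly field by power iteration on its spatial covariance matrix. The one-mode synthetic field below is purely illustrative; the paper's EOFs come from assimilated, fully three-dimensional Mars fields weighted by total atmospheric energy.

```python
import math
import random

def leading_eof(data, iters=500):
    """Leading EOF of a (time x space) matrix: remove the time mean,
    form the spatial covariance, and run power iteration."""
    nt, ns = len(data), len(data[0])
    means = [sum(row[j] for row in data) / nt for j in range(ns)]
    anom = [[row[j] - means[j] for j in range(ns)] for row in data]
    cov = [[sum(anom[t][i] * anom[t][j] for t in range(nt)) / (nt - 1)
            for j in range(ns)] for i in range(ns)]
    v = [1.0] * ns
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(ns)) for i in range(ns)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    lam = sum(v[i] * sum(cov[i][j] * v[j] for j in range(ns)) for i in range(ns))
    total = sum(cov[i][i] for i in range(ns))
    return v, lam / total   # EOF pattern, fraction of variance explained

# Synthetic field: one dominant spatial mode plus weak noise (illustrative only).
random.seed(1)
pattern = [0.6, 0.8, 0.0]                      # unit-norm spatial mode
data = []
for t in range(200):
    a = math.sin(0.3 * t)                      # mode amplitude ("PC" time series)
    data.append([a * p + 0.05 * random.gauss(0.0, 1.0) for p in pattern])
eof1, frac = leading_eof(data)
```

Because the field was built from a single mode, the leading EOF recovers the imposed pattern and explains nearly all the variance; in the paper the interesting structure lies instead in how a localised perturbation projects onto a few tens of such modes.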

Relevance:

100.00%

Publisher:

Abstract:

Recent studies into price transmission have recognized the important role played by transport and transaction costs. Threshold models are one approach to accommodate such costs. We develop a generalized Threshold Error Correction Model to test for the presence and form of threshold behavior in price transmission that is symmetric around equilibrium. We use monthly wheat, maize, and soya prices from the United States, Argentina, and Brazil to demonstrate this model. Classical estimation of these generalized models can present challenges but Bayesian techniques avoid many of these problems. Evidence for thresholds is found in three of the five commodity price pairs investigated.
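The threshold idea can be sketched with a band-TAR error-correction process: the price gap between two markets mean-reverts only when it exceeds a transaction-cost band, and the threshold is recovered by profiling the fit over a grid of candidates. The simulation, the candidate grid and the least-squares fit below are illustrative assumptions, not the paper's Bayesian generalized threshold error correction model.

```python
import random

def simulate_gap(n, c, rho, sigma, seed=0):
    """Band-TAR process: the gap mean-reverts toward the band edge
    only when it lies outside the transaction-cost band [-c, c]."""
    random.seed(seed)
    g, path = 0.0, []
    for _ in range(n):
        if g > c:
            adj = -rho * (g - c)
        elif g < -c:
            adj = -rho * (g + c)
        else:
            adj = 0.0
        g = g + adj + random.gauss(0.0, sigma)
        path.append(g)
    return path

def ssr_for_threshold(path, c):
    """Fit delta_gap = -rho * excess(c) by least squares; return (SSR, rho_hat)."""
    xs, ys = [], []
    for t in range(len(path) - 1):
        g = path[t]
        excess = (g - c) if g > c else ((g + c) if g < -c else 0.0)
        xs.append(excess)
        ys.append(path[t + 1] - path[t])
    sxx = sum(x * x for x in xs)
    rho_hat = -sum(x * y for x, y in zip(xs, ys)) / sxx if sxx > 0 else 0.0
    ssr = sum((y + rho_hat * x) ** 2 for x, y in zip(xs, ys))
    return ssr, rho_hat

path = simulate_gap(4000, c=1.0, rho=0.5, sigma=0.3)
grid = [0.25, 0.5, 0.75, 1.0, 1.25, 1.5]
best_c, (best_ssr, best_rho) = min(((c, ssr_for_threshold(path, c)) for c in grid),
                                   key=lambda t: t[1][0])
```

This grid-profiling step is where classical estimation becomes awkward (the likelihood is not smooth in the threshold), which is one reason the paper turns to Bayesian techniques.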

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a new method for the inclusion of nonlinear demand and supply relationships within a linear programming model. An existing method for this purpose is described first, and its shortcomings are pointed out, before showing how the new approach overcomes those difficulties: it provides a more accurate and smooth (rather than kinked) approximation of the nonlinear functions, and it deals with equilibrium under perfect competition instead of handling only the monopolistic situation. The workings of the proposed method are illustrated by extending a previously available sectoral model of UK agriculture.
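The standard device for embedding a nonlinear (here concave) function in an LP is to replace it by linear segments; because marginal values then decrease along the segments, the LP fills them in order, which a greedy pass reproduces exactly. The quadratic benefit function and the cost figure below are invented for illustration; they are not the sectoral model's functions.

```python
def piecewise_segments(f, lo, hi, k):
    """Split [lo, hi] into k equal segments; each carries (capacity, avg slope)."""
    w = (hi - lo) / k
    segs = []
    for i in range(k):
        a = lo + i * w
        segs.append((w, (f(a + w) - f(a)) / w))   # (width, marginal benefit)
    return segs

def maximize_concave(f, cost, lo, hi, k):
    """Use every segment whose marginal benefit exceeds the marginal cost.
    For concave f this greedy fill coincides with the LP optimum over
    the segment variables."""
    q = 0.0
    for width, slope in piecewise_segments(f, lo, hi, k):
        if slope > cost:
            q += width
    return q

f = lambda q: 10.0 * q - q * q         # concave benefit curve (illustrative)
q_star = maximize_concave(f, cost=4.0, lo=0.0, hi=5.0, k=10)
```

The analytic optimum of f(q) - 4q is q = 3; with breakpoints landing on it, the segment approximation reproduces it exactly, and refining the grid (larger k) is what smooths the kinks the abstract refers to.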

Relevance:

100.00%

Publisher:

Abstract:

Background: The present paper investigates the question of a suitable basic model for the number of scrapie cases in a holding, and applies this knowledge to the estimation of the size of the scrapie-affected holding population and to assessing the adequacy of within-holding control measures. Is the number of scrapie cases proportional to the size of the holding, in which case holding size should be incorporated into the parameter of the error distribution for the scrapie counts? Or is there a different, potentially more complex, relationship between case count and holding size, in which case the holding size information would be better incorporated as a covariate in the modelling? Methods: We show that this question can be appropriately addressed via a simple zero-truncated Poisson model, in which the hypothesis of proportionality enters as a special offset model. Model comparisons can be achieved by means of likelihood ratio testing. The procedure is illustrated by means of surveillance data on classical scrapie in Great Britain. Furthermore, the best-fitting model is used to estimate the size of the scrapie-affected holding population in Great Britain by means of two capture-recapture estimators: the Poisson estimator and the generalized Zelterman estimator. Results: No evidence could be found for the hypothesis of proportionality. In fact, there is some evidence that this relationship follows a curved line, which increases for small holdings up to a maximum and then declines again. Furthermore, it is pointed out how crucial the correct model choice is when capture-recapture estimation is applied on the basis of zero-truncated Poisson models, as well as on the basis of the generalized Zelterman estimator: estimators based on the proportionality model return very different and unreasonable estimates of the population sizes.
Conclusion: Our results stress the importance of an adequate modelling approach to the association between holding size and the number of cases of classical scrapie within a holding. Reporting artefacts and speculative biological effects are hypothesized as the underlying causes of the observed curved relationship. The lack of adjustment for these artefacts might well render the current strategies for the control of the disease ineffective.
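The zero-truncated Poisson capture-recapture step can be sketched directly: fit λ from the observed (necessarily nonzero) counts via the truncated-mean equation λ/(1 − e^(−λ)) = x̄, then inflate the number of observed holdings by the estimated probability of showing at least one case (a Horvitz-Thompson-type Poisson estimator). The counts below are invented for illustration, not the Great Britain surveillance data.

```python
import math

def fit_ztp_lambda(xbar, tol=1e-12):
    """Solve xbar = lam / (1 - exp(-lam)) for lam by bisection;
    the left side is increasing in lam, so bisection is safe."""
    lo, hi = 1e-9, max(2.0 * xbar, 1.0)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid / (1.0 - math.exp(-mid)) < xbar:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def horvitz_thompson_N(counts):
    """Estimate the total holding population, including the unobserved
    zero-case class, from zero-truncated Poisson counts."""
    n = len(counts)
    xbar = sum(counts) / n
    lam = fit_ztp_lambda(xbar)
    p_observed = 1.0 - math.exp(-lam)   # P(at least one case)
    return n / p_observed, lam

# Illustrative per-holding case counts (each observed holding has >= 1 case)
counts = [1] * 60 + [2] * 25 + [3] * 10 + [4] * 5
N_hat, lam_hat = horvitz_thompson_N(counts)
```

The abstract's warning applies here: this estimator is only as good as the count model, so if the proportionality (offset) model is wrong, the same machinery returns unreasonable population sizes.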

Relevance:

100.00%

Publisher:

Abstract:

Kinetic studies on the AR (aldose reductase) protein have shown that it does not behave as a classical enzyme in relation to ring aldose sugars. As with non-enzymatic glycation reactions, there is probably a free radical element involved, derived from monosaccharide autoxidation. In the case of AR, there is free radical oxidation of NADPH by autoxidizing monosaccharides, which is enhanced in the presence of the NADPH-binding protein. Thus any assay for AR based on the oxidation of NADPH in the presence of autoxidizing monosaccharides is invalid, and tissue AR measurements based on this method are also invalid and should be reassessed. AR exhibits broad specificity for both hydrophilic and hydrophobic aldehydes, which suggests that the protein may be involved in detoxification. The last thing we would want to do is to inhibit it. ARIs (AR inhibitors) have a number of actions in the cell which are not specific and which do not involve binding to AR; these include peroxy-radical scavenging and effects of metal ion chelation. The AR/ARI story emphasizes the importance of correct experimental design in all biocatalytic experiments. Developing the use of Bayesian utility functions, we have used a systematic method to identify the optimum experimental designs for a number of kinetic model data sets. This has led to the identification of trends between kinetic model types and sets of design rules, and to the key conclusion that such designs should be based on some prior knowledge of Km and/or the kinetic model. We suggest an optimal and iterative method for selecting features of the design, such as the substrate range, the number of measurements and the choice of intermediate points. The final design collects data suitable for accurate modelling and analysis, minimizes the error in the estimated parameters, and is suitable for simple or complex steady-state models.
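A simplified, locally optimal stand-in for the Bayesian design machinery: candidate substrate-concentration designs for a Michaelis-Menten model can be ranked by the determinant of the Fisher information (D-optimality), evaluated at prior guesses of Vmax and Km. The model, the parameter guesses and the candidate designs below are assumptions for illustration; note that the ranking itself depends on the assumed Km, echoing the conclusion above that designs should rest on prior knowledge of Km.

```python
def mm_sensitivities(s, vmax, km):
    """Gradient of v = vmax * s / (km + s) with respect to (vmax, km)."""
    return (s / (km + s), -vmax * s / (km + s) ** 2)

def d_criterion(design, vmax, km):
    """Determinant of the 2x2 Fisher information matrix for a design
    (a tuple of substrate concentrations); larger is better."""
    m = [[0.0, 0.0], [0.0, 0.0]]
    for s in design:
        j = mm_sensitivities(s, vmax, km)
        for a in range(2):
            for b in range(2):
                m[a][b] += j[a] * j[b]
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

vmax_guess, km_guess = 1.0, 1.0   # assumed prior values, for illustration
candidates = [(10.0, 10.0), (5.0, 10.0), (1.0, 10.0), (0.5, 10.0)]
best = max(candidates, key=lambda d: d_criterion(d, vmax_guess, km_guess))
```

Two replicate measurements at the same high concentration carry no information about Km (the information matrix is singular), whereas pairing a high point with one near the assumed Km does; an iterative scheme would re-evaluate the criterion as the parameter estimates improve.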