905 results for Quantum computational complexity


Relevance:

20.00%

Publisher:

Abstract:

This thesis presents methods for locating and analyzing cis-regulatory DNA elements involved in the regulation of gene expression in multicellular organisms. The regulation of gene expression is carried out by the combined effort of several transcription factor proteins collectively binding the DNA at the cis-regulatory elements. Only sparse knowledge of the 'genetic code' of these elements exists today. An automatic tool for the discovery of putative cis-regulatory elements could aid their experimental analysis, which would in turn yield a more detailed view of cis-regulatory element structure and function. We have developed a computational model for the evolutionary conservation of cis-regulatory elements. The elements are modeled as evolutionarily conserved clusters of sequence-specific transcription factor binding sites. We give an efficient dynamic programming algorithm that locates the putative cis-regulatory elements and scores them according to the conservation model. A notable proportion of the high-scoring DNA sequences show transcriptional enhancer activity in transgenic mouse embryos. The conservation model includes four parameters whose optimal values are estimated with simulated annealing. With good parameter values the model discriminates well between DNA sequences with evolutionarily conserved cis-regulatory elements and DNA sequences that have evolved neutrally. On further inquiry, the set of highest-scoring putative cis-regulatory elements was found to be sensitive to small variations in the parameter values. The statistical significance of the putative cis-regulatory elements is estimated with the Two Component Extreme Value Distribution. The p-values grade the conservation of the cis-regulatory elements above the neutral expectation. The parameter values for the distribution are estimated by simulating neutral DNA evolution. The conservation of transcription factor binding sites can be used in the upstream analysis of regulatory interactions. This approach may provide mechanistic insight into transcription-level data from, e.g., microarray experiments. Here we give a method to predict shared transcriptional regulators for a set of co-expressed genes. The EEL (Enhancer Element Locator) software implements the method for locating putative cis-regulatory elements. The software supports both interactive use and distributed batch processing. We have used it to analyze the non-coding regions around all human genes with respect to the orthologous regions in various other species, including mouse. The data from these genome-wide analyses are stored in a relational database, which is used in the publicly available web services for upstream analysis and visualization of putative cis-regulatory elements in the human genome.
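
The core computational idea here, chaining conserved, collinear binding-site pairs between two orthologous sequences under a distance penalty, can be illustrated with a minimal sketch. This is our own toy illustration, not the EEL implementation; the site lists, factor names, and gap penalty are hypothetical.

    # Minimal sketch of scoring an evolutionarily conserved cluster of
    # transcription factor binding sites shared by two orthologous sequences.
    # Toy data and penalty; not the EEL algorithm.

    # Each predicted site: (position, transcription factor, match score)
    human = [(100, "GATA1", 7.2), (180, "TAL1", 5.1), (260, "GATA1", 6.4)]
    mouse = [(90,  "GATA1", 6.8), (175, "TAL1", 4.9), (300, "GATA1", 5.5)]

    GAP = 0.01  # penalty per base of inter-site distance difference (assumed)

    def chain_score(a, b):
        """Dynamic program chaining collinear pairs of same-factor sites."""
        pairs = [(i, j, sa + sb)
                 for i, (pa, fa, sa) in enumerate(a)
                 for j, (pb, fb, sb) in enumerate(b) if fa == fb]
        best = {}
        for i, j, s in pairs:  # pairs are generated in increasing i
            best[(i, j)] = s + max(
                [best[(i2, j2)]
                 - GAP * abs((a[i][0] - a[i2][0]) - (b[j][0] - b[j2][0]))
                 for (i2, j2) in best if i2 < i and j2 < j] + [0.0])
        return max(best.values()) if best else 0.0

    print(chain_score(human, mouse))  # score of the best conserved cluster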

Relevance:

20.00%

Publisher:

Abstract:

In visual object detection and recognition, classifiers have two interesting characteristics: accuracy and speed. Accuracy depends on the complexity of the image features and classifier decision surfaces. Speed depends on the hardware and on the computational effort required to use the features and decision surfaces. When attempts to increase accuracy lead to increases in complexity and effort, it is necessary to ask how much we are willing to pay for increased accuracy. For example, if increased computational effort implies quickly diminishing returns in accuracy, then those designing inexpensive surveillance applications cannot aim for maximum accuracy at any cost. It becomes necessary to find trade-offs between accuracy and effort. We study efficient classification of images depicting real-world objects and scenes. Classification is efficient when a classifier can be controlled so that the desired trade-off between accuracy and effort (speed) is achieved and unnecessary computations are avoided on a per-input basis. A framework is proposed for understanding and modeling efficient classification of images, in which classification is modeled as a tree-like process. In designing the framework, it is important to recognize what is essential and to avoid structures that are narrow in applicability; earlier frameworks are lacking in this regard. The overall contribution is twofold. First, the framework is presented, subjected to experiments, and shown to be satisfactory. Second, certain unconventional approaches are experimented with, which allows the separation of the essential from the conventional. To determine whether the framework is satisfactory, three categories of questions are identified: trade-off optimization, classifier tree organization, and rules for delegation and confidence modeling. Questions and problems related to each category are addressed and empirical results are presented. For example, related to trade-off optimization, we address the problem of computational bottlenecks that limit the range of trade-offs, and we ask whether accuracy-versus-effort trade-offs can be controlled after training. Regarding classifier tree organization, we first consider the task of organizing a tree in a problem-specific manner and then ask whether problem-specific organization is necessary.
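
A concrete way to see the "controllable trade-off" idea is a two-stage classifier in which a cheap stage delegates to an expensive stage only when its confidence is low; sweeping a single threshold after training then traverses the accuracy-effort curve. The sketch below is ours, not the thesis framework; the stage classifiers, effort units and thresholds are stand-ins.

    # Sketch of confidence-based delegation between a cheap and a costly
    # classifier; the threshold tau controls accuracy vs. effort after
    # training. Illustrative assumptions throughout.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    Xtr, ytr, Xte, yte = X[:1000], y[:1000], X[1000:], y[1000:]

    cheap = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
    costly = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)

    def predict(x, tau):
        """Delegate to the costly stage only when the cheap stage is unsure."""
        p = cheap.predict_proba(x.reshape(1, -1))[0]
        if p.max() >= tau:                   # confident: stop early
            return p.argmax(), 1             # (label, effort units)
        return costly.predict(x.reshape(1, -1))[0], 10  # assumed 10x cost

    for tau in (0.6, 0.8, 0.95):             # sweep the trade-off knob
        preds, effort = zip(*(predict(x, tau) for x in Xte))
        print(tau, np.mean(np.array(preds) == yte), np.mean(effort))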

Relevance:

20.00%

Publisher:

Abstract:

The paradigm of computational vision hypothesizes that any visual function, such as recognizing your grandparent, can be replicated by computational processing of the visual input. What are these computations that the brain performs? What should or could they be? Working on the latter question, this dissertation takes the statistical approach, in which the suitable computations are learned from natural visual data itself. In particular, we empirically study the computational processing that emerges from the statistical properties of the visual world and from the constraints and objectives specified for the learning process. This thesis consists of an introduction and seven peer-reviewed publications; the purpose of the introduction is to illustrate the area of study to a reader who is not familiar with computational vision research. In the introduction, we briefly overview the primary challenges to visual processing and recall some of the current opinions on visual processing in the early visual systems of animals. Next, we describe the methodology we have used in our research and discuss the presented results, including some additional remarks, speculations and conclusions that were not featured in the original publications. We present the following results in the publications of this thesis. First, we empirically demonstrate that luminance and contrast are strongly dependent in natural images, contradicting previous theories suggesting that luminance and contrast are processed separately in natural systems because of their independence in the visual data. Second, we show that simple-cell-like receptive fields of the primary visual cortex can be learned in the nonlinear contrast domain by maximization of independence. Further, we provide the first reports of the emergence of conjunctive (corner-detecting) and subtractive (opponent-orientation) processing due to nonlinear projection pursuit with simple objective functions related to sparseness and response-energy optimization. Then, we show that attempting to extract independent components of nonlinear histogram statistics of a biologically plausible representation leads to projection directions that appear to differentiate between visual contexts. Such processing might be applicable for priming, i.e., the selection and tuning of later visual processing. We continue by showing that a different kind of thresholded low-frequency priming can be learned and used to make object detection faster with little loss in accuracy. Finally, we show that in a computational object detection setting, nonlinearly gain-controlled visual features of medium complexity can be acquired sequentially as images are encountered and discarded. We present two online algorithms to perform this feature selection, and propose the idea that for artificial systems, some processing mechanisms could be selected from the environment without optimizing the mechanisms themselves. In summary, this thesis explores learning visual processing on several levels. The learning can be understood as an interplay of input data, model structures, learning objectives, and estimation algorithms. The presented work adds to the growing body of evidence that statistical methods can be used to acquire intuitively meaningful visual processing mechanisms. The work also presents some predictions and ideas regarding biological visual processing.
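
As a pointer to how "receptive fields learned by maximization of independence" is typically realized in practice, the following sketch runs FastICA on mean-removed patches sampled from a single stock image; with a genuine natural-image corpus, the learned components come out as localized, oriented (Gabor-like) filters. The patch size, component count and image are arbitrary stand-ins; this is not the publications' code.

    # Sketch: learn ICA basis functions ("simple-cell-like" filters) from
    # image patches. A stock photo stands in for a natural-image corpus.
    import numpy as np
    from sklearn.decomposition import FastICA
    from skimage import data

    img = data.camera().astype(float)      # stand-in for natural images
    rng = np.random.default_rng(0)
    P = 12                                 # patch edge length (assumed)
    rows = rng.integers(0, img.shape[0] - P, 20000)
    cols = rng.integers(0, img.shape[1] - P, 20000)
    patches = np.array([img[r:r + P, c:c + P].ravel()
                        for r, c in zip(rows, cols)])
    patches -= patches.mean(axis=1, keepdims=True)   # remove patch DC

    ica = FastICA(n_components=64, whiten="unit-variance", random_state=0)
    ica.fit(patches)
    filters = ica.components_.reshape(64, P, P)      # oriented, localized
    print(filters.shape)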

Relevance:

20.00%

Publisher:

Abstract:

This thesis presents a highly sensitive genome-wide search method for recessive mutations. The method is suitable for distantly related samples divided into phenotype positives and negatives. High-throughput genotype arrays are used to identify and compare homozygous regions between the cohorts. The method is demonstrated by comparing colorectal cancer patients against unaffected references; the objective is to find homozygous regions and alleles that are more common in the cancer patients. We have designed and implemented software tools to automate the data analysis from genotypes to lists of candidate genes and their properties. The programs follow a pipeline architecture that allows their integration with other programs, such as biological databases and copy-number analysis tools. The integration of the tools is crucial, as the genome-wide analysis of cohort differences produces many candidate regions not related to the studied phenotype. CohortComparator is a genotype comparison tool that detects homozygous regions and compares their loci and allele constitutions between two sets of samples. The data are visualised in chromosome-specific graphs illustrating the homozygous regions and alleles of each sample. The genomic regions that may harbour recessive mutations are emphasised with different colours, and a scoring scheme is given for these regions. The detection of homozygous regions, the cohort comparisons, and the result annotations all rest on assumptions, many of which have been parameterized in our programs. The effect of these parameters and the suitable scope of the methods have been evaluated. Samples genotyped at different resolutions can be balanced using genotype estimates of their haplotypes, and can thus be used within the same study.
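
The essence of the cohort comparison can be sketched in a few lines: find runs of homozygosity (ROH) per sample, then measure how much more often each marker lies inside an ROH in the affected cohort than in the references. This toy sketch is not CohortComparator; the genotype encoding and run-length threshold are assumed.

    # Sketch: runs of homozygosity (ROH) and their per-marker cohort excess.
    MIN_RUN = 25  # minimum homozygous markers per run (assumed threshold)

    def roh_mask(genotypes):
        """genotypes: list of 'AA'/'AB'/'BB' calls along one chromosome."""
        mask, run = [False] * len(genotypes), []
        for i, g in enumerate(genotypes):
            if g in ("AA", "BB"):
                run.append(i)
                continue
            if len(run) >= MIN_RUN:
                for j in run:
                    mask[j] = True
            run = []
        if len(run) >= MIN_RUN:
            for j in run:
                mask[j] = True
        return mask

    def cohort_difference(cases, controls):
        """Per-marker excess of ROH frequency in cases over controls."""
        fc = [sum(col) / len(cases) for col in zip(*map(roh_mask, cases))]
        fr = [sum(col) / len(controls) for col in zip(*map(roh_mask, controls))]
        return [c - r for c, r in zip(fc, fr)]

    cases = [["AA"] * 40 + ["AB"] * 10, ["BB"] * 40 + ["AB"] * 10]
    controls = [["AB", "AA"] * 25, ["AA", "AB"] * 25]
    print(max(cohort_difference(cases, controls)))  # peak case excess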

Relevance:

20.00%

Publisher:

Abstract:

The possible nonplanar distortions of the amide group in formamide, acetamide, N-methylacetamide, and N-ethylacetamide have been examined using CNDO/2 and INDO methods. The predictions from these methods are compared with the results obtained from X-ray and neutron diffraction studies on crystals of small open peptides, cyclic peptides, and amides. It is shown that the INDO results are in good agreement with observations, and that the dihedral angles θN and Δω defining the nonplanarity of the amide unit are correlated approximately by the relation θN = −2Δω, while θC is small and uncorrelated with Δω. The present study indicates that the nonplanar distortions at the nitrogen atom of the peptide unit may have to be taken into consideration, in addition to the variation in the dihedral angles (φ, ψ), in working out polypeptide and protein structures.
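
Restating the reconstructed relation explicitly (our notation; the Greek symbols were lost in extraction, so this is a hedged reading in which Δω is the deviation of the peptide torsion ω from planarity):

    \Delta\omega = \omega - 180^{\circ}, \qquad
    \theta_{N} \approx -2\,\Delta\omega, \qquad
    \theta_{C} \approx 0 \quad (\text{uncorrelated with } \Delta\omega)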

Relevance:

20.00%

Publisher:

Abstract:

In this work we numerically model isothermal turbulent swirling flow in a cylindrical burner. Three versions of the RNG k-epsilon model are assessed against the performance of the standard k-epsilon model. The sensitivity of the numerical predictions to grid refinement, to differing convective differencing schemes, and to the choice of the (unknown) inlet dissipation rate was closely scrutinised to ensure accuracy. Particular attention is paid to modelling the inlet conditions to within the range of uncertainty of the experimental data, as model predictions proved significantly sensitive to relatively small changes in upstream flow conditions. We also examine the characteristics of the swirl-induced internal recirculation zone (IRZ) predicted by the models over an extended range of inlet conditions. Our main findings are: (i) the standard k-epsilon model performed best compared with experiment; (ii) no one inlet specification can simultaneously optimize the performance of the models considered; (iii) the RNG models predict both single-cell and double-cell IRZ characteristics, the latter both with and without additional internal stagnation points. The first finding indicates that the examined RNG modifications to the standard k-epsilon model do not result in an improved eddy-viscosity-based model for the prediction of swirl flows. The second finding suggests that tuning established models a priori for optimal performance in swirl flows is not straightforward. The third finding indicates that the RNG-based models exhibit a greater variety of structural behaviour, despite being of the same level of complexity as the standard k-epsilon model. The plausibility of the predicted IRZ features is discussed in terms of known vortex breakdown phenomena.
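
For reference, the eddy-viscosity closure common to all the models assessed here is the standard high-Reynolds-number k-epsilon form; the RNG variants modify the coefficients of the dissipation-rate equation. The standard model constants are:

    \nu_{t} = C_{\mu} \frac{k^{2}}{\epsilon}, \qquad
    C_{\mu} = 0.09, \quad C_{\epsilon 1} = 1.44, \quad C_{\epsilon 2} = 1.92,
    \quad \sigma_{k} = 1.0, \quad \sigma_{\epsilon} = 1.3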

Relevance:

20.00%

Publisher:

Abstract:

We offer a procedure for evaluating the forces exerted by solitons of weak-coupling field theories on one another. We illustrate the procedure for the kink and the antikink of the two-dimensional φ⁴ theory. To do this, we construct analytically a static solution of the theory which can be interpreted as a kink and an antikink held a distance R apart. This leads to a definition of the potential energy U(R) for the pair, which is seen to have all the expected features. A corresponding evaluation is also done for U(R) between a soliton and an antisoliton of the sine-Gordon theory. When this U(R) is inserted into a nonrelativistic two-body problem for the pair, it yields a set of bound states and phase shifts. These are found to agree with exact results known for the sine-Gordon field theory in those regions where U(R) is expected to be significant, i.e., where R is large compared with the soliton size. We take this agreement as support that our procedure for defining U(R) yields the correct description of the dynamics of well-separated soliton pairs. An important feature of U(R) is that it seems to give strong intersoliton forces when the coupling constant is small, as distinct from the forces between the ordinary quanta of the theory. We suggest that this is a general feature of a class of theories, and emphasize the possible relevance of this feature to real, strongly interacting hadrons.
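
For orientation, in one common convention for the two-dimensional φ⁴ theory the static kink and the expected asymptotic form of the pair potential are as below; the meson (elementary-quantum) mass in this convention is √2·m, which sets the decay rate of U(R), and the prefactor A is left unspecified here:

    V(\phi) = \frac{\lambda}{4}\left(\phi^{2} - \frac{m^{2}}{\lambda}\right)^{2},
    \qquad
    \phi_{K}(x) = \frac{m}{\sqrt{\lambda}} \tanh\frac{m x}{\sqrt{2}},
    \qquad
    U(R) \approx -A\, e^{-\sqrt{2}\, m R} \quad (R \gg m^{-1})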

Relevance:

20.00%

Publisher:

Abstract:

Mammalian heparanase is an endo-β-glucuronidase associated with cell invasion in cancer metastasis, angiogenesis and inflammation. Heparanase cleaves heparan sulfate proteoglycans in the extracellular matrix and basement membrane, releasing heparin/heparan sulfate oligosaccharides of appreciable size. This in turn causes the release of growth factors, which accelerate tumor growth and metastasis. Heparanase has two glycosaminoglycan-binding domains; however, no three-dimensional structure of human heparanase is available to provide insight into how the two domains interact to degrade heparin fragments. We have constructed a new homology model of heparanase that takes into account the most recent structural and bioinformatics data available. Heparin analogs and glycosaminoglycan mimetics were computationally docked into the active site with energetically stable ring conformations, and their interaction energies were compared. The resulting docked structures were used to propose a model for substrate and conformer selectivity based on the dimensions of the active site. The docking of substrates and inhibitors indicates the existence of a large binding site extending at least two saccharide units beyond the cleavage site (toward the nonreducing end) and at least three saccharides toward the reducing end (toward heparin-binding site 2). The docking of substrates suggests that heparanase recognizes N-sulfated and O-sulfated glucosamine at subsite +1 and glucuronic acid at the cleavage site, whereas in the absence of 6-O-sulfation of the glucosamine, the glucuronic acid docks at subsite +2. These findings will help focus the rational design of heparanase-inhibiting molecules for anticancer drug development targeting the two heparin/heparan sulfate recognition domains.

Relevance:

20.00%

Publisher:

Abstract:

The past several years have seen significant advances in the development of computational methods for predicting the structure and interactions of coiled-coil peptides. These methods are generally based on pairwise correlations of amino acids, helical propensity, thermal melts and the energetics of side-chain interactions, as well as statistical patterns based on Hidden Markov Model (HMM) and Support Vector Machine (SVM) techniques. They are complemented by a number of public databases containing sequences, motifs, domains and other details of coiled-coil structures identified by various algorithms. Some of these computational methods predict coiled-coil structure from sequence information alone; however, structural prediction of the oligomerisation state of these peptides remains largely an open question owing to the dynamic behaviour of these molecules. This review focuses on existing in silico methods for the prediction of coiled-coil peptides of functional importance using sequence and/or three-dimensional structural data.

Relevance:

20.00%

Publisher:

Abstract:

Information exchange (IE) is a critical component of the complex collaborative medication process in residential aged care facilities (RACFs). Designing information and communication technology (ICT) to support complex processes requires a profound understanding of the IE that underpins their execution. Little existing research investigates the complexity of IE in RACFs and its impact on ICT design. The aim of this study was thus to undertake an in-depth exploration of the IE process involved in medication management and to identify its implications for the design of ICT. The study was undertaken at a large metropolitan facility in NSW, Australia. A total of three focus groups, eleven interviews and two observation sessions were conducted between July and August 2010. Process modelling was undertaken by translating the qualitative data through in-depth, iterative inductive analysis. The findings highlight the complexity and collaborative nature of IE in RACF medication management. The resulting models emphasize the need to: a) deal with temporal complexity; b) rely on an interdependent set of coordinative artefacts; and c) use synchronous communication channels for coordination. Taken together, these are crucial aspects of the IE process in RACF medication management that need to be catered for when designing ICT in this critical area. This study provides important new evidence of the advantages of viewing a process as part of a system, rather than as segregated tasks, as a means of identifying the latent requirements for ICT design able to support complex collaborative processes like medication management in RACFs. © 2012 IEEE.

Relevance:

20.00%

Publisher:

Abstract:

The complexity of life is based on an effective energy transduction machinery that has evolved over the last 3.5 billion years. In aerobic life, this machinery is powered by the high oxidizing potential of molecular oxygen. Oxygen is safely reduced by a membrane-bound enzyme, cytochrome c oxidase (CcO), producing an electrochemical proton gradient across the mitochondrial or bacterial membrane. This gradient is used for energy-requiring reactions, such as the synthesis of ATP by the F0F1-ATPase, and for active transport. In this thesis, the molecular mechanism by which CcO couples the oxygen reduction chemistry to proton pumping has been studied by theoretical computer simulations. Classical and quantum mechanical model systems were built on the basis of the X-ray structure of CcO from Bos taurus, and the dynamics and energetics of the system were studied in different intermediate states of the enzyme. As a result of this work, a mechanism was suggested by which CcO can prevent protons from leaking backwards during proton pumping. The use and activation of the two proton-conducting channels were also elucidated, together with a mechanism by which CcO sorts the chemical protons from the pumped protons. The latter problem is referred to as the gating mechanism of CcO and has remained a challenge in the bioenergetics field for more than three decades. Furthermore, a new method was developed for deriving charge parameters for classical simulations of complex metalloenzymes.

Relevance:

20.00%

Publisher:

Abstract:

Because of limited sensor and communication ranges, designing efficient mechanisms for cooperative tasks is difficult. In this article, several negotiation schemes for multiple agents performing a cooperative task are presented. The negotiation schemes provide suboptimal solutions but have the attractive features of fast decision-making and scalability to a large number of agents without increasing the complexity of the algorithm. A software agent architecture for the decision-making process is also presented. The effect of the magnitude of information flow during the negotiation process is studied by using different models of the negotiation scheme. The performance of the various negotiation schemes, using different information structures, is studied in terms of the uncertainty reduction achieved for a specified number of search steps. The negotiation schemes perform comparably to the optimal strategy in terms of uncertainty reduction while requiring very low computational time, roughly 7 per cent of that of the optimal strategy. Finally, an analysis of the computational and communication requirements of the negotiation schemes is carried out.
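
One round of such a scheme can be sketched simply: each agent claims, in a fixed priority order, the reachable cell with the highest remaining uncertainty, and searching a cell reduces its uncertainty. This greedy toy is ours, not one of the article's schemes; the grid size, sensing radius and uncertainty-reduction factor are assumed.

    # Sketch of one greedy negotiation round for cooperative search.
    import numpy as np

    rng = np.random.default_rng(1)
    U = rng.random((10, 10))            # shared uncertainty map (assumed)
    agents = [(0, 0), (9, 9), (5, 5)]   # agent positions on the grid

    def reachable(pos, radius=2):
        r, c = pos
        return [(i, j)
                for i in range(max(0, r - radius), min(10, r + radius + 1))
                for j in range(max(0, c - radius), min(10, c + radius + 1))]

    def negotiate(agents, U):
        """Assign each agent a distinct cell, highest uncertainty first."""
        assignment, taken = {}, set()
        for a in agents:                # fixed priority order resolves conflicts
            options = sorted(reachable(a), key=lambda ij: U[ij], reverse=True)
            choice = next(ij for ij in options if ij not in taken)
            assignment[a] = choice
            taken.add(choice)
        return assignment

    for agent, cell in negotiate(agents, U).items():
        U[cell] *= 0.5                  # searching halves local uncertainty
        print(agent, "->", cell)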

Relevance:

20.00%

Publisher:

Abstract:

This paper describes, formalizes and implements an approach to computational creativity based on situated interpretation. The paper introduces the notions of framing and reframing of conceptual spaces, grounded in empirical studies, as the driver for this research. It uses concepts from situated cognition, and situated interpretation in particular, as the basis of a formal model of the movement between conceptual spaces. This model is implemented using rules within interacting neural networks, and the implementation demonstrates behaviour similar to that observed in studies of human designers.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we study Einstein's photoemission from III-V, II-VI, IV-VI and HgTe/CdTe quantum well superlattices (QWSLs) with graded interfaces, and from quantum well effective-mass superlattices, in the presence of a quantizing magnetic field, on the basis of newly formulated dispersion relations for the respective cases. The same is also studied for quantum dot superlattices of the aforementioned materials, and it appears that the photoemission oscillates with increasing carrier degeneracy and quantizing magnetic field in different ways. In addition, the photoemission oscillates with film thickness and with increasing photon energy in quantum steps; solving the Boltzmann transport equation for these systems is expected to yield new physical ideas and new experimental findings under different external conditions. The influence of the band structure is apparent from all the figures, and we suggest three applications of the present analyses in the fields of superlattices and microstructures.