14 results for EFFICIENT ROUTE

in Helda - Digital Repository of the University of Helsinki


Relevance:

20.00%

Publisher:

Abstract:

Breast cancer is the most common cancer in women in Western countries. In the early stages of development, most breast cancers are hormone-dependent, and estrogens, especially estradiol, have a pivotal role in their development and progression. One approach to the treatment of hormone-dependent breast cancers is to block the formation of the active estrogens by inhibiting the action of the steroid-metabolising enzymes. 17beta-Hydroxysteroid dehydrogenase type 1 (17beta-HSD1) is a key enzyme in the biosynthesis of estradiol, the most potent female sex hormone. The 17beta-HSD1 enzyme catalyses the final step, converting estrone into the biologically active estradiol. Blocking 17beta-HSD1 activity with a specific enzyme inhibitor could provide a means to reduce circulating and tumour estradiol levels and thus promote tumour regression. In recent years 17beta-HSD1 has been recognised as an important drug target. Some inhibitors of 17beta-HSD1 have been reported; however, none is on the market, nor have clinical trials been announced. The majority of known 17beta-HSD1 inhibitors are based on steroidal structures, while relatively little has been reported on non-steroidal inhibitors. Compared with steroidal 17beta-HSD1 inhibitors, non-steroidal compounds could offer advantages in synthetic accessibility, drug-likeness, selectivity and non-estrogenicity. This study describes the synthesis of a large group of novel 17beta-HSD1 inhibitors based on a non-steroidal thieno[2,3-d]pyrimidin-4(3H)-one core. An efficient synthesis route was developed for the lead compound and subsequently employed in the synthesis of a thieno[2,3-d]pyrimidin-4(3H)-one based molecule library. The biological activities and binding of these inhibitors to 17beta-HSD1 and, finally, the quantitative structure-activity relationship (QSAR) model are also reported.
In this study, several potent and selective 17beta-HSD1 inhibitors without estrogenic activity were identified. The establishment of this novel class of inhibitors is a notable advance in 17beta-HSD1 inhibitor development. Furthermore, the 3D-QSAR model constructed on the basis of this study offers a powerful tool for future 17beta-HSD1 inhibitor development. As part of the fundamental science underpinning this research, the chemical reactivity of fused (di)cycloalkeno thieno[2,3-d]pyrimidin-4(3H)-ones with electrophilic reagents, i.e. the Vilsmeier reagent and dimethylformamide dimethylacetal, was investigated. These findings led to a revision of the reaction mechanism of Vilsmeier haloformylation and further contributed to understanding the chemical reactivity of this compound class. This study revealed that the reactivity depends on a stereoelectronic effect arising from different ring conformations.
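The QSAR modelling mentioned above fits a statistical relationship between molecular descriptors and measured activities. The thesis builds a 3D-QSAR model; the sketch below is only a minimal linear analogue with invented descriptors and activities, fitted by ordinary least squares:

```python
def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_qsar(X, y):
    """Ordinary least squares: w solves the normal equations (X^T X) w = X^T y."""
    Xt = transpose(X)
    XtX = matmul(Xt, X)
    Xty = [sum(Xt[i][j] * y[j] for j in range(len(y))) for i in range(len(Xt))]
    return solve(XtX, Xty)

# Invented descriptor matrix: [1 (intercept), logP, polar surface area]
X = [[1, 2.1, 45.0], [1, 3.0, 38.0], [1, 1.5, 52.0], [1, 2.7, 41.0]]
# Invented activities, generated to be exactly linear for the demo
y = [6.0 + 0.8 * row[1] - 0.02 * row[2] for row in X]

w = fit_qsar(X, y)
print([round(c, 3) for c in w])   # recovers the generating coefficients [6.0, 0.8, -0.02]
```

Real QSAR work would use field- or fingerprint-based descriptors and validate the model on held-out compounds rather than fit a toy exact-linear set.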

Relevance:

20.00%

Publisher:

Abstract:

Event-based systems are seen as good candidates for supporting distributed applications in dynamic and ubiquitous environments because they support decoupled and asynchronous many-to-many information dissemination. Event systems are widely used because asynchronous messaging provides a flexible alternative to RPC (Remote Procedure Call). They are typically implemented using an overlay network of routers. A content-based router forwards event messages based on filters that are installed by subscribers and other routers. The filters are organized into a routing table in order to forward incoming events to the appropriate subscribers and neighbouring routers. This thesis addresses the optimization of content-based routing tables organized using the covering relation and presents novel data structures and configurations for improving local and distributed operation. Data structures are needed for organizing filters into a routing table that supports efficient matching and runtime operation. We present novel results on dynamic filter merging and the integration of filter merging with content-based routing tables. In addition, the thesis examines the cost of client mobility using different protocols and routing topologies. We also present a new matching technique called temporal subspace matching. The technique combines two new features. The first feature, temporal operation, supports notifications, or content profiles, that persist in time. The second feature, subspace matching, allows more expressive semantics, because notifications may contain intervals and be defined as subspaces of the content space. We also present an application of temporal subspace matching pertaining to metadata-based continuous collection and object tracking.
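A routing table organized by the covering relation only needs to propagate filters that are not subsumed by existing ones. The sketch below is a hypothetical simplification, assuming conjunctive filters of per-attribute intervals (the thesis's filters and merging machinery are more general):

```python
def covers(f1, f2):
    """f1 covers f2 iff every notification matching f2 also matches f1.
    Filters are dicts: attribute -> (lo, hi) interval constraints."""
    for attr, (lo1, hi1) in f1.items():
        if attr not in f2:
            return False          # f2 is less constrained on this attribute
        lo2, hi2 = f2[attr]
        if lo2 < lo1 or hi2 > hi1:
            return False          # f2's interval is not inside f1's
    return True

class RoutingTable:
    """Keep only non-covered filters: forwarding a covered subscription
    upstream would request no new traffic."""
    def __init__(self):
        self.filters = []

    def add(self, f):
        if any(covers(g, f) for g in self.filters):
            return False          # already covered: nothing to propagate
        # the new filter makes any filter it covers redundant
        self.filters = [g for g in self.filters if not covers(f, g)]
        self.filters.append(f)
        return True

rt = RoutingTable()
rt.add({"price": (0, 100)})
print(rt.add({"price": (10, 50)}))   # False: the broader filter already covers it
```

Dynamic filter merging goes a step further: several non-covered filters can be replaced by a single broader filter that covers them all.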

Relevance:

20.00%

Publisher:

Abstract:

The Minimum Description Length (MDL) principle is a general, well-founded theoretical formalization of statistical modeling. The most important notion of MDL is the stochastic complexity, which can be interpreted as the shortest description length of a given sample of data relative to a model class. The exact definition of the stochastic complexity has gone through several evolutionary steps. The latest instantiation is based on the so-called Normalized Maximum Likelihood (NML) distribution, which has been shown to possess several important theoretical properties. However, applications of this modern version of MDL have been quite rare because of computational complexity problems: for discrete data, the definition of NML involves an exponential sum, and in the case of continuous data, a multi-dimensional integral that is usually infeasible to evaluate or even approximate accurately. In this doctoral dissertation, we present mathematical techniques for computing NML efficiently for some model families involving discrete data. We also show how these techniques can be used to apply MDL in two practical applications: histogram density estimation and clustering of multi-dimensional data.
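For a concrete sense of the sum in the NML definition, the normalizer of the simplest discrete case, the Bernoulli model class with n binary observations, can be computed by brute force (names and structure here are illustrative, not from the dissertation):

```python
import math

def bernoulli_regret(n):
    """NML normalizer for the Bernoulli model class: sum over all possible
    counts of ones of the maximized likelihood (0**0 == 1 in Python
    correctly handles the k = 0 and k = n terms)."""
    return sum(math.comb(n, k) * (k / n) ** k * ((n - k) / n) ** (n - k)
               for k in range(n + 1))

def stochastic_complexity(data):
    """Stochastic complexity in bits: -log2 of the NML probability of the data."""
    n, k = len(data), sum(data)
    max_lik = (k / n) ** k * ((n - k) / n) ** (n - k)   # maximized likelihood
    return -math.log2(max_lik / bernoulli_regret(n))

print(round(stochastic_complexity([1, 0, 1, 1]), 2))   # 4.93 bits
```

Even this simplest case sums over all n + 1 counts; for richer discrete model families the sum runs over exponentially many data sets, which is exactly what motivates the efficient computation techniques developed in the dissertation.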

Relevance:

20.00%

Publisher:

Abstract:

Analyzing statistical dependencies is a fundamental problem in all empirical science. Dependencies help us understand causes and effects, create new scientific theories, and devise cures for problems. Nowadays, large amounts of data are available, but efficient computational tools for analyzing the data are missing. In this research, we develop efficient algorithms for a commonly occurring search problem - searching for the statistically most significant dependency rules in binary data. We consider dependency rules of the form X->A or X->not A, where X is a set of positive-valued attributes and A is a single attribute. Such rules describe which factors either increase or decrease the probability of the consequent A. A classical example is genetic and environmental factors, which can either cause or prevent a disease. The emphasis in this research is that the discovered dependencies should be genuine - i.e. they should also hold in future data. This is an important distinction from traditional association rules, which - in spite of their name and a similar appearance to dependency rules - do not necessarily represent statistical dependencies at all, or represent only spurious connections that occur by chance. Therefore, the principal objective is to search for rules using statistical significance measures. Another important objective is to search for only non-redundant rules, which express the real causes of dependence without any incidental extra factors. The extra factors do not add any new information on the dependence, but can only blur it and make it less accurate in future data. The problem is computationally very demanding, because the number of all possible rules increases exponentially with the number of attributes. In addition, neither statistical dependency nor statistical significance is a monotonic property, which means that traditional pruning techniques do not work.
As a solution, we first derive the mathematical basis for pruning the search space with any well-behaved statistical significance measure. The mathematical theory is complemented by a new algorithmic invention, which enables an efficient search without any heuristic restrictions. The resulting algorithm can be used to search for both positive and negative dependencies with any commonly used statistical measure, such as Fisher's exact test, the chi-squared measure, mutual information, and z scores. According to our experiments, the algorithm scales well, especially with Fisher's exact test. It can easily handle even the densest data sets with 10,000-20,000 attributes. Still, the results are globally optimal, which is a remarkable improvement over existing solutions. In practice, this means that the user does not have to worry whether the dependencies hold in future data or whether the data still contains better, but undiscovered, dependencies.
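To make the rule-search setting concrete, the sketch below brute-forces all rules X -> A on a tiny binary data set and scores them with a one-sided Fisher's exact test. The data and attribute names are invented, and this exhaustive loop is exactly what the thesis's pruning theory avoids:

```python
import math
from itertools import combinations

def fisher_p(a, b, c, d):
    """One-sided Fisher's exact test on the 2x2 table [[a, b], [c, d]]:
    probability of observing >= a co-occurrences under independence
    (hypergeometric upper tail)."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    return sum(math.comb(col1, i) * math.comb(n - col1, row1 - i)
               for i in range(a, min(row1, col1) + 1)) / math.comb(n, row1)

def best_rule(rows, target, max_size=2):
    """Exhaustive search for the rule X -> target with the smallest p-value
    (the thesis prunes this exponential space instead of enumerating it)."""
    attrs = [k for k in rows[0] if k != target]
    best = None
    for size in range(1, max_size + 1):
        for X in combinations(attrs, size):
            a = sum(1 for r in rows if all(r[x] for x in X) and r[target])
            b = sum(1 for r in rows if all(r[x] for x in X) and not r[target])
            c = sum(1 for r in rows if not all(r[x] for x in X) and r[target])
            d = len(rows) - a - b - c
            p = fisher_p(a, b, c, d)
            if best is None or p < best[1]:
                best = (X, p)
    return best

# Invented data: attribute "g" determines "disease"; "e" is noise.
rows = ([{"g": 1, "e": i % 2, "disease": 1} for i in range(4)]
        + [{"g": 0, "e": i % 2, "disease": 0} for i in range(4)])
print(best_rule(rows, "disease"))   # finds ('g',) with p = 1/70
```

Note that the redundant rule {g, e} -> disease scores worse than g -> disease alone, illustrating why the search favors non-redundant rules.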

Relevance:

20.00%

Publisher:

Abstract:

In visual object detection and recognition, classifiers have two interesting characteristics: accuracy and speed. Accuracy depends on the complexity of the image features and classifier decision surfaces. Speed depends on the hardware and the computational effort required to use the features and decision surfaces. When attempts to increase accuracy lead to increases in complexity and effort, it is necessary to ask how much we are willing to pay for increased accuracy. For example, if increased computational effort implies quickly diminishing returns in accuracy, then those designing inexpensive surveillance applications cannot aim for maximum accuracy at any cost. It becomes necessary to find trade-offs between accuracy and effort. We study efficient classification of images depicting real-world objects and scenes. Classification is efficient when a classifier can be controlled so that the desired trade-off between accuracy and effort (speed) is achieved and unnecessary computations are avoided on a per-input basis. A framework is proposed for understanding and modeling efficient classification of images, in which classification is modeled as a tree-like process. In designing the framework, it is important to recognize what is essential and to avoid structures that are narrow in applicability. Earlier frameworks are lacking in this regard. The overall contribution is two-fold. First, the framework is presented, subjected to experiments, and shown to be satisfactory. Second, certain unconventional approaches are experimented with, which allows the essential to be separated from the conventional. To determine whether the framework is satisfactory, three categories of questions are identified: trade-off optimization, classifier tree organization, and rules for delegation and confidence modeling. Questions and problems related to each category are addressed and empirical results are presented.
For example, related to trade-off optimization, we address the problem of computational bottlenecks that limit the range of trade-offs, and we ask whether accuracy-versus-effort trade-offs can be controlled after training. As another example, regarding classifier tree organization, we first consider the task of organizing a tree in a problem-specific manner and then ask whether problem-specific organization is necessary.
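The controllable accuracy-versus-effort trade-off can be illustrated with a toy delegating cascade: a cheap stage answers when its score is confident and otherwise delegates to a costlier stage. The stages, costs and threshold below are invented; the threshold tau plays the role of a post-training control knob:

```python
def cheap_stage(x):
    """Toy first-stage classifier: cheap score in [0, 1], cost 1 unit."""
    return 0.9 if x > 0.8 else (0.1 if x < 0.2 else 0.5)

def costly_stage(x):
    """Toy second-stage classifier: always decisive, cost 10 units."""
    return 1.0 if x >= 0.5 else 0.0

def classify(x, tau):
    """Return (label, effort); delegate when the cheap score is within tau of 0.5."""
    score, effort = cheap_stage(x), 1
    if abs(score - 0.5) < tau:      # not confident enough -> delegate
        score, effort = costly_stage(x), effort + 10
    return score >= 0.5, effort

print(classify(0.9, tau=0.3))   # (True, 1): the cheap stage suffices
print(classify(0.5, tau=0.3))   # (True, 11): delegated to the costly stage
print(classify(0.5, tau=0.0))   # (True, 1): lowering tau trades accuracy for speed
```

Raising tau buys accuracy with more delegation (effort); lowering it does the opposite, and unnecessary computation is avoided per input, as in the framework described above.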

Relevance:

20.00%

Publisher:

Abstract:

The work covered in this thesis focuses on the development of technology for the bioconversion of glucose into D-erythorbic acid (D-EA) and 5-ketogluconic acid (5-KGA). The task was to show, at proof-of-concept level, the functionality of the enzymatic conversion or one-step bioconversion of glucose to these acids. The feasibility of developing both approaches further into production processes was also evaluated. The glucose - D-EA bioconversion study was based on the use of a cloned gene encoding a D-EA forming soluble flavoprotein, D-gluconolactone oxidase (GLO). GLO was purified from Penicillium cyaneo-fulvum and partially sequenced. The peptide sequences obtained were used to isolate a cDNA clone encoding the enzyme. The cloned gene (GenBank accession no. AY576053) is homologous to the other known eukaryotic lactone oxidases and also to some putative prokaryotic lactone oxidases. Analysis of the deduced protein sequence of GLO indicated the presence of a typical secretion signal sequence at the N-terminus of the enzyme. No other targeting/anchoring signals were found, suggesting that GLO is the first known lactone oxidase that is secreted rather than targeted to the membranes of the endoplasmic reticulum or mitochondria. Experimental evidence supports this analysis, as near-complete secretion of GLO was observed in two different yeast expression systems. The highest expression levels of GLO were obtained using Pichia pastoris as an expression host. Recombinant GLO was characterised and the suitability of purified GLO for the production of D-EA was studied. Immobilised GLO was found to be rapidly inactivated during D-EA production. The feasibility of in vivo glucose - D-EA conversion using a P. pastoris strain co-expressing the genes of GLO and glucose oxidase (GOD, E.C. 1.1.3.4) of A. niger was demonstrated. The glucose - 5-KGA bioconversion study followed a similar strategy to that used in the D-EA production research.
The rationale was based on the use of a cloned gene encoding a membrane-bound pyrroloquinoline quinone (PQQ)-dependent gluconate 5-dehydrogenase (GA 5-DH). GA 5-DH was purified to homogeneity from the only source of this enzyme known in the literature, Gluconobacter suboxydans, and partially sequenced. Using the amino acid sequence information, the GA 5-DH gene was cloned from a genomic library of G. suboxydans. The cloned gene was sequenced (GenBank accession no. AJ577472) and found to be an operon of two adjacent genes encoding the two subunits of GA 5-DH. GA 5-DH turned out to be a rather close homologue of a sorbitol dehydrogenase from another G. suboxydans strain, and it was also found to have significant polyol dehydrogenase activity. The G. suboxydans GA 5-DH gene was poorly expressed in E. coli. Under optimised conditions, maximum expression levels of GA 5-DH did not exceed the levels found in wild-type G. suboxydans, and attempts to increase expression levels resulted in repression of growth and extensive cell lysis. However, the expression levels were sufficient to demonstrate the possibility of bioconversion of glucose and gluconate into 5-KGA using recombinant strains of E. coli. An uncharacterised homologue of GA 5-DH was identified in Xanthomonas campestris using in silico screening. This enzyme, encoded by chromosomal locus NP_636946, was found by a sequencing project of X. campestris and annotated as a hypothetical glucose dehydrogenase. The gene encoding this uncharacterised enzyme was cloned, expressed in E. coli and found to encode a gluconate/polyol dehydrogenase without glucose dehydrogenase activity. Moreover, the X. campestris GA 5-DH gene was expressed in E. coli at nearly 30 times higher levels than the G. suboxydans GA 5-DH gene. The good expressibility of the X. campestris GA 5-DH gene makes it a valuable tool not only for 5-KGA production in the tartaric acid (TA) bioprocess, but possibly also for other bioprocesses (e.g. oxidation of sorbitol into L-sorbose). In addition to glucose - 5-KGA bioconversion, a preliminary study of the feasibility of enzymatic conversion of 5-KGA into TA was carried out. Here, the efficacy of the first step of a prospective two-step conversion route, comprising a transketolase and a dehydrogenase, was confirmed: transketolase was found to convert 5-KGA into TA semialdehyde. Succinic dehydrogenase was suggested as a candidate for the second step, but this was not tested. The analysis of the two subprojects indicated that bioconversion of glucose to TA using X. campestris GA 5-DH should be prioritised, and that future process development efforts should focus on developing more efficient GA 5-DH production strains by screening for a more suitable production host and by protein engineering.

Relevance:

20.00%

Publisher:

Abstract:

The diversity of functions of eukaryotic cells is preserved by enclosing different enzymatic activities in membrane-bound organelles. Separation of exocytic proteins from those that remain in the endoplasmic reticulum (ER) lays the foundation for correct compartmentalization. The secretory pathway, starting from the ER membrane, operates with the aid of cytosolic coat proteins (COPs). In anterograde transport, polymerization of the COPII coat on the ER membrane is essential for the ER exit of proteins. Polymerization of the COPI coatomer on the cis-Golgi membrane functions in the retrieval of proteins from the Golgi for repeated use in the ER. The COPII coat is formed by essential proteins: Sec13/31p and Sec23/24p have been thought to be indispensable for the ER exit of all exocytic proteins. However, we found that functional Sec13p was not required for the ER exit of the endogenous glycoprotein Hsp150 in the yeast Saccharomyces cerevisiae. Hsp150 turned out to be an ATPase. ATP hydrolysis by a Walker motif located in the C-terminal domain of Hsp150 was an active mediator of the Sec13p- and Sec24p-independent ER exit. Our results suggest that in yeast cells a fast-track transport route operates in parallel with the previously described cisternal maturation route of the Golgi. The fast track is used by Hsp150 with the aid of its C-terminal ATPase activity at ER exit. Hsp150 is matured with a half-time of less than one minute. The cisternal maturation track is several-fold slower and is used by the other exocytic proteins studied so far. An operative COPI coat is needed for ER exit by a subset of proteins, but not by Hsp150. We located a second active determinant in the N-terminal portion of the Hsp150 polypeptide that guided also heterologous fusion proteins out of the ER in COPII-coated vesicles under non-functional COPI conditions for several hours. Our data indicate that ER exit is a selective, receptor-mediated event, not a bulk flow.
Furthermore, they suggest the existence of another retrieval pathway for essential reusable components, besides the COPI-operated retrotransport route. Additional experiments suggest that activation of the COPI primer, ADP-ribosylation factor (ARF), is essential also for Hsp150 transport. Moreover, a subset of proteins appeared to require activated ARF directly in anterograde transport to complete ER exit. Our results indicate that coat structures and transport routes are more variable than has been imagined.

Relevance:

20.00%

Publisher:

Abstract:

The ever-increasing demand for faster computers in various areas, ranging from consumer electronics to computational science, is pushing the semiconductor industry towards its limits in decreasing the sizes of electronic devices based on conventional materials. According to the famous law formulated by Gordon E. Moore, a co-founder of the world's largest semiconductor company, Intel, transistor sizes should decrease to the atomic level during the next few decades to maintain the present rate of increase in computational power. As leakage currents become a problem for traditional silicon-based devices already at nanometer-scale sizes, an approach other than further miniaturization is needed to meet the needs of future electronics. A relatively recently proposed possibility for further progress in electronics is to replace silicon with carbon, another element from the same group of the periodic table. Carbon is an especially interesting material for nanometer-sized devices because it naturally forms different nanostructures, some of which have unique properties. The most widely suggested allotrope of carbon for electronics is a tubular molecule with an atomic structure resembling that of graphite. These carbon nanotubes are popular both among scientists and in industry because of a long list of exciting properties: for example, carbon nanotubes are electronically unique and have an uncommonly high strength-to-mass ratio, which has resulted in a multitude of proposed applications in several fields. In fact, due to some remaining difficulties regarding large-scale production of nanotube-based electronic devices, fields other than electronics have been faster to develop profitable nanotube applications. In this thesis, the possibility of using low-energy ion irradiation to ease the route towards nanotube applications is studied through atomistic simulations at different levels of theory.
Specifically, molecular dynamics simulations with analytical interaction models are used to follow the irradiation process in order to introduce different impurity atoms into these structures and thereby gain control over their electronic character. Ion irradiation is shown to be a very efficient method for replacing carbon atoms with boron or nitrogen impurities in single-walled nanotubes. Furthermore, potassium irradiation of multi-walled and fullerene-filled nanotubes is demonstrated to result in small potassium clusters in the hollow parts of these structures. Molecular dynamics simulations are further used to give an example of using irradiation to improve contacts between a nanotube and a silicon substrate. Methods based on density-functional theory are used to gain insight into the defect structures inevitably created during the irradiation. Finally, a new simulation code utilizing the kinetic Monte Carlo method is introduced to follow the time evolution of irradiation-induced defects in carbon nanotubes on macroscopic time scales. Overall, the molecular dynamics simulations presented in this thesis show that ion irradiation is a promising method for tailoring nanotube properties in a controlled manner. The calculations made with density-functional-theory based methods indicate that it is energetically favorable for even relatively large defects to transform so as to keep the atomic configuration as close to that of the pristine nanotube as possible. The kinetic Monte Carlo studies reveal that elevated temperatures during processing enhance the self-healing of nanotubes significantly, ensuring low defect concentrations after the treatment with energetic ions. Thereby, nanotubes can retain their desired properties also after irradiation.
Throughout the thesis, atomistic simulations combining different levels of theory are demonstrated to be an important tool for determining the optimal conditions for irradiation experiments, because the atomic-scale processes at short time scales are extremely difficult to study by any other means.
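As a rough illustration of the kinetic Monte Carlo idea used for following defect evolution, the residence-time sketch below anneals a population of identical defects with an Arrhenius rate. All parameters are invented, and a real simulation would track many defect and event types rather than one:

```python
import math
import random

KB = 8.617e-5   # Boltzmann constant in eV/K

def kmc_anneal(n_defects, ea_ev, temp_k, t_max, nu=1e13, seed=1):
    """Residence-time (Gillespie-style) KMC: identical defects heal with an
    Arrhenius rate; each step removes one defect and advances the clock by
    an exponentially distributed waiting time."""
    rng = random.Random(seed)
    rate = nu * math.exp(-ea_ev / (KB * temp_k))    # per-defect rate, 1/s
    t, n = 0.0, n_defects
    while n > 0:
        dt = -math.log(1.0 - rng.random()) / (n * rate)   # waiting time
        if t + dt > t_max:
            break
        t, n = t + dt, n - 1
    return n    # defects remaining after t_max seconds

# Invented parameters: 1 eV healing barrier, 1 s anneal, 100 initial defects.
print(kmc_anneal(100, 1.0, 900, 1.0), kmc_anneal(100, 1.0, 300, 1.0))
# e.g. "0 100": full healing at 900 K, essentially none at room temperature
```

The exponential sensitivity of the rate to temperature is what makes elevated processing temperatures so effective for self-healing, consistent with the conclusion above.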

Relevance:

20.00%

Publisher:

Abstract:

In past decades, agricultural work first became heavily mechanised, and automation has since followed. Today, increasing machine size no longer yields significant productivity gains; instead, work must be made more efficient through better use of existing resources. This work examines the self-propelled forage harvester chain in grass silage harvesting. The intensity of silage harvesting and the large number of machine units form a demanding combination for work management. The aim of the work was to determine the requirements for an information management system to be developed to support agricultural contracting. A total of 12 contractors or cooperating farmers were interviewed for the study. Based on the study, contractors have a need for information systems; naturally, the scope and organisation of the contracting affect this. The study identified the following key requirements for information management:

• data collection on the work performed that is as broad, detailed and automatic as possible
• map-based operation, guidance of drivers to sites
• a customer register, electronic ordering of work
• quotation templates, price calculators
• reliability, data persistence
• applicability to many kinds of work
• compatibility with other systems

Based on the study, the system to be developed should thus include the following components: an easy-to-use planning/customer-register tool; functions for machine monitoring, guidance and management; data collection during work; and processing functions for the collected data. Not all users need all functions, however, so the contractor must be able to select the parts they need and possibly add functions later. Contractors operating within tight financial and time constraints are demanding customers whose technology must be functional and reliable. On the other hand, even experienced operators make human errors, so a good information system makes the work easier and more efficient.

Relevance:

20.00%

Publisher:

Abstract:

The paper conceptualizes Constraint Grammar parsing in a new way, as a process with two important representations. One representation contains the local ambiguity, and the other summarizes the properties of the local ambiguity classes. Both representations are manipulated with pure finite-state methods, but their interconnection is an ad hoc application of rational power series. The new interpretation of the parsing system has several practical advantages, including an inward-deterministic way to compute, represent and recompute all potential applications of the rules in a sentence.
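To make the local-ambiguity representation concrete, the toy below applies a single Constraint Grammar style REMOVE rule to tokens carrying sets of candidate readings. The tags and the rule are invented, and the paper's actual formulation is finite-state over ambiguity classes, not this naive loop:

```python
def apply_remove_rule(sentence, target, condition):
    """REMOVE `target` from a token's readings when the previous token's
    readings satisfy `condition`; never remove the last remaining reading,
    as in Constraint Grammar."""
    out, prev = [], None
    for readings in sentence:
        r = set(readings)
        if prev is not None and condition(prev) and target in r and len(r) > 1:
            r.discard(target)
        out.append(r)
        prev = r
    return out

# "the run": a determiner followed by a noun/verb-ambiguous token;
# the rule removes the verb reading after an unambiguous determiner.
sentence = [{"DET"}, {"N", "V"}]
result = apply_remove_rule(sentence, "V", lambda prev: prev == {"DET"})
print(result)   # [{'DET'}, {'N'}]
```

Each token's reading set is one point of local ambiguity; grouping tokens by identical sets yields the ambiguity classes whose properties the second representation summarizes.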