882 results for Branch and bound algorithms


Relevance:

100.00%

Publisher:

Abstract:

Handedness refers to a consistent asymmetry in skill or preferential use between the hands and is related to lateralization within the brain of other functions such as language. Previous twin studies of handedness have yielded inconsistent results, largely owing to a general lack of statistical power to detect significant effects. Here we present analyses from a large international collaborative study of handedness (assessed by writing/drawing or self-report) in Australian and Dutch twins and their siblings (54,270 individuals from 25,732 families). Maximum likelihood analyses incorporating the effects of known covariates (sex, year of birth and birth weight) revealed no evidence of hormonal transfer, mirror imaging or twin-specific effects. There were also no differences in prevalence between zygosity groups or between twins and their singleton siblings. Consistent with previous meta-analyses, additive genetic effects accounted for about a quarter (23.64%) of the variance (95% CI 20.17, 27.09%), with the remainder accounted for by non-shared environmental influences. The implications of these findings for handedness, both as a primary phenotype and as a covariate in linkage and association analyses, are discussed.
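
As a back-of-the-envelope illustration of how a twin-based variance decomposition of this kind works (the study itself used full maximum-likelihood modelling with covariates; the twin correlations below are purely hypothetical), Falconer-style formulae estimate the additive genetic, shared-environment and non-shared-environment components directly from MZ and DZ correlations:

```python
# Hypothetical illustration only; not the authors' maximum-likelihood model.
r_mz = 0.24   # hypothetical MZ twin correlation for handedness
r_dz = 0.12   # hypothetical DZ twin correlation

a2 = 2 * (r_mz - r_dz)   # additive genetic variance (A)
c2 = 2 * r_dz - r_mz     # shared environment (C); ~0 here, matching an AE model
e2 = 1 - r_mz            # non-shared environment (E)
print(f"A = {a2:.2f}, C = {c2:.2f}, E = {e2:.2f}")
```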

Relevance:

100.00%

Publisher:

Abstract:

Enzymes belonging to the M1 family play important cellular roles, and the key amino acids (aa) in the catalytic domain are conserved. However, the C-terminal domain aa are highly variable and show distinct differences in organization. To address the functional role of the C-terminal domain, progressive deletions were generated in Tricorn interacting factor F2 from Thermoplasma acidophilum (F2) and Peptidase N from Escherichia coli (PepN). Catalytic activity was partially reduced in PepN lacking 4 C-terminal residues (PepNΔC4), whereas it was greatly reduced in F2 lacking 10 C-terminal residues (F2ΔC10) or PepN lacking 11 C-terminal residues (PepNΔC11). Notably, expression of PepNΔC4, but not PepNΔC11, in E. coli ΔpepN increased its ability to resist nutritional and high-temperature stress, demonstrating physiological significance. Purified C-terminally deleted proteins showed greater sensitivity to trypsin and bound more strongly to 8-anilino-1-naphthalenesulfonic acid (ANS), revealing greater numbers of surface-exposed hydrophobic aa. Also, F2 or PepN containing large C-terminal deletions, but not smaller ones, was present in high amounts in the insoluble fraction of cell extracts, probably due to reduced protein solubility. Modeling studies using the crystal structure of E. coli PepN demonstrated an increase in hydrophobic surface area and a change in the accessibility of several aa from buried to exposed upon deletion of C-terminal aa. Together, these studies reveal that non-conserved distal C-terminal aa repress the surface exposure of apolar aa and enhance protein solubility and catalytic activity in two soluble and distinct members of the M1 family.

Relevance:

100.00%

Publisher:

Abstract:

Detect and Avoid (DAA) technology is widely acknowledged as a critical enabler for unsegregated Remotely Piloted Aircraft (RPA) operations, particularly Beyond Visual Line of Sight (BVLOS). Image-based DAA in the visible spectrum is a promising technological option for addressing the challenges DAA presents. Two impediments to progress for this approach are the scarcity of video footage available to train and test algorithms, and the lack of testing regimes and specifications that facilitate repeatable, statistically valid performance assessment. This paper makes three key contributions towards addressing these impediments. First, we detail our progress towards the creation of a large hybrid database of collision and near-collision encounters. Second, we explore the suitability of techniques employed by the biometric research community (Speaker Verification and Language Identification) for DAA performance optimisation and assessment. These techniques include Detection Error Trade-off (DET) curves, Equal Error Rates (EER), and the Detection Cost Function (DCF). Finally, the hybrid database and the speech-based techniques are combined and employed in the assessment of a contemporary image-based DAA system, which includes stabilisation, morphological filtering and a Hidden Markov Model (HMM) temporal filter.
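
To illustrate the speaker-verification metrics referred to above, the sketch below computes miss and false-alarm rates over a threshold sweep, the EER, and a simple detection cost function from hypothetical detection scores; the score distributions, cost weights and target prior are assumptions for illustration, not values from the paper.

```python
import numpy as np

# Hypothetical detection scores: higher = "aircraft present".
rng = np.random.default_rng(0)
target_scores = rng.normal(2.0, 1.0, 500)      # true collision-course targets
nontarget_scores = rng.normal(0.0, 1.0, 5000)  # clutter / no target

thresholds = np.sort(np.concatenate([target_scores, nontarget_scores]))
p_miss = np.array([(target_scores < t).mean() for t in thresholds])
p_fa = np.array([(nontarget_scores >= t).mean() for t in thresholds])

# Equal Error Rate: operating point where miss and false-alarm rates cross.
eer_idx = np.argmin(np.abs(p_miss - p_fa))
eer = (p_miss[eer_idx] + p_fa[eer_idx]) / 2

# Simple detection cost function with assumed weights and target prior.
c_miss, c_fa, p_target = 1.0, 1.0, 0.01
dcf = np.min(c_miss * p_miss * p_target + c_fa * p_fa * (1 - p_target))

print(f"EER = {eer:.3f}, min DCF = {dcf:.4f}")
```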

Relevance:

100.00%

Publisher:

Abstract:

Enzymes offer many advantages in industrial processes, such as high specificity, mild treatment conditions and low energy requirements. Industry has therefore exploited them in many sectors, including food processing. Enzymes can modify food properties by acting on small molecules or on polymers such as carbohydrates or proteins. Crosslinking enzymes such as tyrosinases and sulfhydryl oxidases catalyse the formation of novel covalent bonds between specific residues in proteins and/or peptides, thus forming or modifying the protein network of food. In this study, novel secreted fungal proteins with sequence features typical of tyrosinases and sulfhydryl oxidases were identified through a genome mining study. Representatives of both of these enzyme families were selected for heterologous production in the filamentous fungus Trichoderma reesei and for biochemical characterisation. Firstly, a novel family of putative tyrosinases carrying a shorter sequence than the previously characterised tyrosinases was discovered. These proteins lacked the entire linker and C-terminal domain that possibly play a role in cofactor incorporation, folding or protein activity. One of these proteins, AoCO4 from Aspergillus oryzae, was produced in T. reesei at a level of about 1.5 g/l. The enzyme AoCO4 was correctly folded and bound the copper cofactors with a type-3 copper centre. However, the enzyme had only a low level of activity with the phenolic substrates tested; the highest activity was obtained with 4-tert-butylcatechol. Since tyrosine was not a substrate for AoCO4, the enzyme was classified as a catechol oxidase. Secondly, the genome analysis for secreted proteins with sequence features typical of flavin-dependent sulfhydryl oxidases pinpointed two previously uncharacterised proteins, AoSOX1 and AoSOX2, from A. oryzae. These two novel sulfhydryl oxidases were produced in T. reesei at levels of 70 and 180 mg/l, respectively, in shake flask cultivations. AoSOX1 and AoSOX2 were FAD-dependent enzymes with a dimeric tertiary structure; both showed activity on small sulfhydryl compounds such as glutathione and dithiothreitol, and both were drastically inhibited by zinc sulphate. AoSOX2 showed good stability to thermal and chemical denaturation, being superior to AoSOX1 in this respect. Thirdly, the suitability of AoSOX1 as a possible baking improver was evaluated. The effect of AoSOX1, alone and in combination with the widely used improver ascorbic acid, was tested on yeasted wheat dough, both fresh and frozen, and on fresh water-flour dough. In all cases, AoSOX1 had no effect on the fermentation properties of fresh yeasted dough. AoSOX1 negatively affected the fermentation properties of frozen doughs and accelerated the damaging effects of frozen storage, giving a softer dough with poorer gas retention than the control. In combination with ascorbic acid, AoSOX1 gave harder doughs. In accordance with this, rheological studies on yeast-free dough showed that the presence of AoSOX1 alone resulted in a weaker and more extensible dough, whereas a dough with the opposite properties was obtained if ascorbic acid was also used. Doughs containing ascorbic acid and increasing amounts of AoSOX1 were harder in a dose-dependent manner. Sulfhydryl oxidase AoSOX1 thus had an enhancing effect on the dough-hardening mechanism of ascorbic acid. This was ascribed mainly to the production of hydrogen peroxide in the SOX reaction, which converts ascorbic acid to the actual improver, dehydroascorbic acid. In addition, AoSOX1 could possibly oxidise the free glutathione in the dough and thus prevent the loss of dough strength caused by the spontaneous reduction of the disulfide bonds constituting the dough protein network. Sulfhydryl oxidase AoSOX1 is therefore able to enhance the action of ascorbic acid in wheat dough and could potentially be applied in wheat dough baking.

Relevance:

100.00%

Publisher:

Abstract:

The design of a dual-DSP microprocessor system and its application to parallel FFT and two-dimensional convolution are explained. The system is based on a master-slave configuration: two ADSP-2101s are configured as slave processors and a PC/AT serves as the master, acting as a control processor that transfers program code and data to the DSPs. The system architecture and the algorithms for the two applications, viz. FFT and two-dimensional convolution, are discussed.
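
A rough software analogue of the master-slave partitioning described here, assuming a row-column decomposition of the 2-D FFT (the abstract does not give the actual ADSP-2101 implementation): a "master" splits the row FFTs between two workers standing in for the slave DSPs and then performs the column pass itself.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def row_ffts(block):
    """Slave task: 1-D FFT along the rows of its block."""
    return np.fft.fft(block, axis=1)

def parallel_fft2(x):
    """Row-column 2-D FFT with the row pass split between two 'slaves'."""
    top, bottom = np.array_split(x, 2, axis=0)
    with ThreadPoolExecutor(max_workers=2) as pool:   # stand-ins for the two DSPs
        rows = np.concatenate(list(pool.map(row_ffts, [top, bottom])), axis=0)
    return np.fft.fft(rows, axis=0)                   # master does the column pass

x = np.random.rand(256, 256)
assert np.allclose(parallel_fft2(x), np.fft.fft2(x))  # agrees with the direct 2-D FFT
```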

Relevance:

100.00%

Publisher:

Abstract:

Increasing network lifetime is important in wireless sensor/ad-hoc networks. In this paper, we are concerned with algorithms to increase network lifetime and the amount of data delivered during the lifetime by deploying multiple mobile base stations in the sensor network field. Specifically, we allow multiple mobile base stations to be deployed along the periphery of the sensor network field and develop algorithms to dynamically choose the locations of these base stations so as to improve network lifetime. We propose energy-efficient, low-complexity algorithms to determine the locations of the base stations: i) the Top-K-max algorithm, ii) the maximize-the-minimum-residual-energy (Max-Min-RE) algorithm, and iii) the minimize-the-residual-energy-difference (MinDiff-RE) algorithm. We show that the proposed base station placement algorithms provide increased network lifetime and amount of data delivered compared to the single-base-station and multiple-static-base-stations scenarios, and come close to the results obtained by solving an integer linear program (ILP) to determine the locations of the mobile base stations. We also investigate the lifetime gain when an energy-aware routing protocol is employed along with multiple base stations.
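
A much-simplified sketch of the Max-Min-RE idea, under an assumed energy model (each sensor transmits to its nearest base station at a cost proportional to distance squared; the paper's actual cost model, routing and candidate set are not reproduced here): evaluate candidate pairs of base-station locations on the field periphery and keep the pair that maximizes the minimum residual energy after one data-gathering round.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
sensors = rng.uniform(0, 100, size=(50, 2))          # sensor positions in a 100 x 100 field
residual = np.full(len(sensors), 1.0)                # current residual energies

# Candidate base-station locations on the field periphery.
t = np.linspace(0, 100, 9)
periphery = np.array([(x, y) for x in t for y in (0, 100)] +
                     [(x, y) for y in t for x in (0, 100)])

def min_residual_after_round(bs_locs, k_tx=1e-4):
    """Assumed model: each sensor sends to its nearest BS, cost ~ distance^2."""
    d2 = ((sensors[:, None, :] - bs_locs[None, :, :]) ** 2).sum(axis=2)
    spent = k_tx * d2.min(axis=1)                     # nearest-BS transmission cost
    return (residual - spent).min()

best = max(itertools.combinations(range(len(periphery)), 2),
           key=lambda idx: min_residual_after_round(periphery[list(idx)]))
print("Chosen base-station locations:\n", periphery[list(best)])
```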

Relevance:

100.00%

Publisher:

Abstract:

Let A and B be two objects. We define measures to characterize the penetration of A and B when A ∩ B ≠ ∅. We then present properties of the measures and efficient algorithms to compute them for planar and polyhedral objects. We explore applications of the measures and present some experimental results.
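
As a concrete, heavily simplified illustration of what a penetration measure can look like, the sketch below computes two candidate measures for a pair of intersecting axis-aligned rectangles: the overlap area and a minimum-translation depth. The measures defined in the paper for general planar and polyhedral objects are more involved.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    xmin: float
    ymin: float
    xmax: float
    ymax: float

def penetration_measures(a: Rect, b: Rect):
    """Return (overlap area, minimum translation depth), or (0, 0) if disjoint."""
    dx = min(a.xmax, b.xmax) - max(a.xmin, b.xmin)
    dy = min(a.ymax, b.ymax) - max(a.ymin, b.ymin)
    if dx <= 0 or dy <= 0:            # A and B do not intersect
        return 0.0, 0.0
    return dx * dy, min(dx, dy)       # area-based measure, depth-based measure

print(penetration_measures(Rect(0, 0, 2, 2), Rect(1, 1, 4, 3)))  # (1.0, 1.0)
```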

Relevance:

100.00%

Publisher:

Abstract:

The 2-methylcitric acid (2-MCA) cycle is one of the well-studied pathways for the utilization of propionate as a source of carbon and energy in bacteria such as Salmonella typhimurium and Escherichia coli. 2-Methylcitrate synthase (2-MCS) catalyzes the conversion of oxaloacetate and propionyl-CoA to 2-methylcitrate and CoA in the second step of the 2-MCA cycle. Here, we report the X-ray crystal structure of S. typhimurium 2-MCS (StPrpC) at 2.4 Å resolution and its functional characterization. StPrpC was found to utilize propionyl-CoA more efficiently than acetyl-CoA or butyryl-CoA. The polypeptide fold and the catalytic residues of StPrpC are conserved in citrate synthases (CSs), suggesting similarities in their functional mechanisms. In the triclinic P1 cell, StPrpC molecules were organized as decamers composed of five identical dimer units. In solution, StPrpC was dimeric at low concentrations and converted to larger oligomers at higher concentrations. CSs are usually dimeric proteins; in Gram-negative bacteria, a hexameric form, believed to be important for regulation of activity by NADH, is also observed. Structural comparisons with the hexameric E. coli CS suggested that the key residues involved in NADH binding are not conserved in StPrpC. Structural comparison with the ligand-free and ligand-bound states of CSs showed that StPrpC is in a nearly closed conformation despite the absence of bound ligands. Tyr197 and Leu324 of StPrpC are structurally equivalent to the ligand-binding residues His and Val, respectively, of CSs. These substitutions might determine the acyl-CoA specificities of these enzymes.

Relevance:

100.00%

Publisher:

Abstract:

Measured health signals incorporate significant details about any malfunction in a gas turbine. Attenuating noise and removing outliers from these health signals while preserving important features is an important problem in gas turbine diagnostics. The measured health signals are a time series of sensor measurements such as the low rotor speed, high rotor speed, fuel flow, and exhaust gas temperature of a gas turbine. In this article, a comparative study is carried out by varying the window length of acausal and unsymmetrical weighted recursive median filters, and numerical results for error minimization are obtained. It is found that optimal filters exist, which can be used for engines where data are available slowly (three-point filter) or rapidly (seven-point filter). These smoothing filters are proposed as preprocessors of measurement delta signals before subjecting them to fault detection and isolation algorithms.
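
A minimal sketch of a recursive median filter of the kind compared in the article (the exact acausal/unsymmetrical weighting used by the authors is not reproduced here): previously filtered samples re-enter the window, which makes the filter aggressive at removing outlier spikes while preserving step-like features.

```python
import numpy as np

def recursive_median(x, past=1, future=1):
    """Recursive median filter: the window mixes already-filtered past samples
    with raw current/future samples.  past=1, future=1 gives a 3-point filter;
    past=3, future=3 gives a 7-point filter."""
    x = np.asarray(x, dtype=float)
    y = x.copy()
    for n in range(len(x)):
        lo, hi = max(0, n - past), min(len(x), n + future + 1)
        window = np.concatenate([y[lo:n], x[n:hi]])   # filtered past + raw present/future
        y[n] = np.median(window)
    return y

signal = np.array([0, 0, 0, 8, 0, 0, 1, 1, 1, 1.0])   # step signal with an outlier spike
print(recursive_median(signal))                        # the spike at index 3 is suppressed
```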

Relevance:

100.00%

Publisher:

Abstract:

The questions that one should answer in engineering computations, whether deterministic, probabilistic/randomized, or heuristic, are (i) how good the computed results/outputs are and (ii) what the cost is, in terms of the amount of computation and the amount of storage used to obtain the outputs. The absolutely error-free quantities and the completely errorless computations occurring in a natural process can never be captured by any means at our disposal. While the computations, including the input real quantities, in nature/natural processes are exact, the computations that we perform using a digital computer, or that are carried out in embedded form, are never exact. The input data for such computations are also never exact, because any measuring instrument has an inherent error of a fixed order associated with it, and this error, as a matter of hypothesis rather than assumption, is not less than 0.005 per cent. By error we mean relative error bounds here. The fact that the exact error is never known, under any circumstances and in any context, implies that the term error is nothing but error bounds. Further, in engineering computations it is the relative error, or equivalently the relative error bounds (and not the absolute error), that is supremely important in providing information about the quality of the results/outputs. Another important fact is that inconsistency and/or near-inconsistency in nature, i.e., in problems arising directly from nature, is completely nonexistent, whereas in our modelling of natural problems we may introduce inconsistency or near-inconsistency due to human error, due to the inherent non-removable error associated with any measuring device, or due to assumptions introduced to make the problem solvable, or more easily solvable, in practice. Thus, if we discover any inconsistency, or possibly any near-inconsistency, in a mathematical model, it is certainly due to one or more of these three factors. We do, however, go ahead and solve such inconsistent/near-inconsistent problems, and we obtain results that can be useful in real-world situations. The talk considers several deterministic, probabilistic, and heuristic algorithms in numerical optimisation, in other numerical and statistical computations, and in PAC (probably approximately correct) learning models. It highlights the quality of the results/outputs by specifying relative error bounds along with the associated confidence level, and the cost, viz. the amounts of computation and storage, through complexity. It points out the limitations of error-free computations (wherever possible, i.e., where the number of arithmetic operations is finite and known a priori) as well as of the usage of interval arithmetic. Further, the interdependence among the error, the confidence, and the cost is discussed.
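
As a small numerical illustration of the relative error-bound viewpoint sketched above (only the 0.005 per cent figure is taken from the text; the computed quantity is an assumed example): first-order relative error bounds add across products and quotients.

```python
# Assumed example: a quantity q = a * b / c computed from measured inputs,
# each carrying a relative error bound of 0.005 per cent (5e-5), as in the text.
r = 5e-5                      # relative error bound per measured input
inputs = 3                    # a, b and c
q_bound = inputs * r          # first-order bound for a product/quotient of the inputs

print(f"relative error bound on q ~ {q_bound:.1e}")   # ~1.5e-04, i.e. 0.015 per cent
```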

Relevance:

100.00%

Publisher:

Abstract:

Obtaining correctly folded proteins from inclusion bodies of recombinant proteins expressed in bacterial hosts requires solubilization with denaturants followed by a refolding step; aggregation competes with the second step. Refolding of eight different proteins was carried out by precipitation with smart polymers. These proteins have different molecular weights and different numbers of disulfide bridges, and some of them are known to be highly prone to aggregation. A high-throughput refolding screen based upon a fluorescence emission maximum around 340 nm (for correctly folded proteins) was developed to identify the suitable smart polymer. The proteins could be dissociated and recovered after the refolding step. The refolding could be scaled up, and high refolding yields in the range of 8 mg/L (for CD4D12, the first two domains of human CD4) to 58 mg/L (for malETrx, thioredoxin fused with the signal peptide of maltose binding protein) were obtained. Dynamic light scattering (DLS) showed that the polymer, if chosen correctly, acted as a pseudochaperonin and bound to the proteins. It also showed that the time for maximum binding was about 50 min, which coincided with the incubation time (with the polymer) required before precipitation for maximum recovery of folded proteins. The refolded proteins were characterized by fluorescence emission spectra, circular dichroism (CD) spectroscopy, melting temperature (Tm), and surface hydrophobicity measured by ANS (8-anilino-1-naphthalenesulfonic acid) fluorescence. A biological activity assay for thioredoxin and a fluorescence-based assay for maltose binding protein (MBP) were also carried out to confirm correct refolding.

Relevance:

100.00%

Publisher:

Abstract:

The Lovász θ function of a graph is a fundamental tool in combinatorial optimization and approximation algorithms. Computing θ involves solving an SDP and is extremely expensive even for moderately sized graphs. In this paper we establish that the Lovász θ function is equivalent to a kernel learning problem related to the one-class SVM. This interesting connection opens up many opportunities for bridging graph-theoretic algorithms and machine learning. We show that there exist graphs, which we call SVM-θ graphs, on which the Lovász θ function can be approximated well by a one-class SVM. This leads to a novel use of SVM techniques to solve algorithmic problems in large graphs, e.g. identifying a planted clique of size Θ(√n) in a random graph G(n, 1/2). A classic approach for this problem involves computing the θ function; however, it is not scalable due to the SDP computation. We show that a random graph with a planted clique is an example of an SVM-θ graph, and as a consequence an SVM-based approach easily identifies the clique in large graphs and is competitive with the state of the art. Further, we introduce the notion of a "common orthogonal labelling", which extends the notion of an orthogonal labelling of a single graph (used in defining the θ function) to multiple graphs. The problem of finding the optimal common orthogonal labelling is cast as a Multiple Kernel Learning problem and is used to identify a large common dense region in multiple graphs. The proposed algorithm achieves an order-of-magnitude improvement in scalability compared to the state of the art.
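
A rough sketch of the SVM-style computation alluded to above, under stated assumptions: the one-class-SVM-style value omega(K) = max_{alpha >= 0} 2*sum(alpha) - alpha^T K alpha and the kernel choice K = I + A/rho (with rho at least the magnitude of the most negative adjacency eigenvalue, so that K is positive semidefinite) are taken from this line of work rather than spelled out in the abstract, and the solver here is a plain projected-gradient loop rather than the authors' method.

```python
import numpy as np

def svm_omega(K, steps=3000):
    """omega(K) = max_{alpha >= 0} 2*sum(alpha) - alpha^T K alpha,
    a one-class-SVM-style value, solved by projected gradient ascent."""
    n = K.shape[0]
    lr = 0.5 / np.linalg.eigvalsh(K)[-1]          # safe step size from the largest eigenvalue
    alpha = np.ones(n)
    for _ in range(steps):
        alpha += lr * (2.0 - 2.0 * K @ alpha)     # gradient step
        np.maximum(alpha, 0.0, out=alpha)         # project onto alpha >= 0
    return 2.0 * alpha.sum() - alpha @ K @ alpha

rng = np.random.default_rng(0)
n = 200
A = np.triu(rng.random((n, n)) < 0.5, 1).astype(float)   # G(n, 1/2) adjacency matrix
A += A.T
rho = abs(np.linalg.eigvalsh(A)[0])               # |most negative eigenvalue| of A
K = np.eye(n) + A / rho                           # assumed PSD kernel (labelling-style)

# For what the paper calls SVM-theta graphs, a value of this kind is claimed to
# approximate the Lovasz theta function without solving an SDP.
print("SVM-theta style estimate:", svm_omega(K))
```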

Relevance:

100.00%

Publisher:

Abstract:

Content Distribution Networks (CDNs) are widely used to distribute data to large numbers of users. Traditionally, content is replicated among a number of surrogate servers, leading to high operational costs. In this context, Peer-to-Peer (P2P) CDNs have emerged as a viable alternative. An issue of concern in P2P networks is that of free riders, i.e., selfish peers who download files and leave without uploading anything in return. Free riding must be discouraged. In this paper, we propose a criterion, the Give-and-Take (G&T) criterion, that disallows free riders. Incorporating the G&T criterion in our model, we study a problem that arises naturally when a new peer enters the system: viz., the problem of downloading a `universe' of segments, scattered among other peers, at low cost. We analyse this hard problem and characterize the optimal download cost under the G&T criterion. We propose an optimal algorithm, and provide a sub-optimal algorithm that is nearly optimal but runs much more quickly; this provides an attractive balance between running time and performance. Finally, we compare the performance of our algorithms with that of a few existing P2P downloading strategies in use. We also study, for various existing and proposed algorithms, the computation time needed to prescribe the initial segment and peer selection strategy for the newly arrived peer, and quantify the cost versus computation-time trade-offs.
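
A hypothetical sketch of a give-and-take style exchange rule and a greedy download of the segment `universe'; the precise G&T criterion, cost model and optimal/sub-optimal algorithms are defined in the paper and are not reproduced here.

```python
# Hypothetical illustration of a give-and-take style exchange rule;
# the paper's exact G&T criterion and cost model may differ.
def gt_exchange_allowed(my_segments, peer_segments):
    """Allow a download from a peer only if we can upload something in return,
    i.e. we hold at least one segment the peer is missing."""
    return bool(my_segments - peer_segments)

def greedy_universe_download(my_segments, peers, universe):
    """Greedily pick (peer, segment) pairs until the universe is collected,
    respecting the exchange rule above.  Returns the download schedule."""
    schedule = []
    missing = set(universe) - my_segments
    while missing:
        candidates = [(p, s) for p, segs in peers.items()
                      for s in (segs & missing)
                      if gt_exchange_allowed(my_segments, segs)]
        if not candidates:
            raise RuntimeError("universe not reachable under the exchange rule")
        peer, seg = candidates[0]            # a real scheme would pick by cost
        my_segments.add(seg)
        missing.discard(seg)
        schedule.append((peer, seg))
    return schedule

peers = {"p1": {1, 2}, "p2": {3, 4}, "p3": {2, 5}}
print(greedy_universe_download({0}, peers, universe=range(6)))
```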

Relevance:

100.00%

Publisher:

Abstract:

Here, we report the binding of a Leguminosae lectin, winged bean basic agglutinin (WBA I), to N-trifluoroacetylgalactosamine (NTFAGalN), methyl-alpha-N-trifluoroacetylgalactosamine (Me alpha NTFAGalN) and methyl-beta-N-trifluoroacetylgalactosamine (Me beta NTFAGalN) using 19F NMR spectroscopy. The absence of a chemical shift difference between the free and bound states for NTFAGalN and Me beta NTFAGalN, together with a 0.01-ppm chemical shift change for Me alpha NTFAGalN, demonstrates that Me alpha NTFAGalN has a sufficiently long residence time on the protein binding site compared with Me beta NTFAGalN and the free anomers of NTFAGalN. The sugar anomers were found to be in slow exchange with the binding site of the agglutinin. Consequently, we obtained their binding parameters to the protein using line-shape analyses. Analyses of the activation parameters for the interactions of these saccharides indicate that the binding of the alpha and beta anomers of NTFAGalN and of Me alpha NTFAGalN is controlled enthalpically, whereas that of Me beta NTFAGalN is controlled entropically. This points to the sterically constrained nature of the interaction of Me beta NTFAGalN with WBA I. These studies thus highlight a significant role of the conformation of monosaccharide ligands in their recognition by WBA I.

Relevance:

100.00%

Publisher:

Abstract:

We show that in studies of light quark- and gluon-initiated jet discrimination, it is important to include the information on softer reconstructed jets (associated jets) around a primary hard jet. This is particularly relevant when adopting a small radius parameter for reconstructing hadronic jets. The probability of having an associated jet as a function of the primary jet transverse momentum (pT) and radius, the minimum associated jet pT, and the association radius is computed up to next-to-double-logarithmic accuracy (NDLA), and the predictions are compared with results from the Herwig++, Pythia6 and Pythia8 Monte Carlos (MC). We demonstrate the improvement in quark-gluon discrimination obtained by using the associated jet rate variable with the help of a multivariate analysis. The associated jet rates are found to be only mildly sensitive to the choice of parton shower and hadronization algorithms, as well as to the effects of initial-state radiation and the underlying event. In addition, the number of kt subjets of an anti-kt jet is found to be an observable that leads to a rather uniform prediction across different MCs, broadly in agreement with the NDLA predictions, in contrast to the often-used number-of-charged-tracks observable.