882 results for Branch and bound algorithms
Abstract:
Motivated by privacy issues associated with dissemination of signed digital certificates, we define a new type of signature scheme called a ‘Universal Designated-Verifier Signature’ (UDVS). A UDVS scheme can function as a standard publicly-verifiable digital signature but has additional functionality which allows any holder of a signature (not necessarily the signer) to designate the signature to any desired designated-verifier (using the verifier’s public key). Given the designated-signature, the designated-verifier can verify that the message was signed by the signer, but is unable to convince anyone else of this fact. We propose an efficient deterministic UDVS scheme constructed using any bilinear group-pair. Our UDVS scheme functions as a standard Boneh-Lynn-Shacham (BLS) signature when no verifier-designation is performed, and is therefore compatible with the key-generation, signing and verifying algorithms of the BLS scheme. We prove that our UDVS scheme is secure in the sense of our unforgeability and privacy notions for UDVS schemes, under the Bilinear Diffie-Hellman (BDH) assumption for the underlying group-pair, in the random-oracle model. We also demonstrate a general constructive equivalence between a class of unforgeable and unconditionally-private UDVS schemes having unique signatures (which includes the deterministic UDVS schemes) and a class of ID-Based Encryption (IBE) schemes which contains the Boneh-Franklin IBE scheme but not the Cocks IBE scheme.
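As a rough sketch of how such a pairing-based designation can be realized (an illustration of the general idea with a symmetric pairing, not necessarily the paper's exact construction): let e: G × G → G_T be the bilinear map, g a generator, (x_S, y_S = g^{x_S}) the signer's key pair and (x_V, y_V = g^{x_V}) the designated verifier's key pair. Then

\[ \sigma = H(m)^{x_S}, \qquad e(\sigma, g) \stackrel{?}{=} e(H(m), y_S) \quad \text{(ordinary BLS signing and verification)} \]
\[ \hat{\sigma} = e(\sigma, y_V) \quad \text{(designation by any signature holder)}, \qquad \hat{\sigma} \stackrel{?}{=} e(H(m), y_S)^{x_V} \quad \text{(check by the designated verifier)} \]

Because the designated verifier can compute e(H(m), y_S)^{x_V} on its own without ever seeing σ, the designated signature carries no weight for third parties, which is the privacy property described above.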
Abstract:
Mobile robots and animals alike must effectively navigate their environments in order to achieve their goals. For animals, goal-directed navigation facilitates finding food, seeking shelter or migrating; similarly, robots perform goal-directed navigation to find a charging station, get out of the rain or guide a person to a destination. This similarity in tasks extends to the environment as well; increasingly, mobile robots are operating in the same underwater, ground and aerial environments that animals do. Yet despite these similarities, goal-directed navigation research in robotics and biology has proceeded largely in parallel, linked only by a small amount of interdisciplinary research spanning both areas. Most state-of-the-art robotic navigation systems employ a range of sensors, world representations and navigation algorithms that seem far removed from what we know of how animals navigate; their navigation systems are shaped by key principles of navigation in ‘real-world’ environments, including dealing with uncertainty in sensing, landmark observation and world modelling. By contrast, biomimetic animal navigation models produce plausible animal navigation behaviour in a range of laboratory experimental navigation paradigms, typically without addressing many of these robotic navigation principles. In this paper, we attempt to link robotics and biology by reviewing the current state of the art in conventional and biomimetic goal-directed navigation models, focusing on the key principles of goal-oriented robotic navigation and the extent to which these principles have been adapted by biomimetic navigation models, and why.
Abstract:
Problem addressed: Wrist-worn accelerometers are associated with greater compliance. However, validated algorithms for predicting activity type from wrist-worn accelerometer data are lacking. This study compared the activity recognition rates of an activity classifier trained on acceleration signals collected at the wrist and at the hip. Methodology: 52 children and adolescents (mean age 13.7 ± 3.1 years) completed 12 activity trials that were categorized into 7 activity classes: lying down, sitting, standing, walking, running, basketball, and dancing. During each trial, participants wore an ActiGraph GT3X+ tri-axial accelerometer on the right hip and the non-dominant wrist. Features were extracted from 10-s windows and input into a regularized logistic regression model using R (glmnet + L1). Results: Classification accuracy for the hip and wrist was 91.0% ± 3.1% and 88.4% ± 3.0%, respectively. The hip model exhibited excellent classification accuracy for sitting (91.3%), standing (95.8%), walking (95.8%), and running (96.8%); acceptable classification accuracy for lying down (88.3%) and basketball (81.9%); and modest accuracy for dance (64.1%). The wrist model exhibited excellent classification accuracy for sitting (93.0%), standing (91.7%), and walking (95.8%); acceptable classification accuracy for basketball (86.0%); and modest accuracy for running (78.8%), lying down (74.6%), and dance (69.4%). Potential impact: Both the hip and wrist algorithms achieved acceptable classification accuracy, allowing researchers to use either placement for activity recognition.
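A minimal sketch of the pipeline described above, assuming a 30 Hz sampling rate and simple window statistics as features; the study itself used R with glmnet, so this Python/scikit-learn version is only illustrative:

import numpy as np
from sklearn.linear_model import LogisticRegression

FS = 30                 # assumed sampling rate in Hz (not stated in the abstract)
WIN = 10 * FS           # 10-second windows, as in the study

def window_features(acc):
    # acc: (WIN, 3) array of tri-axial acceleration for one window
    mag = np.linalg.norm(acc, axis=1)
    return np.hstack([acc.mean(axis=0), acc.std(axis=0),
                      mag.mean(), mag.std(), np.percentile(mag, 90)])

def fit_activity_classifier(windows, labels):
    # windows: list of (WIN, 3) arrays; labels: one activity class per window
    X = np.vstack([window_features(w) for w in windows])
    clf = LogisticRegression(penalty="l1", solver="saga", max_iter=5000)
    return clf.fit(X, labels)   # L1 penalty mirrors the glmnet-style regularization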
Abstract:
NIST's forthcoming Advanced Hash Standard (AHS) competition to select the SHA-3 hash function requires that each candidate hash function submission have at least one construction to support the FIPS 198 HMAC application. As part of its evaluation, NIST is aiming to select either a candidate hash function which is more resistant to known side-channel attacks (SCA) when plugged into HMAC, or one that has an alternative MAC mode which is more resistant to known SCA than the other submitted alternatives. In response to this, we perform differential power analysis (DPA) on the possible smart card implementations of some of the recently proposed MAC alternatives to NMAC (a fully analyzed variant of HMAC) and HMAC algorithms, and of NMAC/HMAC versions of some recently proposed hash and compression function modes. We show that the recently proposed BNMAC and KMDP MAC schemes are even weaker than NMAC/HMAC against DPA attacks, whereas multi-lane NMAC, EMD MAC and the keyed wide-pipe hash have similar security to NMAC against DPA attacks. Our DPA attacks do not work on the NMAC setting of the MDC-2, Grindahl and MAME compression functions.
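The differential power analysis referred to above is, at its core, a statistical comparison of power traces grouped by a predicted intermediate value. A minimal difference-of-means sketch of first-order DPA in general, not the authors' specific attacks on these MAC constructions:

import numpy as np

def dpa_difference_of_means(traces, predicted_bit):
    # traces: (n_traces, n_samples) power measurements from the device
    # predicted_bit: (n_traces,) 0/1 hypothesis for one key-dependent intermediate bit
    mean_one = traces[predicted_bit == 1].mean(axis=0)
    mean_zero = traces[predicted_bit == 0].mean(axis=0)
    return mean_one - mean_zero   # pronounced peaks support the key hypothesis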
Abstract:
A number of online algorithms have been developed that have small additional loss (regret) compared to the best “shifting expert”. In this model, there is a set of experts and the comparator is the best partition of the trial sequence into a small number of segments, where the expert of smallest loss is chosen in each segment. The regret is typically defined for worst-case data / loss sequences. There has been a recent surge of interest in online algorithms that combine good worst-case guarantees with much improved performance on easy data. A practically relevant class of easy data is the case when the loss of each expert is iid and the best and second best experts have a gap between their mean losses. In the full information setting, the FlipFlop algorithm by De Rooij et al. (2014) combines the best of the iid-optimal Follow-The-Leader (FL) and the worst-case-safe Hedge algorithms, whereas in the bandit information case SAO by Bubeck and Slivkins (2012) competes with the iid-optimal UCB and the worst-case-safe EXP3. We ask the same questions for the shifting expert problem. First, what are simple and efficient algorithms for the shifting experts problem when the loss sequence in each segment is iid with respect to a fixed but unknown distribution? Second, how can the performance of such algorithms on easy data be efficiently united with worst-case robustness? A particularly intriguing open problem is the case when the comparator shifts within a small subset of experts from a large set, under the assumption that the losses in each segment are iid.
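For context, the classical worst-case-safe baseline for shifting experts is the Fixed-Share variant of Hedge (Herbster and Warmuth style), which mixes a small amount of the uniform distribution into the weights each trial so that the algorithm can re-focus after a shift. A minimal sketch, illustrative only; the FL/FlipFlop-style combinations discussed above are not reproduced here:

import numpy as np

def fixed_share(losses, eta=0.5, alpha=0.01):
    # losses: (T, K) array of per-trial losses for K experts
    T, K = losses.shape
    w = np.full(K, 1.0 / K)
    weights = []
    for t in range(T):
        weights.append(w.copy())
        w = w * np.exp(-eta * losses[t])     # exponential-weights (Hedge) update
        w /= w.sum()
        w = (1 - alpha) * w + alpha / K      # share step: allows tracking shifts
    return np.array(weights)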
Abstract:
The proliferation of the web presents an unsolved problem of automatically analyzing billions of pages of natural language. We introduce a scalable algorithm that clusters hundreds of millions of web pages into hundreds of thousands of clusters. It does this on a single mid-range machine using efficient algorithms and compressed document representations. It is applied to two web-scale crawls covering tens of terabytes: ClueWeb09 and ClueWeb12 contain 500 and 733 million web pages, respectively, and were clustered into 500,000 to 700,000 clusters. To the best of our knowledge, such fine-grained clustering has not been previously demonstrated. Previous approaches clustered a sample, which limits the maximum number of discoverable clusters. The proposed EM-tree algorithm uses the entire collection in clustering and produces several orders of magnitude more clusters than existing algorithms. Fine-grained clustering is necessary for meaningful clustering in massive collections, where the number of distinct topics grows linearly with collection size. These fine-grained clusters show improved cluster quality when assessed with two novel evaluations using ad hoc search relevance judgments and spam classifications for external validation. These evaluations solve the problem of assessing the quality of clusters where categorical labeling is unavailable or infeasible.
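As a rough, small-scale illustration of clustering documents with hashed (compressed) representations and a streaming k-means variant; this is not the EM-tree algorithm itself, whose tree-structured form is what allows it to scale to hundreds of thousands of clusters:

from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.cluster import MiniBatchKMeans

def cluster_pages(pages, n_clusters=1000):
    # pages: iterable of page texts (a toy stand-in for a web-scale crawl)
    X = HashingVectorizer(n_features=2**18, alternate_sign=False).transform(pages)
    km = MiniBatchKMeans(n_clusters=n_clusters, batch_size=10_000, n_init=3)
    return km.fit_predict(X)     # cluster label per page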
Abstract:
This project constructed virtual plant leaf surfaces from digitised data sets for use in droplet spray models. Digitisation techniques for obtaining data sets for cotton, chenopodium and wheat leaves are discussed, and novel algorithms for the reconstruction of the leaves of these three plant species are developed. The reconstructed leaf surfaces are incorporated into agricultural droplet spray models to investigate the effect of the nozzle and spray formulation combination on the proportion of spray retained by the plant. A numerical study of the post-impaction motion of large droplets that have formed on the leaf surface is also presented.
Abstract:
In the past few years, the virtual machine (VM) placement problem has been studied intensively and many algorithms for it have been proposed. However, those proposed VM placement algorithms have not been widely used in today's cloud data centers because they do not consider the migration cost from the current VM placement to the new optimal VM placement. As a result, the gain from optimizing VM placement may be less than the migration cost of moving from the current VM placement to the new one. To address this issue, this paper presents a penalty-based genetic algorithm (GA) for the VM placement problem that considers the migration cost in addition to the energy consumption of the new VM placement and the total inter-VM traffic flow in the new VM placement. The GA has been implemented and evaluated in experiments, and the experimental results show that the GA outperforms two well-known algorithms for the VM placement problem.
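A minimal sketch of what such a penalty-based fitness function might look like, with a migration penalty added to the energy and traffic terms of the new placement; the weights and cost models are assumptions, not the paper's exact formulation:

def placement_fitness(candidate, current, energy_cost, traffic_cost, migration_cost,
                      w_energy=1.0, w_traffic=1.0, w_migration=1.0):
    # candidate/current: dicts mapping VM id -> host id
    migrated = [vm for vm in candidate if candidate[vm] != current.get(vm)]
    penalty = sum(migration_cost(vm) for vm in migrated)    # cost of moving VMs
    return (w_energy * energy_cost(candidate)
            + w_traffic * traffic_cost(candidate)
            + w_migration * penalty)                        # the GA minimizes this value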
Abstract:
Network topology and routing are two important factors in determining the communication costs of big data applications at large scale. For a given Cluster, Cloud, or Grid (CCG) system, the network topology is fixed, and static or dynamic routing protocols are preinstalled to direct the network traffic. Users cannot change them once the system is deployed. Hence, it is hard for application developers to identify the optimal network topology and routing algorithm for their applications with distinct communication patterns. In this study, we design a CCG virtual system (CCGVS), which first uses container-based virtualization to allow users to create a farm of lightweight virtual machines on a single host. It then uses the software-defined networking (SDN) technique to control the network traffic among these virtual machines. Users can change the network topology and control the network traffic programmatically, thereby enabling application developers to evaluate their applications on the same system with different network topologies and routing algorithms. Preliminary experimental results with both synthetic big data programs and the NPB benchmarks show that CCGVS can represent application performance variations caused by network topology and routing algorithms.
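As a rough sketch of defining a virtual topology programmatically for SDN-controlled experiments; Mininet is used here purely for illustration and is an assumption, not necessarily the tooling behind CCGVS:

from mininet.net import Mininet
from mininet.node import RemoteController
from mininet.topo import Topo

class RingTopo(Topo):
    # toy ring of switches, one host attached to each switch
    def build(self, n=4):
        switches = [self.addSwitch('s%d' % (i + 1)) for i in range(n)]
        for i, sw in enumerate(switches):
            self.addLink(self.addHost('h%d' % (i + 1)), sw)
            self.addLink(sw, switches[(i + 1) % n])

net = Mininet(topo=RingTopo(), controller=RemoteController)  # external SDN controller decides routing
net.start()
net.pingAll()
net.stop()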
Abstract:
Background: Schizophrenia is associated with lower pre-morbid intelligence (IQ) in addition to (pre-morbid) cognitive decline. Both schizophrenia and IQ are highly heritable traits. Therefore, we hypothesized that genetic variants associated with schizophrenia, including copy number variants (CNVs) and a polygenic schizophrenia (risk) score (PSS), may influence intelligence. Method: IQ was estimated with the Wechsler Adult Intelligence Scale (WAIS). CNVs were determined from single nucleotide polymorphism (SNP) data using the QuantiSNP and PennCNV algorithms. For the PSS, odds ratios for genome-wide SNP data were calculated in a sample collected by the Psychiatric Genome-Wide Association Study (GWAS) Consortium (8,690 schizophrenia patients and 11,831 controls). These were used to calculate individual PSSs in our independent sample of 350 schizophrenia patients and 322 healthy controls. Results: Although significantly more genes were disrupted by deletions in schizophrenia patients compared to controls (p = 0.009), there was no effect of CNV measures on IQ. The PSS was associated with disease status (R² = 0.055, p = 2.1 × 10⁻⁷) and with IQ in the entire sample (R² = 0.018, p = 0.0008), but the effect on IQ disappeared after correction for disease status. Conclusions: Our data suggest that rare and common schizophrenia-associated variants do not explain the variation in IQ in healthy subjects or in schizophrenia patients. Thus, reductions in IQ in schizophrenia patients may be secondary to other processes related to schizophrenia risk.
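The polygenic score computation described above reduces to a weighted allele count. A minimal sketch, assuming risk-allele dosages coded 0/1/2 and per-SNP log odds ratios from the discovery GWAS:

import numpy as np

def polygenic_score(genotypes, log_odds_ratios):
    # genotypes: (n_individuals, n_snps) risk-allele counts in {0, 1, 2}
    # log_odds_ratios: (n_snps,) effect sizes estimated in the discovery sample
    return genotypes @ log_odds_ratios   # one score per individual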
Abstract:
Natural products constitute an important source of new drugs. The bioavailability of a drug depends on its absorption, distribution, metabolism and elimination. To achieve good bioavailability, the drug must be soluble in water, stable in the gastrointestinal tract and palatable. Binding proteins may improve the solubility of drug compounds, masking unwanted properties such as bad taste, bitterness or toxicity, and transporting or protecting these compounds during processing and storage. The focus of this thesis was to study the interactions, including ligand binding and the effect of pH and temperature, of bovine and reindeer β-lactoglobulin (βLG) with such compounds as retinoids, phenolic compounds and compounds from plant extracts, and to investigate the transport properties of the βLG-ligand complex. To examine the binding interactions of different ligands to βLG, new methods were developed. The fluorescence binding method for the evaluation of ligand binding to βLG was miniaturized from a quartz cell to a 96-well plate. A method of ultrafiltration sampling combined with high-performance liquid chromatography was developed to assess the binding of compounds from extracts. The interactions of phenolic compounds or retinoids with βLG were investigated using the 96-well plate method. The majority of the flavones, flavonols, flavanones and isoflavones and all of the retinoids included were shown to bind to bovine and reindeer βLG. Phenolic compounds, contrary to retinol, were not released at acidic pH. These results suggest that βLG may have additional binding sites, probably also on its surface. Extracts from Camellia sinensis (L.) O. Kunze (black tea), Urtica dioica L. (nettle) and Piper nigrum (black pepper) were used to evaluate whether βLG could bind compounds from plant extracts. Piperine from P. nigrum was found to bind tightly and rutin from U. dioica weakly to βLG. No components from C. sinensis bound to βLG in our experiment. The uptake and membrane permeation of bovine and reindeer βLG, free and bound with retinol, palmitic acid and cholesterol, were investigated using Caco-2 cell monolayers. Both bovine and reindeer βLG were able to cross the Caco-2 cell membrane. Free and βLG-bound retinol and palmitic acid were transported equally, whereas cholesterol could not cross the Caco-2 cell monolayer either free or bound to βLG. Our results showed that βLG can bind different natural product compounds, but cannot enhance the transport of retinol, palmitic acid or cholesterol through Caco-2 cells. Despite this, βLG, as a water-soluble binding protein, may improve the solubility of natural compounds, possibly protecting them from early degradation and transporting some of them through the stomach. Furthermore, it may decrease their bad or bitter taste during oral administration of drugs or in food preparations. βLG can also enhance or decrease the health benefits of herbal teas and food preparations by binding compounds from extracts.
Abstract:
Replacement of deteriorated water pipes is a capital-intensive activity for utility companies. Replacement planning aims to minimize total costs while maintaining a satisfactory level of service and is usually conducted for individual pipes. Scheduling replacement in groups is seen to be a better method and has the potential to provide benefits such as the reduction of maintenance costs and service interruptions. However, developing group replacement schedules is a complex task and often beyond the ability of a human expert, especially when multiple or conflicting objectives need to be catered for, such as minimization of total costs and service interruptions. This paper describes the development of a novel replacement decision optimization model for group scheduling (RDOM-GS), which enables multiple group-scheduling criteria by integrating new cost functions, a service interruption model, and optimization algorithms into a unified procedure. An industry case study demonstrates that RDOM-GS can improve replacement planning significantly and reduce costs and service interruptions.
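A minimal sketch of evaluating a candidate group schedule against two of the criteria mentioned above, total cost and service interruptions; the cost functions are placeholders, not RDOM-GS's own models:

def evaluate_schedule(groups, pipe_cost, setup_cost, interruptions):
    # groups: list of lists of pipe ids replaced together
    total_cost = sum(setup_cost(g) + sum(pipe_cost(p) for p in g) for g in groups)
    total_interruptions = sum(interruptions(g) for g in groups)
    return total_cost, total_interruptions   # a multi-objective optimizer trades these off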
Abstract:
SNPs discovered by genome-wide association studies (GWASs) account for only a small fraction of the genetic variation of complex traits in human populations. Where is the remaining heritability? We estimated the proportion of variance for human height explained by 294,831 SNPs genotyped on 3,925 unrelated individuals using a linear model analysis, and validated the estimation method with simulations based on the observed genotype data. We show that 45% of variance can be explained by considering all SNPs simultaneously. Thus, most of the heritability is not missing but has not previously been detected because the individual effects are too small to pass stringent significance tests. We provide evidence that the remaining heritability is due to incomplete linkage disequilibrium between causal variants and genotyped SNPs, exacerbated by causal variants having lower minor allele frequency than the SNPs explored to date.
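The first step of such a whole-genome analysis is typically a genetic relationship matrix built from standardized genotypes, which a mixed-model (REML) fit then uses to estimate the variance explained by all SNPs jointly. A minimal sketch of the GRM step only, illustrative rather than the authors' code:

import numpy as np

def genetic_relationship_matrix(genotypes):
    # genotypes: (n_individuals, n_snps) allele counts in {0, 1, 2}
    p = genotypes.mean(axis=0) / 2.0                      # per-SNP allele frequency
    Z = (genotypes - 2 * p) / np.sqrt(2 * p * (1 - p))    # standardize each SNP
    return Z @ Z.T / genotypes.shape[1]                   # average relatedness over SNPs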
Abstract:
An important challenge in the forest industry is getting the appropriate raw material out of the forests and to the wood-processing industry. Growth and stem reconstruction simulators are therefore increasingly integrated into industrial conversion simulators, for linking the properties of wooden products to the three-dimensional structure of stems and their growing conditions. Static simulators predict the wood properties from stem dimensions at the end of a growth simulation period, whereas in dynamic approaches, the structural components, e.g. branches, are incremented along with the growth processes. The dynamic approach can be applied to stem reconstruction by predicting the three-dimensional stem structure from external tree variables (i.e. age, height) as a result of growth to the current state. In this study, a dynamic growth simulator, PipeQual, and a stem reconstruction simulator, RetroSTEM, are adapted to Norway spruce (Picea abies [L.] Karst.) to predict the three-dimensional structure of stems (taper, branchiness, wood basic density) over time, such that both simulators can be integrated into a sawing simulator. The parameterisation of the PipeQual and RetroSTEM simulators for Norway spruce relied on a theoretically based description of tree structure that develops through the growth process, following certain conservative structural regularities while allowing for plasticity in crown development. The crown expressed both regularity and plasticity in its development, as the vertical foliage density peaked regularly at about 5 m from the stem apex, varying below that with tree age and dominance position (Study I). Conservative stem structure was characterized in terms of (1) the pipe ratios between foliage mass and branch and stem cross-sectional areas at the crown base, (2) the allometric relationship between foliage mass and crown length, (3) mean branch length relative to crown length and (4) form coefficients in branches and stem (Study II). The pipe ratio between branch and stem cross-sectional area at the crown base, and mean branch length relative to crown length, may differ in trees before and after canopy closure, but the variation should be further analysed in stands of different ages and densities with varying site fertilities and climates. The predictions of the PipeQual and RetroSTEM simulators were evaluated by comparing the simulated values to measured ones (Studies III, IV). Both simulators predicted stem taper and branch diameter at the individual tree level with a small bias. RetroSTEM predictions of wood density were accurate. To obtain even more accurate predictions of stem diameters and branchiness along the stem, both simulators should be further improved by revising the following aspects: the relationship between foliage and stem sapwood area in the upper stem, the error source in branch sizes, the crown base development and the height growth models in RetroSTEM. In Study V, the RetroSTEM simulator was integrated into the InnoSIM sawing simulator and, according to the pilot simulations, this turned out to be an efficient tool for readily producing stand-scale information about stem sizes and structure when approximating the available assortments of wood products.
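A minimal sketch of the pipe-model regularity referred to above, in which sapwood cross-sectional area at the crown base is taken as proportional to the foliage mass it supports; the ratio used here is a placeholder, not a PipeQual or RetroSTEM parameter:

def sapwood_area_at_crown_base(foliage_mass_kg, pipe_ratio_m2_per_kg=5e-4):
    # pipe model: cross-sectional area proportional to supported foliage mass
    return pipe_ratio_m2_per_kg * foliage_mass_kg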
Abstract:
This thesis integrates real-time feedback control into an optical tweezers instrument. The goal is to reduce the variance in the trapped bead's position, effectively increasing the trap stiffness of the optical tweezers. Trap steering is done with acousto-optic deflectors and the control algorithms are implemented on a field-programmable gate array card. When position-clamp feedback control is on, the effective trap stiffness increases 12.1 times compared to the stiffness without control. This allows improved spatial control over trapped particles without increasing the trapping laser power.
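A minimal sketch of a discrete position clamp of this kind, expressed as a proportional-integral loop; this is illustrative only, since the thesis implements the controller on an FPGA and steers the trap with acousto-optic deflectors:

def position_clamp(read_position, steer_trap, kp=0.8, ki=0.05, steps=10_000):
    # read_position(): measured bead displacement from the target point
    # steer_trap(cmd): steering command sent to the acousto-optic deflectors
    integral = 0.0
    for _ in range(steps):
        error = -read_position()               # drive the displacement back toward zero
        integral += error
        steer_trap(kp * error + ki * integral)
# A tighter clamp means smaller position variance, i.e. higher effective trap stiffness.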