991 results for Exhaustive search


Relevance: 100.00%

Abstract:

The SNP-SNP interactome has rarely been explored in the context of neuroimaging genetics, mainly due to the complexity of conducting approximately 10^11 pairwise statistical tests. However, recent advances in machine learning, specifically the iterative sure independence screening (SIS) method, have enabled the analysis of datasets where the number of predictors is much larger than the number of observations. Using an implementation of the SIS algorithm (called EPISIS), we performed an exhaustive search of the genome-wide SNP-SNP interactome to identify and prioritize SNPs for interaction analysis. We identified a significant SNP pair, rs1345203 and rs1213205, associated with temporal lobe volume. We further examined the full-brain, voxelwise effects of the interaction in the ADNI dataset and separately in an independent dataset of healthy twins (QTIM). We found that each additional loading in the epistatic effect was associated with approximately 5% greater regional brain volume (a protective effect) in both the ADNI and QTIM samples.
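As a hedged illustration of the screening step (a minimal sure independence screening pass in the style of Fan and Lv, not the EPISIS implementation, which the abstract does not describe): each predictor is ranked by its marginal association with the phenotype, and only the top few survive to pairwise interaction testing.

```python
import numpy as np

def sis_screen(X, y, keep=1000):
    """Minimal sure independence screening: rank each predictor
    (e.g., a SNP) by the magnitude of its marginal correlation with
    the response and keep the top `keep` columns, shrinking the
    space before the ~10^11 pairwise tests."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    corr = (Xc * yc[:, None]).sum(axis=0)
    corr /= (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12)
    return np.argsort(-np.abs(corr))[:keep]

# Toy usage: 500 subjects, 20,000 synthetic SNPs, keep the 100 strongest.
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(500, 20_000)).astype(float)
y = X[:, 42] * 0.5 + rng.normal(size=500)
top = sis_screen(X, y, keep=100)
```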

Relevance: 100.00%

Abstract:

This paper presents the development of a new building physics and energy supply systems simulation platform. It has been adapted from both existing commercial models and empirical works, but designed to provide expedient, exhaustive simulation of all salient types of energy- and carbon-reducing retrofit options. These options may include any combination of behavioural measures, building fabric and equipment upgrades, improved HVAC control strategies, or novel low-carbon energy supply technologies. We provide a methodological description of the proposed model, followed by two illustrative case studies of the tool when used to investigate retrofit options for a mixed-use office building and a primary school in the UK. It is not the intention of this paper, nor would it be feasible, to provide a complete engineering decomposition of the proposed model, describing all calculation processes in detail. Instead, this paper concentrates on presenting the particular engineering aspects of the model which depart from conventional practice. © 2011 Elsevier Ltd.
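The abstract does not give the model's internals, but the exhaustive sweep it describes amounts to evaluating every combination of candidate retrofit measures. A minimal sketch of that enumeration, with hypothetical option lists and a placeholder in place of the simulation engine:

```python
from itertools import product

# Hypothetical option lists; the real tool's option sets and
# simulation engine are not described in the abstract.
options = {
    "fabric": ["as_built", "cavity_fill", "external_insulation"],
    "hvac_control": ["baseline", "weather_compensation"],
    "supply": ["gas_boiler", "heat_pump", "biomass"],
}

def simulate(package):
    """Placeholder for a building-physics run; a real call would
    return annual energy and carbon for one retrofit package."""
    return sum(len(choice) for choice in package.values())  # dummy score

# Exhaustive sweep: every salient combination is simulated once.
packages = [dict(zip(options, combo)) for combo in product(*options.values())]
best = min(packages, key=simulate)
print(len(packages), "packages; best:", best)
```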

Relevance: 100.00%

Abstract:

* The work is supported by RFBR, grant 04-01-00858-a.

Relevance: 70.00%

Abstract:

Combinatorial testing is an important testing method. It requires the test cases to cover various combinations of the parameters of the system under test. The test generation problem for combinatorial testing can be modeled as constructing a matrix which has certain properties. This paper first discusses two combinatorial testing criteria, covering arrays and orthogonal arrays, and then proposes a backtracking search algorithm to construct matrices satisfying them. Several search heuristics and symmetry-breaking techniques are used to reduce the search time. This paper also introduces some techniques to generate large covering array instances from smaller ones. All the techniques have been implemented in a tool called EXACT (EXhaustive seArch of Combinatorial Test suites). A new optimal covering array was found by this tool.
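For intuition (a minimal sketch, not the EXACT tool's backtracking algorithm): a strength-2 covering array over k parameters requires every pair of columns to exhibit every pair of values, and a brute-force checker for that property is a few lines.

```python
from itertools import combinations, product

def is_covering_array(rows, levels, strength=2):
    """Check that every `strength`-sized set of columns covers every
    possible value combination (strength=2 means pairwise coverage)."""
    k = len(rows[0])
    for cols in combinations(range(k), strength):
        seen = {tuple(r[c] for c in cols) for r in rows}
        if len(seen) < levels ** strength:
            return False
    return True

# A classic CA(4; 2, 3, 2): 4 tests cover all pairs of 3 boolean parameters.
ca = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
print(is_covering_array(ca, levels=2))  # True
```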

Relevance: 70.00%

Abstract:

Self-dual doubly even linear binary error-correcting codes, often referred to as Type II codes, are closely related to many combinatorial structures such as 5-designs. Extremal codes have the largest possible minimum distance for a given length and dimension. The existence of an extremal (72,36,16) Type II code is still open. Previous results show that the automorphism group of a putative code C with the aforementioned properties has order 5 or an order dividing 24. In this work, we present a method and the results of an exhaustive search showing that such a code C cannot admit the automorphism group Z_6. In addition, we present a so-far-unpublished construction of the extended Golay code due to P. Becker. We generalize the notion and provide an example of another Type II code that can be obtained in this fashion. Consequently, we relate Becker's construction to the construction of binary Type II codes from codes over GF(2^r) via the Gray map.
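As a small illustration of the defining Type II properties (self-duality and all codeword weights divisible by 4), here is a brute-force check on the [8,4,4] extended Hamming code, the shortest Type II code. This is purely illustrative and unrelated to the (72,36,16) search itself.

```python
import numpy as np
from itertools import product

# Generator matrix of the [8,4,4] extended Hamming code (Type II).
G = np.array([[1, 0, 0, 0, 0, 1, 1, 1],
              [0, 1, 0, 0, 1, 0, 1, 1],
              [0, 0, 1, 0, 1, 1, 0, 1],
              [0, 0, 0, 1, 1, 1, 1, 0]])

# Self-orthogonal: G @ G.T must vanish mod 2; with dimension k = n/2
# this makes the code self-dual.
assert not (G @ G.T % 2).any()

# Doubly even: every one of the 2^4 codewords has weight divisible by 4.
for m in product([0, 1], repeat=4):
    w = int((np.array(m) @ G % 2).sum())
    assert w % 4 == 0
print("self-dual and doubly even: Type II")
```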

Relevance: 70.00%

Abstract:

Optimum subwindow search for object detection aims to find the subwindow whose contained subimage is most similar to the query object. This problem can be formulated as a four-dimensional (4D) maximum entry search problem wherein each entry corresponds to the quality score of the subimage contained in a subwindow. For n × n images, a naive exhaustive search requires O(n^4) sequential computations of the quality scores for all subwindows. To reduce the time complexity, we prove that, for some typical similarity functions such as the Euclidean metric and the χ² metric on image histograms, the associated 4D array carries certain Monge structures, and we utilise these properties to speed up the optimum subwindow search, reducing the time complexity to O(n^3). Furthermore, we propose a locally optimal alternating column and row search method with typical quadratic time complexity O(n^2). Experiments on PASCAL VOC 2006 demonstrate that the alternating method is significantly faster than the well-known efficient subwindow search (ESS) method whilst the performance loss due to the local maxima problem is negligible.
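A minimal sketch of the alternating idea, under the simplifying assumption that the quality score is a sum of per-pixel scores (so each 1D step reduces to Kadane's maximum-subarray; the paper's histogram-based scores are more general): fix the column interval and optimise the row interval, then swap, until the score stops improving.

```python
import numpy as np

def kadane(v):
    """Best contiguous interval of a 1D array (maximum subarray)."""
    best, cur, start, lo, hi = -np.inf, 0.0, 0, 0, 0
    for i, x in enumerate(v):
        if cur <= 0:
            cur, start = x, i
        else:
            cur += x
        if cur > best:
            best, lo, hi = cur, start, i
    return best, lo, hi

def alternating_subwindow(S, iters=10):
    """Locally optimal subwindow search on score matrix S by
    alternating row-interval and column-interval optimisation.
    Each pass costs O(n^2) for an n x n image."""
    n, m = S.shape
    t, b, l, r = 0, n - 1, 0, m - 1
    score = -np.inf
    for _ in range(iters):
        # Fix columns [l, r]; optimise rows via a 1D max-subarray.
        new, t, b = kadane(S[:, l:r + 1].sum(axis=1))
        # Fix rows [t, b]; optimise columns likewise.
        new, l, r = kadane(S[t:b + 1, :].sum(axis=0))
        if new <= score:
            break
        score = new
    return score, (t, b, l, r)

rng = np.random.default_rng(1)
S = rng.normal(-0.2, 1.0, size=(64, 64))  # mostly negative background
print(alternating_subwindow(S))
```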

Relevance: 70.00%

Abstract:

In this paper we propose a nature-inspired approach that can boost the Optimum-Path Forest (OPF) clustering algorithm by optimizing its parameters in a discrete lattice. Experiments on two public datasets have shown that the proposed algorithm can find parameter values similar to those obtained by exhaustive search. Moreover, the proposed technique is faster than the traditional one, making it attractive for intrusion detection in large-scale traffic networks. © 2012 IEEE.
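The abstract does not detail the optimiser, but the comparison it draws (a cheap search versus an exhaustive sweep over a discrete parameter lattice) can be sketched generically; `evaluate` below is a hypothetical stand-in for an OPF clustering-quality score.

```python
import itertools
import random

# Hypothetical discrete lattice for two OPF-style parameters.
lattice = {"k_max": range(2, 51), "sigma": [i / 10 for i in range(1, 21)]}

def evaluate(params):
    """Placeholder quality score; a real run would fit OPF clustering
    with these parameters and score the partition."""
    return -(params["k_max"] - 17) ** 2 - (params["sigma"] - 0.8) ** 2

# Exhaustive search: every lattice point is evaluated.
grid = [dict(zip(lattice, v)) for v in itertools.product(*lattice.values())]
best_exhaustive = max(grid, key=evaluate)

# Cheap stochastic alternative: sample a small fraction of the lattice.
random.seed(0)
best_sampled = max(random.sample(grid, 50), key=evaluate)
print(best_exhaustive, best_sampled)
```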

Relevance: 70.00%

Abstract:

Feature selection aims to find the most important information in a given set of features. As this task can be seen as an optimization problem, the combinatorial growth of the possible solutions may make an exhaustive search infeasible. In this paper we propose a new nature-inspired feature selection technique based on the Charged System Search (CSS), which had not previously been applied in this context. The wrapper approach combines the exploration power of CSS with the speed of the Optimum-Path Forest classifier to find the set of features that maximizes the accuracy on a validating set. Experiments conducted on four public datasets have demonstrated that the proposed approach can outperform some well-known swarm-based techniques. © 2013 Springer-Verlag.
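A hedged sketch of the wrapper idea, using a generic stochastic search and a k-NN stand-in for the Optimum-Path Forest classifier (which scikit-learn does not ship): candidate feature masks are scored by validation accuracy, and the best mask wins. CSS adds a physics-inspired update rule on top of this loop, omitted here.

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_wine(return_X_y=True)
Xtr, Xva, ytr, yva = train_test_split(X, y, random_state=0)

def fitness(mask):
    """Validation accuracy of the classifier on the selected features."""
    if not mask.any():
        return 0.0
    clf = KNeighborsClassifier().fit(Xtr[:, mask], ytr)
    return clf.score(Xva[:, mask], yva)

# Generic stochastic search over binary feature masks.
rng = np.random.default_rng(0)
best_mask, best_fit = None, -1.0
for _ in range(200):
    mask = rng.random(X.shape[1]) < 0.5
    f = fitness(mask)
    if f > best_fit:
        best_mask, best_fit = mask, f
print(best_fit, np.flatnonzero(best_mask))
```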

Relevance: 60.00%

Abstract:

LEX is a stream cipher that progressed to Phase 3 of the eSTREAM stream cipher project. In this paper, we show that the security of LEX against algebraic attacks relies on a small equation system not being solvable faster than exhaustive search. We use the byte leakage in LEX to construct a system of 21 equations in 17 variables. This is very close to the requirement for an efficient attack, i.e. a system containing 16 variables. The system requires only 36 bytes of keystream, which is very low.
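For scale, a toy illustration (not the LEX attack; the 16-bit cipher below is hypothetical): an algebraic attack only matters if solving the equation system beats trying every key, and a brute-force baseline over a small keyspace makes that benchmark concrete.

```python
from itertools import product

def toy_cipher(key_bits, keystream_len=24):
    """Hypothetical 16-bit toy keystream generator; a stand-in for
    LEX, used only to show the exhaustive-search baseline."""
    state = list(key_bits)
    out = []
    for _ in range(keystream_len):
        bit = state[0] ^ state[3] ^ (state[5] & state[11])
        out.append(bit)
        state = state[1:] + [bit]
    return tuple(out)

secret = tuple(int(b) for b in "1011001110001011")
observed = toy_cipher(secret)

# Exhaustive search over all 2^16 keys: the benchmark that an
# algebraic attack on the real cipher must beat to be "efficient".
matches = [k for k in product((0, 1), repeat=16)
           if toy_cipher(k) == observed]
print(len(matches), "consistent key(s) out of", 2 ** 16)
```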

Relevance: 60.00%

Abstract:

Purpose of study: Traffic conflicts occur when trains on different routes approach a converging junction in a railway network at the same time. To prevent collisions, a right-of-way assignment is needed to control the order in which the trains pass the junction. Such control inevitably requires braking and/or stopping trains, which lengthens their travelling times and leads to delays. Train delays cause a loss of punctuality and hence directly affect the quality of service. It is therefore important to minimise delays by devising a suitable right-of-way assignment. One of the major difficulties in attaining the optimal right-of-way assignment is that the number of feasible assignments increases dramatically with the number of trains; connected junctions further complicate the problem. Exhaustive search for the optimal solution is time-consuming and infeasible for area (multi-junction) control. Even with the more intelligent deterministic optimisation method revealed in [1], the computation demand is still considerable, which hinders real-time control. In practice, as suggested in [2], optimality may be traded off for shorter computation time, and heuristic searches provide alternatives for this optimisation problem.
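A sketch of why the assignment space explodes (illustrative only; the arrival times and headway below are made up, and real junction models also handle connected junctions): a right-of-way assignment at a single junction is an ordering of the approaching trains, so exhaustive search enumerates factorially many orderings.

```python
from itertools import permutations
from math import factorial

# Hypothetical trains: (name, arrival time at the junction, seconds).
trains = [("A", 0), ("B", 20), ("C", 35), ("D", 40)]
HEADWAY = 60  # clearance time between consecutive trains (s)

def total_delay(order):
    """Sum of delays if trains pass the junction in this order."""
    t_clear, delay = 0, 0
    for name, arrive in order:
        depart = max(arrive, t_clear)
        delay += depart - arrive
        t_clear = depart + HEADWAY
    return delay

best = min(permutations(trains), key=total_delay)
print([name for name, _ in best], total_delay(best))
print("orderings for 12 trains:", factorial(12))  # 479,001,600
```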

Relevance: 60.00%

Abstract:

Determining the placement and rating of transformers and feeders is the main objective of basic distribution network planning. The bus voltage and the feeder current are two constraints that should be maintained within their standard ranges. Planning becomes harder when the planning area is located far from the sources of power generation and the infrastructure, mainly because of voltage drop, line loss, and reliability concerns. Supplying loads over long distances causes a significant voltage drop across the distribution lines; capacitors and Voltage Regulators (VRs) can be installed to reduce it. Long distances also increase the probability of failure and therefore lower network reliability; Cross-Connections (CCs) and Distributed Generators (DGs) can be employed to improve it. Another main factor to consider in planning distribution networks (in both rural and urban areas) is load growth. To accommodate it, transformers and feeders are conventionally upgraded, which incurs a large cost; installing DGs and capacitors in a distribution network can alleviate this issue while providing other benefits.

In this research, a comprehensive planning procedure is presented for distribution networks. Since a distribution network is composed of low- and medium-voltage networks, both are included in the procedure; the main focus, however, is on medium-voltage network planning. The main objective is to minimize the investment cost, line loss, and reliability indices over a study timeframe while supporting load growth. The investment cost covers the distribution network elements such as transformers, feeders, capacitors, VRs, CCs, and DGs. The voltage drop and the feeder current, as constraints, are maintained within their standard ranges. In addition to minimizing the reliability and line-loss costs, the planned network should support continual load growth, an essential concern in planning distribution networks. In this thesis, a novel segmentation-based strategy is proposed to include this factor. Using this strategy, the computation time is significantly reduced compared with the exhaustive search method while accuracy remains acceptable. This strategy is also suitable for including practical (dynamic) load characteristics, as demonstrated in this thesis.

The allocation and sizing problem is discrete in nature with several local minima, which highlights the importance of selecting a proper optimization method. A modified discrete particle swarm optimization, a heuristic method, is introduced in this research to solve this complex planning problem. Discrete nonlinear programming (an analytical method) and a genetic algorithm (a heuristic) are also applied to the problem to evaluate the proposed optimization method.
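The thesis's modified discrete PSO is not specified in the abstract; a minimal generic binary PSO (the sigmoid-threshold variant of Kennedy and Eberhart) over device-placement decisions gives the flavour, with a placeholder cost function standing in for the investment, loss, and reliability terms.

```python
import numpy as np

def binary_pso(cost, n_bits, swarm=30, iters=100, seed=0):
    """Generic binary PSO: bit i means 'install a device at bus i'.
    Velocities pass through a sigmoid to give bit-set probabilities."""
    rng = np.random.default_rng(seed)
    X = rng.integers(0, 2, (swarm, n_bits))
    V = rng.normal(0, 1, (swarm, n_bits))
    pbest, pcost = X.copy(), np.array([cost(x) for x in X])
    g = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, swarm, n_bits))
        V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (g - X)
        X = (rng.random((swarm, n_bits)) < 1 / (1 + np.exp(-V))).astype(int)
        c = np.array([cost(x) for x in X])
        improved = c < pcost
        pbest[improved], pcost[improved] = X[improved], c[improved]
        g = pbest[pcost.argmin()].copy()
    return g, pcost.min()

# Placeholder cost: device cost plus a made-up loss/reliability term.
def cost(x):
    return 5.0 * x.sum() + 100.0 / (1.0 + x @ np.arange(len(x)))

print(binary_pso(cost, n_bits=20))
```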

Relevance: 60.00%

Abstract:

Objective: To (1) search the English-language literature for original research addressing the effect of cryotherapy on joint position sense (JPS) and (2) make recommendations regarding how soon healthy athletes can safely return to participation after cryotherapy. Data Sources: We performed an exhaustive search for original research using the AMED, CINAHL, MEDLINE, and SportDiscus databases from 1973 to 2009 to gather information on cryotherapy and JPS. Key words used were "cryotherapy and proprioception," "cryotherapy and joint position sense," "cryotherapy," and "proprioception." Study Selection: The inclusion criteria were (1) the literature was written in English, (2) participants were human, (3) an outcome measure included JPS, (4) participants were healthy, and (5) participants were tested immediately after a cryotherapy application to a joint. Data Extraction: The means and SDs of the JPS outcome measures were extracted and used to estimate the effect size (Cohen d) and associated 95% confidence intervals for comparisons of JPS before and after a cryotherapy treatment. The numbers, ages, and sexes of participants in all 7 selected studies were also extracted. Data Synthesis: The JPS was assessed in 3 joints: ankle (n = 2), knee (n = 3), and shoulder (n = 2). The average effect size for the 7 included studies was modest, with effect sizes ranging from −0.08 to 1.17, with a positive number representing an increase in JPS error. The average methodologic score of the included studies was 5.4/10 (range, 5–6) on the Physiotherapy Evidence Database scale. Conclusions: Limited and equivocal evidence is available to address the effect of cryotherapy on proprioception in the form of JPS. Until further evidence is provided, clinicians should be cautious when returning individuals to tasks requiring components of proprioceptive input immediately after a cryotherapy treatment.
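For reference, the pooled-SD form of the Cohen d statistic used above, computed from group means and SDs; the numbers in the example are hypothetical, not data from the review.

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d with a pooled standard deviation; a positive value
    here would mean greater JPS error after cryotherapy."""
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2)
                         / (n1 + n2 - 2))
    return (m1 - m2) / s_pooled

# Hypothetical numbers: mean JPS error 4.1 deg (SD 1.8) after cooling
# vs 3.2 deg (SD 1.5) before, 15 participants per condition.
print(round(cohens_d(4.1, 1.8, 15, 3.2, 1.5, 15), 2))  # ~0.54
```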

Relevance: 60.00%

Abstract:

This paper considers the problem of reconstructing the motion of a 3D articulated tree from 2D point correspondences subject to some temporal prior. Hitherto, smooth motion has been encouraged using a trajectory basis, yielding a hard combinatorial problem with time complexity growing exponentially in the number of frames. Branch and bound strategies have previously attempted to curb this complexity whilst maintaining global optimality. However, they provide no guarantee of being more efficient than exhaustive search. Inspired by recent work which reconstructs general trajectories using compact high-pass filters, we develop a dynamic programming approach which scales linearly in the number of frames, leveraging the intrinsically local nature of filter interactions. Extension to affine projection enables reconstruction without estimating cameras.
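A minimal sketch of why chain-structured dynamic programming scales linearly in the number of frames: a generic Viterbi-style recursion with hypothetical unary and pairwise costs (the paper's filter-based interactions and articulated tree structure are richer, but the complexity argument is the same).

```python
import numpy as np

def chain_dp(unary, pairwise):
    """Minimise sum_t U[t, s_t] + sum_t P[s_{t-1}, s_t] over state
    sequences in O(T * K^2): linear in the number of frames T."""
    T, K = unary.shape
    cost = unary[0].copy()
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        total = cost[:, None] + pairwise          # shape (K, K)
        back[t] = total.argmin(axis=0)
        cost = total.min(axis=0) + unary[t]
    # Backtrack the optimal state sequence.
    path = [int(cost.argmin())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return cost.min(), path[::-1]

rng = np.random.default_rng(0)
T, K = 100, 8                      # 100 frames, 8 candidate states
U = rng.random((T, K))             # per-frame (e.g., reprojection) cost
P = rng.random((K, K))             # temporal smoothness cost
print(chain_dp(U, P)[0])
```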

Relevance: 60.00%

Abstract:

Proxy re-encryption (PRE) is a highly useful cryptographic primitive whereby Alice and Bob can endow a proxy with the capacity to change ciphertext recipients from Alice to Bob, without the proxy itself being able to decrypt, thereby providing delegation of decryption authority. Key-private PRE (KP-PRE) specifies an additional level of confidentiality, requiring pseudo-random proxy keys that leak no information on the identity of the delegators and delegatees. In this paper, we propose a CPA-secure KP-PRE scheme in the standard model (which we then transform into a CCA-secure scheme in the random oracle model). Both schemes enjoy highly desirable properties such as uni-directionality and multi-hop delegation. Unlike the (few) prior constructions of PRE and KP-PRE, which typically rely on bilinear maps under ad hoc assumptions, the security of our construction is based on the hardness of the standard Learning-With-Errors (LWE) problem, itself reducible from worst-case lattice problems that are conjectured immune to quantum cryptanalysis, or "post-quantum". Of independent interest, we further examine the practical hardness of the LWE assumption, using Kannan's exhaustive search algorithm coupled with pruning techniques. This leads to state-of-the-art parameters not only for our scheme, but also for a number of other LWE-based primitives published in the literature.
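To make the underlying assumption concrete, a textbook LWE instance generator (not the paper's scheme; the parameters below are illustrative and far below cryptographic size): recovering s from (A, b = As + e mod q) is exactly the problem that Kannan-style enumeration attacks, so its estimated cost is what calibrates scheme parameters.

```python
import numpy as np

def lwe_sample(n=16, m=32, q=97, sigma=1.0, seed=0):
    """Generate a toy LWE instance (A, b) with b = A s + e mod q.
    Real schemes use much larger n and q, chosen from attack-cost
    estimates such as those for Kannan's enumeration with pruning."""
    rng = np.random.default_rng(seed)
    A = rng.integers(0, q, size=(m, n))
    s = rng.integers(0, q, size=n)                          # secret
    e = np.rint(rng.normal(0, sigma, size=m)).astype(int)   # small error
    b = (A @ s + e) % q
    return A, b, s

A, b, s = lwe_sample()
# Without e, s would follow from linear algebra; the noise is what
# forces lattice enumeration or exhaustive search.
print(A.shape, b.shape)
```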