925 results for Search-based algorithms


Relevance: 90.00%

Publisher:

Abstract:

Over recent years, Unmanned Air Vehicles (UAVs) have become a powerful tool for reconnaissance and surveillance tasks. These vehicles are now available in a broad range of sizes and capabilities and are intended to fly in regions where the presence of onboard human pilots is either too risky or unnecessary. This paper describes the formulation and application of a design framework that supports the complex task of multidisciplinary design optimisation of UAV systems via evolutionary computation. The framework includes a Graphical User Interface (GUI), a robust Evolutionary Algorithm optimiser named HAPEA, several design modules, mesh generators and post-processing capabilities in an integrated platform. Population-based algorithms such as EAs are well suited to problems where the search space can be multi-modal, non-convex or discontinuous, with multiple local minima and with noise, and also to problems where we seek multiple solutions via Game Theory, namely a Nash equilibrium point or a Pareto set of non-dominated solutions. The application of the methodology is illustrated on conceptual and detailed multi-criteria and multidisciplinary shape design problems. Results indicate the practicality and robustness of the framework in finding optimal shapes and trade-offs between the disciplinary analyses, and in producing a set of non-dominated solutions along an optimal Pareto front for the designer.
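To make the Pareto notion concrete, the following minimal Python sketch extracts the non-dominated set from a population of candidate designs, as a multi-objective EA such as HAPEA would do each generation; the objective values and the minimisation convention are illustrative assumptions, not the paper's actual design variables.

```python
# Minimal sketch: extracting the non-dominated (Pareto) set from a
# population of candidate designs. Minimisation is assumed for every
# objective; the design data are made up for illustration.

def dominates(a, b):
    """True if objective vector a dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Return the members of 'population' not dominated by any other."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

# Example: candidate UAV shapes scored on (drag, structural weight).
designs = [(0.30, 120.0), (0.25, 140.0), (0.35, 110.0), (0.40, 150.0)]
print(pareto_front(designs))  # (0.40, 150.0) is dominated and drops out
```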

Relevance: 90.00%

Publisher:

Abstract:

Performance comparisons between file signatures and inverted files for text retrieval have previously shown several significant shortcomings of file signatures relative to inverted files. The inverted file approach underpins most state-of-the-art search engine algorithms, such as Language and Probabilistic models, and it has been widely accepted that traditional file signatures are inferior alternatives to inverted files. This paper describes TopSig, a new approach to the construction of file signatures. Many advances in semantic hashing and dimensionality reduction have been made in recent times, but they have not so far been linked to general-purpose, signature-file-based search engines. This paper introduces a different signature file approach that builds upon and extends these recent advances. We demonstrate significant improvements in the performance of signature-file-based indexing and retrieval, comparable to that of state-of-the-art inverted-file-based systems, including Language models and BM25. These findings suggest that file signatures offer a viable alternative to inverted files in suitable settings, and they position the file signatures model in the class of Vector Space retrieval models.
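As an illustration of the signature-file idea that TopSig builds on, here is a minimal Python sketch of classic superimposed coding: terms hash to a few bit positions, a document ORs its term signatures together, and a query matches when all its bits are present. TopSig's own construction (random indexing of term vectors with numeric quantisation, ranked by Hamming distance) is more sophisticated; everything below is an illustrative assumption.

```python
# Toy superimposed-coding signature file. WIDTH and k are illustrative;
# real signature files are far wider to keep false positives rare.

import hashlib

WIDTH = 64  # signature width in bits

def term_signature(term, k=3):
    """Derive k pseudo-random bit positions from the term's hash."""
    digest = hashlib.sha1(term.encode()).digest()
    sig = 0
    for i in range(k):
        sig |= 1 << (digest[i] % WIDTH)
    return sig

def doc_signature(text):
    """Superimpose (bitwise-OR) the signatures of a document's terms."""
    sig = 0
    for term in text.lower().split():
        sig |= term_signature(term)
    return sig

def matches(doc_sig, query_sig):
    """Classic test: every query bit must be present in the document."""
    return doc_sig & query_sig == query_sig

docs = {"d1": "inverted file retrieval", "d2": "signature file indexing"}
sigs = {d: doc_signature(t) for d, t in docs.items()}
q = doc_signature("signature indexing")
print([d for d in docs if matches(sigs[d], q)])  # -> ['d2'], bar collisions
```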

Relevance: 90.00%

Publisher:

Abstract:

In this paper, we propose a search-based approach to joining two tables in the absence of clean join attributes. Non-structured documents from the web are used to express the correlations between a given query and a reference list. To implement this approach, a major challenge is to efficiently determine the number of times, and the locations at which, each clean reference from the reference list is approximately mentioned in the retrieved documents. We formalize the Approximate Membership Localization (AML) problem and propose an efficient partial pruning algorithm to solve it. A study using real-world data sets demonstrates the effectiveness of our search-based approach and the efficiency of our AML algorithm.
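The AML task can be made concrete with the naive Python scan that the paper's partial pruning algorithm is designed to avoid: slide a token window over the document and flag windows sufficiently similar to a clean reference. The similarity measure and threshold below are assumptions for illustration, not the paper's.

```python
# Naive approximate-mention localization: for each reference, test every
# token window of matching length against a similarity threshold.

from difflib import SequenceMatcher

def similar(a, b, threshold=0.8):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def locate_mentions(reference, doc_tokens):
    """Return start indices of windows approximately matching 'reference'."""
    w = len(reference.split())
    hits = []
    for i in range(len(doc_tokens) - w + 1):
        window = " ".join(doc_tokens[i:i + w])
        if similar(reference, window):
            hits.append(i)
    return hits

doc = "the ACM Trans on Databases published the study".split()
print(locate_mentions("ACM Transactions on Databases", doc))  # -> [1]
```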

Relevance: 90.00%

Publisher:

Abstract:

In the era of Web 2.0, huge volumes of consumer reviews are posted to the Internet every day. Manual approaches to detecting and analyzing fake reviews (i.e., spam) are not practical due to the problem of information overload. However, the design and development of automated methods of detecting fake reviews is a challenging research problem. The main reason is that fake reviews are specifically composed to mislead readers, so they may appear the same as legitimate reviews (i.e., ham). As a result, discriminatory features that would enable individual reviews to be classified as spam or ham may not be available. Guided by the design science research methodology, the main contribution of this study is the design and instantiation of novel computational models for detecting fake reviews. In particular, a novel text mining model is developed and integrated into a semantic language model for the detection of untruthful reviews. The models are then evaluated based on a real-world dataset collected from amazon.com. The results of our experiments confirm that the proposed models outperform other well-known baseline models in detecting fake reviews. To the best of our knowledge, the work discussed in this article represents the first successful attempt to apply text mining methods and semantic language models to the detection of fake consumer reviews. A managerial implication of our research is that firms can apply our design artifacts to monitor online consumer reviews to develop effective marketing or product design strategies based on genuine consumer feedback posted to the Internet.
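One ingredient the abstract describes, scoring reviews against a language model, can be sketched as follows. This toy unigram model trained on known-genuine ("ham") text is only a stand-in for the authors' semantic language model; the data and threshold behaviour are illustrative assumptions.

```python
# Toy language-model scoring: reviews whose wording is atypical of the
# ham corpus receive a lower (more negative) average log-probability.

import math
from collections import Counter

def train_unigram(corpus):
    counts = Counter(w for doc in corpus for w in doc.lower().split())
    total, vocab = sum(counts.values()), len(counts)
    # Laplace-smoothed log-probability, so unseen words score low but finite.
    return lambda w: math.log((counts[w] + 1) / (total + vocab + 1))

ham = ["great battery life and solid build",
       "battery lasts two days with heavy use"]
logp = train_unigram(ham)

def avg_log_prob(review):
    words = review.lower().split()
    return sum(logp(w) for w in words) / len(words)

print(avg_log_prob("battery life is great"))        # closer to 0: familiar
print(avg_log_prob("unbelievable miracle product")) # more negative: atypical
```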

Relevance: 90.00%

Publisher:

Abstract:

Background: While the burden of chronic cough in children has been documented, etiologic factors across multiple settings and ages have not been described. In children with chronic cough, we aimed (1) to evaluate the burden and etiologies using a standard management pathway in various settings, and (2) to determine the influence of age and setting on disease burden and etiologies, and of etiology on disease burden. We hypothesized that the etiology, but not the burden, of chronic cough in children is dependent on the clinical setting and age. Methods: From five major hospitals and three rural-remote clinics, 346 children (mean age 4.5 years) newly referred with chronic cough (> 4 weeks) were prospectively managed in accordance with an evidence-based cough algorithm. We used a priori definitions, timeframes, and validated outcome measures (parent-proxy cough-specific quality of life [PC-QOL], a generic QOL [pediatric quality of life (PedsQL)], and a cough diary). Results: The burden of chronic cough (PC-QOL, cough duration) significantly differed between settings (P = .014 and P = .021, respectively), but was not influenced by age or etiology. PC-QOL and PedsQL did not correlate with age. The frequency of etiologies was significantly different in dissimilar settings (P = .0001); 17.6% of children had a serious underlying diagnosis (bronchiectasis, aspiration, cystic fibrosis). Except for protracted bacterial bronchitis, the frequency of other common diagnoses (asthma, bronchiectasis, resolved without a specific diagnosis) was similar across age categories. Conclusions: The high burden of cough is independent of children's age and etiology but dependent on clinical setting. Irrespective of setting and age, children with chronic cough should be carefully evaluated, and child-specific evidence-based algorithms should be used.

Relevance: 90.00%

Publisher:

Abstract:

The distribution of black leaf nodes at each level of a linear quadtree is of significant interest in the context of estimation of time and space complexities of linear quadtree based algorithms. The maximum number of black nodes of a given level that can be fitted in a square grid of size 2^n × 2^n can readily be estimated from the ratio of areas. We show that the actual value of the maximum number of nodes of a level is much less than the maximum obtained from the ratio of the areas. This is due to the fact that the number of nodes possible at a level k, 0 ≤ k ≤ n − 1, should consider the sum of areas occupied by the actual number of nodes present at levels k + 1, k + 2, …, n − 1.
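For reference, the area-ratio bound mentioned above can be stated explicitly: a level-k leaf covers a 2^k × 2^k block of the 2^n × 2^n grid, so the ratio of areas gives

```latex
N_{\max}(k) \le \frac{2^{n} \cdot 2^{n}}{2^{k} \cdot 2^{k}} = 4^{\,n-k},
\qquad 0 \le k \le n-1,
```

and the paper's point is that the true maximum is strictly smaller once the area consumed by nodes at levels k + 1 through n − 1 is accounted for.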

Relevance: 90.00%

Publisher:

Abstract:

Cyclostationary analysis has proven effective in identifying signal components for diagnostic purposes. A key descriptor in this framework is the cyclic power spectrum, traditionally estimated by the averaged cyclic periodogram and the smoothed cyclic periodogram. A lengthy debate about the best estimator finally found a solution in a cornerstone work by Antoni, who proposed a unified form for the two families, thus allowing a detailed statistical study of their properties. Since then, the focus of cyclostationary research has shifted towards algorithms, in terms of computational efficiency and simplicity of implementation. Traditional algorithms have proven computationally inefficient, and the sophisticated "cyclostationary" definition of these estimators slowed their spread in industry. The only attempt to increase the computational efficiency of cyclostationary estimators is represented by the cyclic modulation spectrum. This indicator exploits the relationship between cyclostationarity and envelope analysis. The link with envelope analysis allows a leap in computational efficiency and provides a "way in" for understanding by industrial engineers. However, the new estimator lies outside the unified form described above, and an unbiased version of the indicator has not been proposed. This paper therefore extends the analysis of envelope-based estimators of the cyclic spectrum, proposing a new approach to include them in the unified form of cyclostationary estimators. This enables the definition of a new envelope-based algorithm and a detailed analysis of the properties of the cyclic modulation spectrum. The computational efficiency of envelope-based algorithms will also be discussed quantitatively, for the first time, in comparison with the averaged cyclic periodogram. Finally, the algorithms are validated with numerical and experimental examples.
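The cyclic modulation spectrum the abstract refers to can be sketched compactly: compute an STFT, take the squared envelope of each frequency band, and Fourier-transform along time to expose cyclic frequencies. The sketch below is a minimal NumPy/SciPy illustration with assumed window settings; it omits the normalisation and bias corrections the paper is concerned with.

```python
# Sketch of a cyclic modulation spectrum: band-wise squared envelopes
# from an STFT, then an FFT over time to reveal cyclic frequencies.

import numpy as np
from scipy.signal import stft

def cyclic_modulation_spectrum(x, fs, nperseg=256, noverlap=192):
    f, t, Z = stft(x, fs=fs, nperseg=nperseg, noverlap=noverlap)
    env = np.abs(Z) ** 2                         # squared envelope per band
    cms = np.fft.rfft(env, axis=1)               # FFT over time frames
    alpha = np.fft.rfftfreq(env.shape[1], d=t[1] - t[0])  # cyclic freq axis
    return f, alpha, np.abs(cms)

# Toy use: amplitude-modulated noise, like a bearing-fault signature.
fs, n = 20_000, 40_000
rng = np.random.default_rng(0)
x = (1 + 0.5 * np.cos(2 * np.pi * 30 * np.arange(n) / fs)) * rng.standard_normal(n)
f, alpha, cms = cyclic_modulation_spectrum(x, fs)
print(alpha[np.argmax(cms[:, 1:].sum(axis=0)) + 1])  # peaks near 30 Hz
```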

Relevance: 90.00%

Publisher:

Abstract:

In this paper we propose a general Linear Programming (LP) based formulation and solution methodology for obtaining optimal solutions to the load distribution problem in divisible load scheduling. We exploit the power of the versatile LP formulation to propose algorithms that yield exact solutions to several very general load distribution problems for which either no solutions or only heuristic solutions were available. We consider both star (single-level tree) networks and linear daisy chain networks, having processors equipped with front-ends, which form the generic models for several important network topologies. We consider arbitrary processing node availability or release times, and general models for communication delays and computation time that account for constant overheads such as start-up times in communication and computation. The optimality of the LP-based algorithms is proved rigorously.
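As a hedged illustration of the LP formulation, the sketch below solves a heavily simplified instance with scipy.optimize.linprog: a star network with front-ends, simultaneous communication, no start-up overheads and unit total load, so each worker's finish time is its load fraction times its combined per-unit delay. The per-unit costs are made-up values; the paper's general model is much richer.

```python
# LP for divisible load distribution on a simplified star network:
# minimise the makespan T subject to per-worker finish times <= T and
# the whole load being distributed.

import numpy as np
from scipy.optimize import linprog

z = np.array([0.2, 0.3, 0.1])   # per-unit communication delay (assumed)
w = np.array([1.0, 0.8, 1.5])   # per-unit computation time (assumed)
m = len(z)

# Variables: alpha_1..alpha_m (load fractions), T (makespan). Minimise T.
c = np.zeros(m + 1); c[-1] = 1.0

# Finish-time constraints: alpha_i * (z_i + w_i) <= T for each worker.
A_ub = np.hstack([np.diag(z + w), -np.ones((m, 1))])
b_ub = np.zeros(m)

# Whole (normalised) load must be distributed: sum_i alpha_i = 1.
A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])
b_eq = [1.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
print(res.x[:-1], res.x[-1])    # optimal fractions and makespan
```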

Relevance: 90.00%

Publisher:

Abstract:

Today's feature-rich multimedia products require embedded system solutions built around complex Systems-on-Chip (SoCs) to meet market expectations of high performance at low cost and low energy consumption. The memory architecture of an embedded system strongly influences critical system design objectives such as area, power and performance. Hence the embedded system designer performs a complete memory architecture exploration to custom design a memory architecture for a given set of applications. Further, the designer is interested in multiple optimal design points to address various market segments. However, tight time-to-market constraints enforce short design cycle times. In this paper we address the multi-level, multi-objective memory architecture exploration problem through a combination of exhaustive-search based memory exploration at the outer level and a two-step integrated data layout for SPRAM-Cache based hybrid architectures at the inner level. In the two-step approach, the first step partitions data between the SPRAM and the Cache, and the second step performs a cache-conscious layout of the cached data. We formulate the cache-conscious data layout as a graph partitioning problem and show that our approach gives up to 34% improvement over an existing approach while also optimizing the off-chip memory address space. We evaluated our approach on three embedded multimedia applications; it explores several hundred memory configurations for each application, yielding several optimal design points in a few hours of computation on a standard desktop.
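The first step, data partitioning, can be illustrated with a simple greedy stand-in: place the "hottest" bytes in SPRAM first, ranked by accesses per byte. This is only a sketch under assumed object sizes and access counts, not the paper's exploration or layout algorithm.

```python
# Greedy SPRAM/Cache data partitioning by access density (illustrative).

def partition(objects, spram_bytes):
    """objects: list of (name, size_bytes, access_count) tuples."""
    # Rank objects by accesses per byte and fill SPRAM greedily.
    ranked = sorted(objects, key=lambda o: o[2] / o[1], reverse=True)
    spram, cached, free = [], [], spram_bytes
    for name, size, accesses in ranked:
        if size <= free:
            spram.append(name); free -= size
        else:
            cached.append(name)  # laid out cache-consciously in step two
    return spram, cached

objs = [("coeff", 2048, 90_000), ("frame", 16384, 120_000), ("lut", 1024, 70_000)]
print(partition(objs, spram_bytes=4096))  # -> (['lut', 'coeff'], ['frame'])
```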

Relevance: 90.00%

Publisher:

Abstract:

The aim in this paper is to allocate the 'sleep time' of the individual sensors in an intrusion detection application so that the energy consumption of the sensors is reduced while the tracking error is kept to a minimum. We propose two novel reinforcement learning (RL) based algorithms that attempt to minimize a certain long-run average cost objective. Both our algorithms incorporate feature-based representations to handle the curse of dimensionality associated with the underlying partially-observable Markov decision process (POMDP). Further, the feature selection scheme used in our algorithms intelligently manages the energy cost and tracking cost factors, which in turn assists the search for the optimal sleeping policy. We also extend these algorithms to a setting where the intruder's mobility model is not known, by incorporating a stochastic iterative scheme for estimating the mobility model. The simulation results on a synthetic 2-D network setting are encouraging.
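The kind of feature-based, long-run average cost update underlying such algorithms can be sketched as a differential TD(0) rule with linear function approximation; the toy environment, features and step sizes below are illustrative assumptions, not the paper's algorithms.

```python
# Differential TD(0) with linear features for an average-cost problem.

import numpy as np

def td_average_cost(phi, step, n_features, steps=10_000,
                    alpha=0.01, beta=0.001, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.zeros(n_features)  # differential value-function weights
    rho = 0.0                     # running estimate of the average cost
    s = 0
    for _ in range(steps):
        a = int(rng.integers(2))            # random policy: sense or sleep
        s_next, cost = step(s, a)
        delta = cost - rho + theta @ phi(s_next) - theta @ phi(s)
        theta += alpha * delta * phi(s)     # critic update
        rho += beta * delta                 # average-cost tracking
        s = s_next
    return theta, rho

# Toy environment: sensing (a=0) costs energy; sleeping while the
# intruder is present (s=1, a=1) incurs a tracking-error cost.
env_rng = np.random.default_rng(1)
def step(s, a):
    cost = 0.2 * (1 - a) + 1.0 * (s * a)
    return int(env_rng.integers(2)), cost

phi = lambda s: np.array([1.0, float(s)])   # bias + intruder-presence feature
theta, rho = td_average_cost(phi, step, n_features=2)
print(round(rho, 2))  # approaches the true average cost (0.35 here)
```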

Relevance: 90.00%

Publisher:

Abstract:

We consider the problem of optimizing the workforce of a service system. Adapting the staffing levels in such systems is non-trivial due to large variations in workload, and the large number of system parameters does not allow for a brute-force search. Further, because these parameters change on a weekly basis, the optimization should not take longer than a few hours. Our aim is to find the optimal staffing levels from a discrete high-dimensional parameter set that minimize the long-run average of a single-stage cost function, while adhering to constraints relating to queue stability and service-level agreement (SLA) compliance. The single-stage cost function balances the conflicting objectives of utilizing workers better and attaining the target SLAs. We formulate this problem as a constrained Markov cost process parameterized by the (discrete) staffing levels. We propose novel simultaneous perturbation stochastic approximation (SPSA)-based algorithms for solving the above problem. The algorithms include both first-order and second-order methods and incorporate SPSA-based gradient/Hessian estimates for primal descent, while performing dual ascent for the Lagrange multipliers. Both algorithms are online and update the staffing levels in an incremental fashion. Further, they involve a certain generalized smooth projection operator, which is essential to project the continuous-valued worker parameter tuned by our algorithms onto the discrete set. The smoothness is necessary to ensure that the underlying transition dynamics of the constrained Markov cost process are themselves smooth (as a function of the continuous-valued parameter): a critical requirement for proving the convergence of both algorithms. We validate our algorithms via performance simulations based on data from five real-life service systems. For the sake of comparison, we also implement a scatter search based algorithm using the state-of-the-art optimization tool-kit OptQuest. From the experiments, we observe that both our algorithms converge empirically and consistently outperform OptQuest in most of the settings considered. This finding, coupled with the computational advantage of our algorithms, makes them amenable for adaptive labor staffing in real-life service systems.
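At the core of the first-order method is the SPSA gradient estimate: two cost evaluations under a random simultaneous perturbation approximate the full gradient regardless of dimension. The sketch below shows the unconstrained essence on a made-up quadratic staffing objective; the paper's algorithms add Lagrangian constraint handling, smooth projection onto the discrete set and online operation.

```python
# Unconstrained SPSA minimisation with standard Spall gain sequences.

import numpy as np

def spsa_minimise(cost, theta0, iters=200, a=0.1, c=0.1, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(1, iters + 1):
        ck = c / k ** 0.101                       # perturbation size
        ak = a / k ** 0.602                       # step size
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Rademacher
        # Two cost evaluations estimate all gradient components at once.
        g = (cost(theta + ck * delta) - cost(theta - ck * delta)) / (2 * ck) / delta
        theta -= ak * g
    return theta

# Toy staffing-like objective: quadratic penalty around target levels.
target = np.array([12.0, 7.0, 20.0])
print(np.round(spsa_minimise(lambda th: np.sum((th - target) ** 2),
                             theta0=np.zeros(3)), 1))  # moves toward target
```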

Relevance: 90.00%

Publisher:

Abstract:

This paper is in two parts and addresses two ways of getting more information out of the RF signal from three-dimensional (3D) mechanically-swept medical ultrasound. The first topic is the use of non-blind deconvolution to improve the clarity of the data, particularly in the direction normal to the individual B-scans. The second topic is strain imaging: we present a robust and efficient approach to the estimation and display of axial strain information. For deconvolution, we calculate an estimate of the point-spread function at each depth in the image using Field II. This is used as part of an Expectation Maximisation (EM) framework in which the ultrasound scatterer field is modelled as the product of (a) a piecewise smooth function and (b) a fine-grain varying function. In the E step, a Wiener filter is used to estimate the scatterer field based on an assumed piecewise smooth component. In the M step, wavelet de-noising is used to estimate the piecewise smooth component from the scatterer field. For strain imaging, we use a quasi-static approach with efficient algorithms. Our contributions lie in robust and efficient 3D displacement tracking, point-wise quality-weighted processing, and a stable display that shows not only the strain but also an indication of the quality of the data at each point in the image. This enables clinicians to see where the strain estimate is reliable and where it is mostly noise. For deconvolution, we present in-vivo images and simulations with quantitative performance measures. With the blurred 3D data taken as 0 dB, we get an improvement in signal-to-noise ratio of 4.6 dB with a Wiener filter alone, 4.36 dB with the ForWaRD algorithm and 5.18 dB with our EM algorithm. For strain imaging, we show images based on 2D and 3D data and describe how a full 3D analysis can be performed in about 20 seconds on typical hardware. We also present initial results of our clinical study to explore the applications of our system in our local hospital. © 2008 IEEE.
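The Wiener-filter E step can be illustrated in one dimension: given a known point-spread function with transfer function H, the filter conj(H)/(|H|² + NSR) regularises the inverse. The PSF, noise level and signals below are toy assumptions, not the Field II estimates used in the paper.

```python
# 1-D Wiener deconvolution of a blurred, noisy "RF line" (toy example).

import numpy as np

def wiener_deconvolve(y, psf, noise_power=1e-2):
    n = len(y)
    H = np.fft.rfft(psf, n)
    Y = np.fft.rfft(y)
    # Wiener filter: conj(H) / (|H|^2 + noise-to-signal ratio).
    X = Y * np.conj(H) / (np.abs(H) ** 2 + noise_power)
    return np.fft.irfft(X, n)

rng = np.random.default_rng(0)
n = 512
x = rng.standard_normal(n)                   # toy scatterer field
psf = np.exp(-np.linspace(-2, 2, 15) ** 2)   # toy Gaussian PSF
y = np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(psf, n), n)  # circular blur
y += 0.05 * rng.standard_normal(n)           # additive noise
x_hat = wiener_deconvolve(y, psf)
print(np.corrcoef(x, x_hat)[0, 1])           # substantially recovered field
```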

Relevance: 90.00%

Publisher:

Abstract:

Along with the growing demand for cryptosystems in systems ranging from large servers to mobile devices, cryptographic protocols suitable for use under particular constraints are becoming more and more important. Constraints such as calculation time, area, efficiency and security must be considered by the designer. Elliptic curves, since their introduction to public key cryptography in 1985, have challenged established public key and signature generation schemes such as RSA, offering more security per bit. Among elliptic curve based systems, pairing-based cryptosystems are thoroughly researched and can be used in many public key protocols, such as identity-based schemes. For hardware implementations of pairing-based protocols, all components which calculate operations over elliptic curves can be considered. Designers of pairing algorithms must choose calculation blocks and arrange the basic operations carefully so that the implementation can meet the constraints on time and hardware resource area. This thesis deals with different hardware architectures to accelerate pairing-based cryptosystems over fields of characteristic two. Using different top-level architectures, the hardware efficiency of operations that run at different times is first considered in this thesis. Security is another important aspect of pairing-based cryptography, in practice particularly with respect to Side Channel Analysis (SCA) attacks. Naively implemented hardware accelerators for pairing-based cryptosystems can be vulnerable when physical analysis attacks are taken into consideration. This thesis examines the weaknesses of pairing-based public key cryptography and identifies the particular calculations in these systems that are insecure. In such cases, countermeasures should be applied to protect the weak links of the implementation and so harden the pairing-based algorithms. Some important rules that designers must obey to improve the security of their cryptosystems are proposed. Following these rules, three countermeasures that protect pairing-based cryptosystems against SCA attacks are applied. The implementations of the countermeasures are presented and their performance is investigated.
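Underlying all such characteristic-two pairing hardware is carry-free polynomial arithmetic. As a small illustration (in software, and in the tiny field GF(2^8) rather than a pairing-friendly one), multiplication with modular reduction looks as follows.

```python
# Multiplication in GF(2^m) in polynomial basis: shift-and-XOR with
# reduction by an irreducible polynomial. GF(2^8) with the AES modulus
# 0x11B is used here purely for brevity.

def gf2m_mul(a, b, mod_poly=0x11B, m=8):
    """Multiply field elements a and b in GF(2^m) modulo mod_poly."""
    result = 0
    while b:
        if b & 1:
            result ^= a          # addition in characteristic two is XOR
        b >>= 1
        a <<= 1
        if a >> m:               # degree reached m: reduce
            a ^= mod_poly
    return result

print(hex(gf2m_mul(0x57, 0x83)))  # 0xc1, the classic FIPS-197 test vector
```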

Relevance: 90.00%

Publisher:

Abstract:

The characterization of thermocouple sensors for temperature measurement in variable flow environments is a challenging problem. In this paper, novel difference equation-based algorithms are presented that allow in situ characterization of temperature measurement probes consisting of two thermocouple sensors with differing time constants. Linear and non-linear least squares formulations of the characterization problem are introduced and compared in terms of their computational complexity, robustness to noise and statistical properties. With the aid of this analysis, least squares optimization procedures that yield unbiased estimates are identified. The main contribution of the paper is the development of a linear two-parameter generalized total least squares formulation of the sensor characterization problem. Monte Carlo simulation results are used to support the analysis.
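The difference-equation idea can be illustrated with a plain (noise-biased) ordinary least squares variant: modelling each sensor as a first-order lag and eliminating the unknown gas temperature yields a regression that is linear in the parameters. The sketch below uses assumed time constants and noise-free data; the paper's contribution is precisely the generalised total least squares formulation that stays unbiased under noise.

```python
# Two-thermocouple characterisation via a linear difference-equation
# regression. Each sensor: y_i[k+1] = a_i*y_i[k] + (1-a_i)*u[k]; with u
# eliminated, y1[k+1] = th1*y1[k] + th2*y2[k+1] + th3*y2[k], where
# th1 = a1, th2 = (1-a1)/(1-a2), th3 = -a2*th2.

import numpy as np

Ts, tau1, tau2 = 0.01, 0.05, 0.15            # sample time, time constants
a1, a2 = np.exp(-Ts / tau1), np.exp(-Ts / tau2)

# Simulate both sensors tracking the same varying gas temperature u.
rng = np.random.default_rng(0)
u = np.cumsum(rng.standard_normal(2000)) * 0.1
y1, y2 = np.zeros_like(u), np.zeros_like(u)
for k in range(len(u) - 1):
    y1[k + 1] = a1 * y1[k] + (1 - a1) * u[k]
    y2[k + 1] = a2 * y2[k] + (1 - a2) * u[k]

# Ordinary least squares on the eliminated-input regression.
A = np.column_stack([y1[:-1], y2[1:], y2[:-1]])
th, *_ = np.linalg.lstsq(A, y1[1:], rcond=None)
a1_hat, a2_hat = th[0], -th[2] / th[1]
print(-Ts / np.log(a1_hat), -Ts / np.log(a2_hat))  # ~0.05 and ~0.15
```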

Relevance: 90.00%

Publisher:

Abstract:

In recent years, gradient vector flow (GVF) based algorithms have been successfully used to segment a variety of 2-D and 3-D imagery. However, due to the compromise between internal and external energy forces within the resulting partial differential equations, these methods may lead to biased segmentation results. In this paper, we propose MSGVF, a mean shift based GVF segmentation algorithm that can successfully locate the correct borders. MSGVF is developed so that when the contour reaches equilibrium, the various forces resulting from the different energy terms are balanced. In addition, a smoothness constraint on image pixels is preserved so that over- and under-segmentation are reduced. Experimental results on publicly accessible datasets of dermoscopic and optic disc images demonstrate that the proposed method effectively detects the borders of the objects of interest.
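For readers unfamiliar with mean shift, the core iteration that MSGVF couples with the GVF force moves a point to the kernel-weighted mean of its neighbourhood until it reaches a density mode; the bandwidth and data below are illustrative assumptions, and the paper applies the idea to contour evolution rather than isolated points.

```python
# Gaussian-kernel mean shift for a single point (illustrative).

import numpy as np

def mean_shift_point(x, data, bandwidth=1.0, iters=50, tol=1e-6):
    """Shift point x toward the nearest density mode of 'data'."""
    for _ in range(iters):
        w = np.exp(-np.sum((data - x) ** 2, axis=1) / (2 * bandwidth ** 2))
        x_new = (w[:, None] * data).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(4, 0.3, (50, 2))])
print(mean_shift_point(np.array([3.0, 3.0]), data))  # converges near (4, 4)
```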