605 results for heuristics


Relevance: 10.00%

Abstract:

In this paper, we propose and study a unified mixed-integer programming (MIP) model that simultaneously optimizes fluence weights and multi-leaf collimator (MLC) apertures in the treatment planning optimization of VMAT, Tomotherapy, and CyberKnife. The contribution of our model is threefold: (i) it optimizes the fluence and the MLC apertures simultaneously for a given set of control points; (ii) it can incorporate all volume limits or dose upper bounds for organs at risk (OAR) and dose lower bounds for planning target volumes (PTV) as hard constraints, but it can also relax either of these constraint sets in a Lagrangian fashion while keeping the other set as hard constraints; (iii) for faster solutions, we propose several heuristic methods based on the MIP model, as well as a meta-heuristic approach. The meta-heuristic is very efficient in practice, generating dose- and machinery-feasible solutions for problem instances of clinical scale, e.g., feasible treatment plans for cases with 180 control points, 6,750 sample voxels, and 18,000 beamlets in 470 seconds, or cases with 72 control points, 8,000 sample voxels, and 28,800 beamlets in 352 seconds. With discretization and down-sampling of voxels, our method can tackle a treatment field of 8,000-64,000 cm³, depending on the ratio of critical structures to unspecified tissue.
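A minimal sketch of the Lagrangian-relaxation idea described above is given below: a toy fluence-weight problem in which PTV dose lower bounds stay hard while OAR dose upper bounds are relaxed into the objective through penalized slack variables. The dose-influence matrices, bounds, and multiplier lam are invented toy values, the binary MLC-aperture variables of the full MIP are omitted, and PuLP is used only as a convenient modelling interface; this is not the paper's actual model.

```python
# Toy Lagrangian relaxation of OAR dose bounds in a fluence-weight problem.
# All numbers are illustrative; the binary MLC-aperture variables are omitted.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum

D_ptv = [[0.9, 0.6, 0.8, 0.5],   # dose-influence rows for 3 PTV voxels x 4 beamlets
         [0.7, 0.9, 0.4, 0.8],
         [0.6, 0.5, 0.9, 0.7]]
D_oar = [[0.2, 0.1, 0.3, 0.1],   # dose-influence rows for 2 OAR voxels
         [0.1, 0.3, 0.2, 0.2]]
ptv_lower, oar_upper, lam = 60.0, 20.0, 10.0   # prescription, tolerance, Lagrange multiplier

prob = LpProblem("toy_fluence_lagrangian", LpMinimize)
w = [LpVariable(f"w{j}", lowBound=0) for j in range(4)]   # beamlet fluence weights
s = [LpVariable(f"s{i}", lowBound=0) for i in range(2)]   # OAR overdose slack (relaxed set)

# Objective: keep total fluence low while penalizing OAR overdose via the multiplier.
prob += lpSum(w) + lam * lpSum(s)

# Hard constraints: every PTV voxel must reach the prescribed dose lower bound.
for row in D_ptv:
    prob += lpSum(c * wj for c, wj in zip(row, w)) >= ptv_lower

# Relaxed constraints: OAR dose may exceed its bound only through the penalized slack.
for row, si in zip(D_oar, s):
    prob += lpSum(c * wj for c, wj in zip(row, w)) <= oar_upper + si

prob.solve()
print("fluence weights:", [round(wj.value(), 2) for wj in w])
```

Adjusting lam iteratively (for example by subgradient steps on the observed OAR violations) recovers the usual Lagrangian treatment of whichever constraint set is relaxed.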

Relevance: 10.00%

Abstract:

Previous research has shown that front-of-pack labels (FoPLs) can assist people to make healthier food choices if they are easy to understand and people are motivated to use them. There is some evidence that FoPLs providing an assessment of a food's health value (evaluative FoPLs) are easier to use than those providing only numerical information on nutrients (reductive FoPLs). Recently, a new evaluative FoPL (the Health Star Rating (HSR)) has been introduced in Australia and New Zealand. The HSR features a summary indicator, differentiating it from many other FoPLs used around the world. The aim of this study was to understand how consumers of all ages use and make sense of reductive FoPLs and evaluative FoPLs, including evaluative FoPLs with and without summary indicators. Ten focus groups were conducted in Perth, Western Australia, with adults (n = 50) and children aged 10–17 years (n = 35) to explore reactions to one reductive FoPL (the Daily Intake Guide), an existing evaluative FoPL (multiple traffic lights), and a new evaluative FoPL (the HSR). Participants preferred the evaluative FoPLs over the reductive FoPL, with the strongest preference being for the FoPL with the summary indicator (HSR). Discussions revealed the cognitive strategies used when interpreting each FoPL (e.g., using cut-offs, heuristics, and a process of elimination), which differed according to FoPL format. Most participants reported being motivated to use the evaluative FoPLs (particularly the HSR) to make choices about foods consumed as part of regular daily meals, but not for discretionary foods consumed as snacks or desserts. The findings provide further evidence of the potential utility of evaluative FoPLs in supporting healthy food choices and can assist policy makers in selecting between alternative FoPL formats.

Relevance: 10.00%

Abstract:

Traditionally, diabetes education has relied on written materials, with limited resources available for children with diabetes. Mobile games can be effective and motivating tools for the promotion of children's health. In our earlier work, we proposed a novel approach for designing computer games aimed at educating children with diabetes. In this article, we apply our game design to a mobile Android game (Mario Brothers). We also introduce four heuristics, specifically designed for evaluating the mobile game, by adapting traditional usability heuristics. Results of a pilot study (n = 12) evaluating gameplay over one week showed that the children found the game engaging and that it improved their knowledge of healthy diet and lifestyle.

Relevance: 10.00%

Abstract:

Conspiracy Theory (CT) endorsers believe in an omnipresent, malevolent, and highly coordinated group that wields secret influence for personal gain, and credit this group with responsibility for many noteworthy events. Two explanations for the emergence of CTs are that they result from social marginalisation and a lack of agency, or that they stem from a need to explain the unexplained. Furthermore, representativeness heuristics may produce reasoning biases that make such beliefs more likely. Two related studies (N = 107; N = 120) examined the relationships between social marginalisation, intolerance of uncertainty, heuristics, and CT belief using a correlational design. Overall, intolerance of uncertainty did not link strongly to CT belief, but worldview variables did, particularly a sense of the world as (socially) threatening, non-random, and without fixed morality. The use of both representativeness heuristics examined was heightened in participants more likely to endorse CTs. These factors seem to contribute to the likelihood that an individual will endorse CTs generally, relating similarly to common CTs, to CTs historically accepted as "true", and to the endorsement of fictional CTs that the individual would find novel. Implications are discussed.

Relevance: 10.00%

Abstract:

Evolutionary algorithms (EAs) have recently been suggested as candidates for solving big data optimisation problems that involve a very large number of variables and need to be analysed in a short period of time. However, EAs face scalability issues when dealing with big data problems. Moreover, the performance of EAs critically hinges on the parameter values and operator types used, so it is impossible to design a single EA that outperforms all others on every problem instance. To address these challenges, we propose a heterogeneous framework that integrates a cooperative co-evolution method with various types of memetic algorithms. We use the cooperative co-evolution method to split the big problem into sub-problems in order to increase the efficiency of the solving process. The sub-problems are then solved using various heterogeneous memetic algorithms. The proposed heterogeneous framework adaptively assigns, for each solution, different operators, parameter values, and a local search algorithm to efficiently explore and exploit the search space of the given problem instance. The performance of the proposed algorithm is assessed using the Big Data 2015 competition benchmark problems, which contain data with and without noise. Experimental results demonstrate that the proposed algorithm with the cooperative co-evolution method performs better than the same framework without it. Furthermore, it obtained very competitive results, if not better, for all tested instances when compared with other algorithms, at lower computational times.
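The division of labour between the cooperative co-evolution layer and the sub-problem solvers can be sketched generically as below. The sphere objective, the fixed variable grouping, and the random-perturbation local search are stand-ins for the benchmark problems and the heterogeneous memetic algorithms with adaptive operator and parameter selection described above.

```python
# Generic cooperative co-evolution sketch: split the variables into groups and improve one
# group at a time inside the best-known full solution (the "context vector").
import random

DIM, GROUPS, ITERS = 20, 4, 200
random.seed(1)

def sphere(x):                      # toy objective to minimize (stands in for the benchmark)
    return sum(v * v for v in x)

context = [random.uniform(-5, 5) for _ in range(DIM)]          # best-known full solution
groups = [list(range(g, DIM, GROUPS)) for g in range(GROUPS)]  # simple static decomposition

for _ in range(ITERS):
    for idx in groups:                           # solve one sub-problem at a time
        candidate = context[:]
        for i in idx:                            # perturb only this group's variables
            candidate[i] += random.gauss(0, 0.5)
        if sphere(candidate) < sphere(context):  # greedy acceptance of improving moves
            context = candidate

print("best objective:", round(sphere(context), 4))
```

In the framework described above, the inner loop would instead dispatch each sub-problem to one of several memetic algorithms chosen adaptively.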

Relevance: 10.00%

Abstract:

Combinatorial optimization is a complex engineering subject. Although the formulation often depends on the nature of the problem, which differs in setup, design, constraints, and implications, establishing a unifying framework is essential. This dissertation investigates the unique features of three important optimization problems that span from small-scale design automation to large-scale power system planning: (1) feeder remote terminal unit (FRTU) planning strategy considering the cybersecurity of the secondary distribution network in the electrical distribution grid, (2) physical-level synthesis for microfluidic lab-on-a-chip, and (3) discrete gate sizing in very-large-scale integration (VLSI) circuits. First, an optimization technique based on cross entropy is proposed to handle FRTU deployment in the primary network while considering the cybersecurity of the secondary distribution network. Constrained by a monetary budget on the number of deployed FRTUs, the proposed algorithm identifies pivotal locations of a distribution feeder at which to install the FRTUs in different time horizons. Then, multi-scale optimization techniques are proposed for digital microfluidic lab-on-a-chip physical-level synthesis. The proposed techniques handle the variation-aware lab-on-a-chip placement and routing co-design while satisfying all constraints and considering contamination and defects. Last, the first fully polynomial time approximation scheme (FPTAS) is proposed for the delay-driven discrete gate sizing problem, exploring the theoretical view since the existing works are heuristics with no performance guarantee. The intellectual contribution of the proposed methods establishes a novel paradigm bridging the gaps between professional communities.
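As a rough illustration of the cross-entropy technique mentioned for FRTU deployment, the sketch below samples budget-limited placements from per-site probabilities, keeps an elite subset, and re-fits the probabilities to that elite. The site count, benefit scores, and budget are made-up stand-ins for the dissertation's feeder model and cybersecurity scoring.

```python
# Generic cross-entropy method for picking BUDGET sites out of N_SITES (toy objective).
import random

random.seed(2)
N_SITES, BUDGET, SAMPLES, ELITE, ITERS = 12, 4, 60, 10, 30
value = [random.uniform(0, 1) for _ in range(N_SITES)]      # made-up per-site benefit scores

def score(sites):                                            # toy objective: total benefit
    return sum(value[i] for i in sites)

def sample(p):
    chosen = [i for i in range(N_SITES) if random.random() < p[i]]
    while len(chosen) > BUDGET:                              # enforce the deployment budget
        chosen.remove(random.choice(chosen))
    while len(chosen) < BUDGET:
        cand = random.randrange(N_SITES)
        if cand not in chosen:
            chosen.append(cand)
    return chosen

p = [0.5] * N_SITES                                          # Bernoulli sampling probabilities
for _ in range(ITERS):
    population = sorted((sample(p) for _ in range(SAMPLES)), key=score, reverse=True)
    elite = population[:ELITE]
    freqs = [sum(i in s for s in elite) / ELITE for i in range(N_SITES)]
    p = [0.7 * f + 0.3 * q for f, q in zip(freqs, p)]        # smoothed re-fit to the elite

print("selected sites:", sorted(population[0]), "score:", round(score(population[0]), 3))
```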

Relevance: 10.00%

Abstract:

Traditional decision making research has often focused on one's ability to choose from a set of prefixed options, ignoring the process by which decision makers generate courses of action (i.e., options) in situ (Klein, 1993). In complex and dynamic domains, this option generation process is particularly critical to understanding how successful decisions are made (Zsambok & Klein, 1997). When generating response options for oneself to pursue (i.e., during the intervention phase of decision making), previous research has supported quick and intuitive heuristics, such as the Take-The-First heuristic (TTF; Johnson & Raab, 2003). When generating predictive options for others in the environment (i.e., during the assessment phase of decision making), previous research has supported the situational-model-building process described by Long Term Working Memory theory (LTWM; see Ward, Ericsson, & Williams, 2013). In the first three experiments, the claims of TTF and LTWM are tested during assessment- and intervention-phase tasks in soccer. To test what other environmental constraints may dictate the use of these cognitive mechanisms, the claims of these models are also tested in the presence and absence of time pressure. In addition to understanding the option generation process, it is important that researchers in complex and dynamic domains also develop tools that can be used by 'real-world' professionals. For this reason, three more experiments were conducted to evaluate the effectiveness of a new online assessment of perceptual-cognitive skill in soccer. This test differentiated between skill groups, predicted performance on a previously established test, and predicted option generation behavior. The test also outperformed domain-general cognitive tests, but not a domain-specific knowledge test, when predicting skill group membership. Implications for theory and training, and future directions for the development of applied tools, are discussed.

Relevance: 10.00%

Abstract:

In this study, we examine interactions between potential hierarchical value chains in the production structure and industry-level productivity growth. We applied generalized Chenery-Watanabe heuristics for matrix linearity maximization to triangulate the input-output incidence matrices of both Japan and the Republic of Korea, identifying the potential directed flow of value spanning the industrial sectors of the basic (disaggregated) industry classifications for both countries. Sector-specific productivity growth was measured with the Törnqvist index, using the 2000-2005 linked input-output tables for Japan and Korea.
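Triangulating an input-output matrix amounts to finding a sector ordering that puts as much inter-industry flow as possible on one side of the diagonal. The toy sketch below does this with a pairwise-swap hill climb on a made-up 4x4 flow matrix; the study's generalized Chenery-Watanabe heuristic and the real Japanese and Korean tables are not reproduced.

```python
# Toy triangulation: reorder sectors to maximize "downstream" flow (the linearity measure).
from itertools import combinations

flows = [[0, 8, 1, 0],   # made-up inter-industry flows, flows[i][j] = sales from sector i to j
         [2, 0, 9, 4],
         [0, 1, 0, 7],
         [5, 0, 2, 0]]

def linearity(order):
    # Sum of flows that run from earlier to later sectors under the given ordering.
    pos = {s: k for k, s in enumerate(order)}
    return sum(flows[i][j] for i in order for j in order if pos[i] < pos[j])

order = list(range(4))
improved = True
while improved:                                    # keep swapping while linearity improves
    improved = False
    for a, b in combinations(range(4), 2):
        trial = order[:]
        trial[a], trial[b] = trial[b], trial[a]
        if linearity(trial) > linearity(order):
            order, improved = trial, True

print("sector order:", order, "linearity:", linearity(order))
```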

Relevance: 10.00%

Abstract:

Nearly a third of UK gas and electricity is used in homes, of which 80% is for space heating and hot water provision. Rising consumer bills, concerns about climate change, and the surge in personal digital technology use have prompted the development of intelligent domestic heating controls. Whilst suitable control of the home heating system is essential for reducing domestic energy use, these heating controls rely on appropriate user interaction to achieve a saving, and it is unclear whether these 'smart' heating controls enhance the use of domestic heating or reduce energy demand. This paper describes qualitative research undertaken with a small sample of UK householders to understand how people use new heating controls installed in their homes and what the requirements are for improved smart heating control design. The paper identifies, against Nielsen's usability heuristics, the divergence between the householder's use, understanding, and expectations of the heating system and the actual design of the system. Digital and smart heating control systems should be designed to maximise usability so that they can be used effectively for efficient heating control by all users. The research highlights the need for the development of new systems that re-address user needs and redefine the system requirements.

Relevance: 10.00%

Abstract:

Random Walk with Restart (RWR) is an appealing measure of proximity between nodes based on graph structure. Since real graphs are often large and subject to minor changes, it is prohibitively expensive to recompute proximities from scratch. Previous methods use LU decomposition and degree reordering heuristics, entailing O(|V|^3) time and O(|V|^2) memory to compute all |V|^2 pairs of node proximities in a static graph. In this paper, a dynamic scheme to assess RWR proximities is proposed: (1) For a unit update, we characterize the changes to all-pairs proximities as the outer product of two vectors. We notice that the multiplication of an RWR matrix and its transition matrix, unlike traditional matrix multiplications, is commutative. This can greatly reduce the computation of all-pairs proximities from O(|V|^3) to O(|delta|) time for each update without loss of accuracy, where |delta| (<< |V|^2) is the number of affected proximities. (2) To avoid O(|V|^2) memory for all pairs of outputs, we also devise efficient partitioning techniques for our dynamic model, which can compute all pairs of proximities segment-wise within O(l|V|) memory and O(|V|/l) I/O costs, where 1 <= l <= |V| is a user-controlled trade-off between memory and I/O costs. (3) For bulk updates, we also devise aggregation and hashing methods, which can further discard many unnecessary updates and handle chunks of unit updates simultaneously. Our experimental results on various datasets demonstrate that our methods can be 1-2 orders of magnitude faster than competitors while securing scalability and exactness.
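The rank-1 structure behind the unit update can be illustrated with the standard Sherman-Morrison identity: when one column of the transition matrix changes, the change to the whole RWR proximity matrix is an outer product of two vectors. The sketch below checks this on a made-up 4-node directed graph with an arbitrarily chosen restart parameter; the paper's own derivation (via the commutativity observation), its partitioning scheme, and its bulk-update handling are not reproduced.

```python
# Rank-1 (outer-product) update of all-pairs RWR proximities after a unit edge insertion.
import numpy as np

c = 0.85                                             # continuation probability (1 - restart prob.)
A = np.array([[0., 1., 1., 0.],                      # toy directed adjacency matrix
              [1., 0., 0., 1.],
              [1., 0., 0., 1.],
              [0., 1., 1., 0.]])

def proximities(adj):
    W = adj / adj.sum(axis=0)                        # column-normalized transition matrix
    return (1 - c) * np.linalg.inv(np.eye(len(adj)) - c * W)

Q_old = proximities(A)
M_inv = Q_old / (1 - c)                              # (I - cW)^{-1}

A_new = A.copy()
A_new[0, 3] = 1.0                                    # unit update: insert edge 3 -> 0 (only column 3 changes)
u = A_new[:, 3] / A_new[:, 3].sum() - A[:, 3] / A[:, 3].sum()

# Sherman-Morrison: the change to (I - cW)^{-1}, hence to Q, is an outer product of two vectors.
x = c * (M_inv @ u) / (1.0 - c * (M_inv[3, :] @ u))
Q_incremental = Q_old + (1 - c) * np.outer(x, M_inv[3, :])

print(np.allclose(Q_incremental, proximities(A_new)))   # True: matches recomputation from scratch
```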

Relevance: 10.00%

Abstract:

This paper presents an application that composes formal poetry in Spanish in a semiautomatic, interactive fashion. JASPER is a forward-reasoning, rule-based system that obtains from the user an intended message, the desired metre, a choice of vocabulary, and a corpus of verses; by intelligent adaptation of selected examples from this corpus using the given words, it carries out a prose-to-poetry translation of the given message. In the composition process, JASPER combines natural language generation with a set of construction heuristics obtained from the formal literature on Spanish poetry.

Relevance: 10.00%

Abstract:

Malware replicates itself and produces offspring with the same characteristics but different signatures by using code obfuscation techniques. Current-generation anti-virus engines employ a signature-template type of detection approach, in which malware can easily evade the existing signatures in the database. This reduces the capability of current anti-virus engines to detect malware. In this paper, we propose a stepwise binary logistic regression-based dimensionality reduction technique for malware detection using application program interface (API) call statistics. Finding the most significant malware features using traditional wrapper-based approaches takes exponential complexity in the dimension m of the dataset with brute-force search strategies, and order (m-1) complexity with backward-elimination filter heuristics. The novelty of the proposed approach is that its worst-case computational complexity is less than order (m-1). The proposed approach uses multi-linear regression and the p-value of each individual API feature to select the most uncorrelated and significant features, in order to reduce the dimensionality of the large malware data and to ensure the absence of multi-collinearity. The stepwise logistic regression approach is then employed to test the significance of each individual malware feature based on its corresponding Wald statistic and to construct the binary decision model. When the selected most significant APIs are used in a decision-rule generation system, this approach not only reduces the tree size but also improves classification performance. Exhaustive experiments on a large malware dataset show that the proposed approach clearly outperforms the existing standard decision-rule and support vector machine-based template approaches with complete data and provides a better statistical fit.
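A generic sketch of p-value-driven stepwise selection with logistic regression is shown below on synthetic "API call count" data, using forward selection for brevity; the malware corpus, the Wald-statistic bookkeeping, and the decision-rule generation stage described above are not reproduced, and statsmodels is used only as a convenient estimator.

```python
# Forward stepwise logistic regression: add the feature with the smallest p-value until none
# falls below alpha. Data are synthetic stand-ins for per-sample API call statistics.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, m = 400, 8                                        # samples, candidate API features
X = rng.poisson(3.0, size=(n, m)).astype(float)      # made-up API call counts
logit = 0.8 * X[:, 0] - 0.6 * X[:, 3] - 0.5          # only features 0 and 3 truly matter
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

selected, remaining, alpha = [], list(range(m)), 0.05
while remaining:
    pvals = {}
    for j in remaining:                               # try adding each remaining feature
        cols = sm.add_constant(X[:, selected + [j]])
        fit = sm.Logit(y, cols).fit(disp=0)
        pvals[j] = fit.pvalues[-1]                    # p-value of the newly added feature
    best = min(pvals, key=pvals.get)
    if pvals[best] >= alpha:                          # stop when nothing significant remains
        break
    selected.append(best)
    remaining.remove(best)

print("selected API features:", selected)
```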

Relevance: 10.00%

Abstract:

Ecological theory often fails applied ecologists in three ways: (1) Theory has little predictive value but is nevertheless applied in conservation with a risk of perverse outcomes, (2) individual theories have limited heuristic value for planning and framing research because they are narrowly focused, and (3) theory can lead to poor communication among scientists and hinder scientific progress through inconsistent use of terms and widespread redundancy. New approaches are therefore needed that improve the distillation, communication, and application of ecological theory. We advocate three approaches to resolve these problems: (1) improve prediction by reviewing theory across case studies to develop contingent theory where possible, (2) plan new research using a checklist of phenomena to avoid the narrow heuristic value of individual theories, and (3) improve communication among scientists by rationalizing theory associated with particular phenomena to purge redundancy and by developing definitions for key terms. We explored the extent to which these problems and solutions have been featured in two case studies of long-term ecological research programs in forests and plantations of southeastern Australia. We found that our main contentions were supported regarding the prediction, planning, and communication limitations of ecological theory. We illustrate how inappropriate application of theory can be overcome or avoided by investment in boundary-spanning actions. The case studies also demonstrate how some of our proposed solutions could work, particularly the use of theory in secondary case studies after developing primary case studies without theory. When properly coordinated and implemented through a widely agreed upon and broadly respected international collaboration, the framework that we present will help to speed the progress of ecological research and lead to better conservation decisions.

Relevance: 10.00%

Abstract:

This thesis addresses the problem of production scheduling and optimization in a multi-machine environment with constraints on material resources in a plastic extrusion plant. The weighted sum of tardiness is the economic criterion around which this study is built, as it is a very important criterion for meeting deadlines. We propose an exact approach, via a mathematical formulation capable of providing optimal solutions, and a heuristic approach based on two solution-construction methods (serial and parallel) together with a set of neighbourhood search methods (simulated annealing, tabu search, GRASP, and a genetic algorithm) using five neighbourhood variants. To remain fully consistent with the reality of the plastics industry, we took into account some very common characteristics, such as tool changeover times on the machines when one production order follows another on a given machine. The availability of the extruders and extrusion dies is the bottleneck in this scheduling problem. Series of experiments based on test problems were carried out to evaluate the quality of the solutions obtained with the different proposed algorithms. The analysis of the results showed that the construction methods alone are not sufficient to ensure good results and that the neighbourhood search methods yield solutions of very high quality. The choice of neighbourhood is important for refining the quality of the obtained solution. Keywords: scheduling, optimization, extrusion, mathematical formulation, heuristic, simulated annealing, tabu search, GRASP, genetic algorithm
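As a small illustration of one of the neighbourhood-search methods listed in the keywords, the sketch below applies simulated annealing with a swap neighbourhood to a single-machine weighted-tardiness problem with sequence-dependent setup times. The jobs, setups, and cooling schedule are made-up toy values, and the thesis's multi-machine setting with extruder and die availability is not reproduced.

```python
# Simulated annealing for total weighted tardiness on one machine with sequence-dependent setups.
import math, random

random.seed(3)
proc   = [4, 3, 6, 2, 5]                          # processing times
due    = [6, 9, 20, 5, 14]                        # due dates
weight = [3, 1, 2, 4, 2]                          # tardiness weights
setup  = [[0 if i == j else 1 + (i + j) % 3 for j in range(5)] for i in range(5)]  # toy setups

def weighted_tardiness(seq):
    t, total, prev = 0, 0, None
    for j in seq:
        t += (0 if prev is None else setup[prev][j]) + proc[j]
        total += weight[j] * max(0, t - due[j])
        prev = j
    return total

current, best, temp = list(range(5)), list(range(5)), 10.0
while temp > 0.01:
    a, b = random.sample(range(5), 2)             # neighbour: swap two jobs in the sequence
    cand = current[:]
    cand[a], cand[b] = cand[b], cand[a]
    delta = weighted_tardiness(cand) - weighted_tardiness(current)
    if delta <= 0 or random.random() < math.exp(-delta / temp):   # Metropolis acceptance
        current = cand
        if weighted_tardiness(current) < weighted_tardiness(best):
            best = current[:]
    temp *= 0.99                                  # geometric cooling

print("best sequence:", best, "weighted tardiness:", weighted_tardiness(best))
```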

Relevance: 10.00%

Abstract:

Entrepreneurship education has emerged as a popular research domain in academia, given its aim of enhancing and developing certain entrepreneurial qualities in undergraduates that change their behavior, and even their entrepreneurial inclination, and may finally result in the formation of new businesses as well as new job opportunities. This study investigates Colombian students' entrepreneurial qualities and the influence of entrepreneurial education during their studies.