605 results for heuristics


Relevance:

10.00%

Publisher:

Abstract:

A job shop with one batch-processing machine and several discrete machines is analyzed. Given a set of jobs, their process routes, processing requirements, and sizes, the objective is to schedule the jobs such that the makespan is minimized. The batch-processing machine can process a batch of jobs as long as the machine capacity is not violated, and the batch processing time equals the longest processing time among the jobs in the batch. The problem under study can be represented as Jm:batch:Cmax. If no batches were formed, it would reduce to the classical job shop scheduling problem (i.e., Jm::Cmax), which is known to be NP-hard. This research extends the scheduling literature by combining Jm::Cmax with batch processing. The primary contributions are a mathematical formulation, a new network representation, and several solution approaches. The problem is observed widely in metal working and other industries but has received limited attention due to its complexity. A novel network representation of the problem using disjunctive and conjunctive arcs, together with a mathematical formulation, is proposed to minimize the makespan. In addition, several algorithms were developed and implemented: batch-forming heuristics, dispatching rules, a Modified Shifting Bottleneck procedure, Tabu Search (TS), and Simulated Annealing (SA). An experimental study was conducted to evaluate the proposed heuristics, and the results were compared to those from a commercial solver (CPLEX). TS and SA, combined with MWKR-FF as the initial solution, gave the best solutions among all the heuristics proposed. Their results were close to those of CPLEX, and for some larger instances (with more than 225 total operations) they were competitive in both solution quality and runtime. For some larger problem instances, CPLEX was unable to report a feasible solution even after running for several hours. Between SA and TS, the experimental study indicated that SA produced a better average Cmax across all instances. The proposed solution approaches will help practitioners schedule a job shop with both discrete and batch-processing machines more efficiently; they are easy to implement and require short run times on large problem instances.
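
The abstract does not spell out the batch-forming rules, but a first-fit scheme subject to machine capacity, with the batch time taken as the longest job time, can be sketched minimally as follows. The Job fields, the first_fit_batches helper, and the sample data are illustrative assumptions (MWKR-FF presumably pairs such first-fit batching with a most-work-remaining ordering, though the exact rule is not given here):

```python
# A minimal sketch of first-fit batch forming on the batch machine,
# assuming hypothetical Job records with a size and a processing time.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    size: float        # capacity the job consumes in a batch
    p_time: float      # processing time on the batch machine

def first_fit_batches(jobs, capacity):
    """Assign each job to the first batch with enough remaining capacity.

    The processing time of a batch is the longest processing time among
    its jobs, as in the Jm:batch:Cmax model described above.
    """
    batches = []
    for job in jobs:
        for batch in batches:
            if batch["used"] + job.size <= capacity:
                batch["jobs"].append(job)
                batch["used"] += job.size
                break
        else:
            batches.append({"jobs": [job], "used": job.size})
    # Batch processing time = max job processing time in the batch.
    return [(b["jobs"], max(j.p_time for j in b["jobs"])) for b in batches]

jobs = [Job("J1", 4, 10), Job("J2", 3, 7), Job("J3", 5, 12), Job("J4", 2, 5)]
for members, p in first_fit_batches(jobs, capacity=8):
    print([j.name for j in members], "batch time:", p)
```

Sorting the jobs before the first-fit pass, for example by remaining work, is the natural place where the dispatching rules mentioned above would come into play.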

Relevance:

10.00%

Publisher:

Abstract:

This research studies a hybrid flow shop problem with parallel batch-processing machines in one stage and discrete-processing machines in the other stages, processing jobs of arbitrary sizes. The objective is to minimize the makespan for a set of jobs. The problem is denoted FF:batch1,sj:Cmax and is formulated as a mixed-integer linear program. The commercial solver AMPL/CPLEX is used to solve problem instances to optimality. Experimental results show that AMPL/CPLEX requires considerable time to find the optimal solution even for small problems, e.g., a 6-job instance requires two hours on average. A bottleneck-first-decomposition (BFD) heuristic, inspired by the shifting bottleneck heuristic, is proposed in this study to overcome the computation times encountered with the commercial solver. It decomposes the entire problem into three sub-problems and schedules them one by one, through four major steps: formulating sub-problems, prioritizing sub-problems, solving sub-problems, and re-scheduling. For solving the sub-problems, two heuristic algorithms are proposed: one for scheduling a hybrid flow shop with discrete-processing machines, and the other for scheduling parallel batching machines (a single stage). Both consider job arrival and delivery times. A designed experiment evaluates the effectiveness of the proposed BFD heuristic against a set of common heuristics, including a randomized greedy heuristic and five dispatching rules; the results show that BFD outperforms all of them. To evaluate the quality of the heuristic solutions, a procedure is developed to calculate a lower bound on the makespan for the problem under study; the bound obtained is tighter than bounds developed for related problems in the literature. A meta-search approach based on the Genetic Algorithm concept is developed to assess how much the BFD solution can be further improved. The experiment indicates that it reduces the makespan by 1.93% on average, within negligible time, when the problem size is less than 50 jobs.
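
As a rough illustration of one of the sub-problems (scheduling the parallel batch-processing stage with job arrival times), the sketch below forms batches first-fit-decreasing by processing time and assigns each batch to the earliest-free machine. All names and the batching rule are assumptions; the thesis's actual BFD steps are only summarized above:

```python
# Sketch of scheduling formed batches on parallel batch-processing machines.
import heapq
from dataclasses import dataclass

@dataclass
class Job:
    size: float
    p_time: float
    ready: float  # arrival time from the upstream stage

def schedule_parallel_batch(jobs, capacity, n_machines):
    # Form batches by first-fit decreasing on processing time.
    batches = []
    for job in sorted(jobs, key=lambda j: -j.p_time):
        for b in batches:
            if b["used"] + job.size <= capacity:
                b["jobs"].append(job)
                b["used"] += job.size
                break
        else:
            batches.append({"jobs": [job], "used": job.size})
    # Assign each batch to the earliest-free machine; a batch starts only
    # once all of its jobs have arrived.
    machines = [0.0] * n_machines
    heapq.heapify(machines)
    makespan = 0.0
    for b in batches:
        start = max(heapq.heappop(machines), max(j.ready for j in b["jobs"]))
        finish = start + max(j.p_time for j in b["jobs"])
        heapq.heappush(machines, finish)
        makespan = max(makespan, finish)
    return makespan

jobs = [Job(3, 8, 0), Job(4, 6, 2), Job(2, 9, 1), Job(5, 4, 0)]
print("makespan:", schedule_parallel_batch(jobs, capacity=6, n_machines=2))
```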

Relevance:

10.00%

Publisher:

Abstract:

A trial judge serves as gatekeeper in the courtroom to ensure that only reliable expert witness testimony is presented to the jury. Nevertheless, research shows that while judges take their gatekeeper status seriously, legal professionals in general are unable to identify well-conducted research and unable to define falsifiability, error rates, peer review status, and scientific validity (Gatkowski et al., 2001; Kovera & McAuliff, 2000). However, the abilities to identify quality scientific research and define scientific concepts are critical to preventing "junk" science from entering courtrooms. Research thus far has neglected the fact that before selecting expert witnesses, judges and attorneys must first evaluate experts' CVs, rather than their scientific testimony, to determine whether legal standards of admissibility have been met. The quality of expert testimony therefore largely depends on the ability to properly evaluate experts' credentials. Theoretical models of decision making suggest that ability/knowledge and motivation are required to process information systematically. Legal professionals (judges and attorneys) were expected to process CVs heuristically when rendering expert witness decisions, owing to a lack of training in areas of psychology expertise. Legal professionals' (N = 150) and undergraduate students' (N = 468) expert witness decisions were examined and compared. Participants were presented with one of two versions of a criminal case calling for the testimony of either a clinical psychology expert or an experimental legal psychology expert. Participants then read one of eight curricula vitae that varied area of expertise (clinical vs. legal psychology), previous expert witness experience (previous experience vs. no previous experience), and scholarly publication record (30 publications vs. no publications) before deciding whether the expert was qualified to testify in the case. Follow-up measures assessed participants' decision-making processes. Legal professionals were no better than college students at rendering quality psychology expert witness admissibility decisions, yet they were significantly more confident in their decisions. Legal professionals rated themselves significantly higher than students in ability, knowledge, and motivation to choose an appropriate psychology expert, although their expert witness decisions were equally inadequate. The findings suggest that participants relied on heuristics, such as previous expert witness experience, to render decisions.

Relevance:

10.00%

Publisher:

Abstract:

We present our approach to real-time service-oriented scheduling problems with the objective of maximizing the total system utility. Unlike traditional utility accrual scheduling problems, in which each task is associated with only a single time utility function (TUF), we associate two different TUFs (a profit TUF and a penalty TUF) with each task, to model real-time services that not only reward early completions but also penalize aborts and deadline misses. The scheduling heuristics proposed in this paper judiciously accept, schedule, and abort real-time services when necessary to maximize the accrued utility. Our extensive experimental results show that the proposed algorithms significantly outperform traditional scheduling algorithms such as Earliest Deadline First (EDF), traditional utility accrual (UA) scheduling algorithms, and an earlier scheduling approach based on a similar model.
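
A minimal sketch of the dual-TUF task model may help make the objective concrete. The linear profit decay, the constant penalty, and all field names are assumptions for illustration; the paper's actual TUF shapes are not given in this abstract:

```python
# Hedged sketch: each task carries a profit TUF (reward decaying after
# release, zero past the deadline) and a penalty TUF (cost charged on
# abort or deadline miss). Shapes and names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Task:
    release: float
    deadline: float
    max_profit: float
    max_penalty: float

    def profit(self, finish_time: float) -> float:
        """Profit TUF: full reward at release, decaying linearly to zero
        at the deadline; no reward for late completion."""
        if finish_time > self.deadline:
            return 0.0
        span = self.deadline - self.release
        return self.max_profit * (self.deadline - finish_time) / span

    def utility(self, finish_time: float, aborted: bool) -> float:
        """Accrued utility: profit on timely completion, penalty charged
        when the task is aborted or misses its deadline."""
        if aborted or finish_time > self.deadline:
            return -self.max_penalty
        return self.profit(finish_time)

t = Task(release=0.0, deadline=10.0, max_profit=5.0, max_penalty=3.0)
print(t.utility(4.0, aborted=False))   # early finish earns positive utility
print(t.utility(12.0, aborted=False))  # deadline miss incurs the penalty
```

Under such a model, an admission heuristic might accept a service only when its expected profit outweighs the penalty it risks by a later abort, which is the judicious accept/abort behavior the abstract describes.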

Relevance:

10.00%

Publisher:

Abstract:

The Unified Modeling Language (UML) has quickly become the industry standard for object-oriented software development and is widely used in organizations and institutions around the world. However, UML is often found to be too complex for novice systems analysts. Although prior research has identified difficulties novice analysts encounter in learning UML, no viable solution has been proposed to address them. Sequence-diagram modeling, in particular, has largely been overlooked. The sequence diagram models the behavioral aspects of an object-oriented software system in terms of interactions among its building blocks, i.e., objects and classes, and is one of the most commonly used UML diagrams in practice. Yet there has been little research on sequence-diagram modeling, and the current literature scarcely provides effective guidelines for developing a sequence diagram. Such guidelines would greatly benefit novice analysts who, unlike experienced systems analysts, do not possess the prior experience needed to easily learn how to develop a sequence diagram; an effective sequence-diagram modeling technique for novices is needed. This dissertation reports a research study that identified novice difficulties in modeling a sequence diagram and proposed a technique called CHOP (CHunking, Ordering, Patterning), designed to reduce cognitive load by addressing the cognitive complexity of sequence-diagram modeling. The CHOP technique was evaluated in a controlled experiment against a technique recommended in a well-known textbook, which was found to be representative of approaches provided in many textbooks as well as the practitioner literature. The results indicated that novice analysts performed better using the CHOP technique, an outcome that seems to have been enabled by the pattern-based heuristics the technique provides. Novice analysts also rated the CHOP technique as more useful, although not significantly easier to use, than the control technique. The study established that CHOP is an effective sequence-diagram modeling technique for novice analysts.

Relevance:

10.00%

Publisher:

Abstract:

Database design is a difficult problem for non-expert designers, and it is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. Although a number of prototype KB systems have been proposed, they have several shortcomings. First, few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. Second, there does not seem to be any published empirical study that experimentally tested the effectiveness of any of these KB tools. Third, the problem-solving behavior of the non-experts whom the systems were intended to assist has not been one of the bases for system design. In this project, a consulting system for conceptual database design called CODA, which addresses the above shortcomings, was developed and empirically validated. More specifically, CODA incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge base implementation were used and compared: system restrictiveness and decisional guidance (Silver 1990). The Restrictive system takes a proscriptive approach and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance system, which is less restrictive, provides context-specific, informative, and suggestive guidance throughout the design process. Both approaches are intended to prevent erroneous design decisions. The main objectives of the study were to evaluate (1) whether a knowledge-based system is more effective than a system without a knowledge base and (2) which approach to knowledge implementation, Restrictive or Guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the systems were compared with a system that does not incorporate the expertise (Control). An experimental procedure using student subjects was used to test the effectiveness of the systems. The subjects solved a task without using a system (pre-treatment task) and another task using one of the three systems, viz. Control, Guidance, or Restrictive (experimental task). Analysis of the experimental-task scores of those subjects who performed satisfactorily in the pre-treatment task revealed that the knowledge-based approach to database design support led to more accurate solutions than the Control system; in particular, the Guidance approach was found to lead to better performance than the Control system. The subjects also perceived the Restrictive system as easier to use than the Guidance system.
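
To make the restrictiveness/guidance distinction concrete, here is a purely hypothetical sketch (not taken from CODA): a restrictive system blocks a disallowed design move outright, while a guidance system accepts it and attaches context-specific advice. The rule shown, flagging a higher-order relationship, is invented for illustration:

```python
# Illustrative contrast of the two knowledge-base styles (Silver 1990).
def restrictive_check(relationship_degree: int):
    # Restrictive: the disallowed design path is simply unavailable.
    if relationship_degree > 2:
        raise ValueError("Higher-order relationships must first be "
                         "decomposed; this design path is not allowed.")
    return "move accepted"

def guidance_check(relationship_degree: int):
    # Guidance: the move is allowed, but suggestive advice is attached.
    advice = []
    if relationship_degree > 2:
        advice.append("Consider whether this ternary relationship can be "
                      "decomposed into binary relationships.")
    return "move accepted", advice

print(guidance_check(3))  # accepted, with context-specific advice
try:
    restrictive_check(3)
except ValueError as e:
    print("restrictive system:", e)
```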

Relevance:

10.00%

Publisher:

Abstract:

Purpose: The purpose of this paper is to ascertain how today’s international marketers can perform better on the global scene by harnessing spontaneity. Design/methodology/approach: The authors draw on contingency theory to develop a model of the spontaneity – international marketing performance relationship, and identify three potential moderators, namely, strategic planning, centralization, and market dynamism. The authors test the model via structural equation modeling with survey data from 197 UK exporters. Findings: The results indicate that spontaneity is beneficial to exporters in terms of enhancing profit performance. In addition, greater centralization and strategic planning strengthen the positive effects of spontaneity. However, market dynamism mitigates the positive effect of spontaneity on export performance (when customer needs are volatile, spontaneous decisions do not function as well in terms of ensuring success). Practical implications: Learning to be spontaneous when making export decisions appears to result in favorable outcomes for the export function. To harness spontaneity, export managers should look to develop company heuristics (increase centralization and strategic planning). Finally, if operating in dynamic export market environments, the role of spontaneity is weaker, so more conventional decision-making approaches should be adopted. Originality/value: The international marketing environment typically requires decisions to be flexible and fast. In this context, spontaneity could enable accelerated and responsive decision-making, allowing international marketers to realize superior performance. Yet, there is a lack of research on decision-making spontaneity and its potential for international marketing performance enhancement.

Relevance:

10.00%

Publisher:

Abstract:

This book constitutes the refereed proceedings of the 14th International Conference on Parallel Problem Solving from Nature, PPSN 2016, held in Edinburgh, UK, in September 2016. The total of 93 revised full papers were carefully reviewed and selected from 224 submissions. The meeting began with four workshops, which offered an ideal opportunity to explore specific topics: intelligent transportation, landscape-aware heuristic search, natural computing in scheduling and timetabling, and advances in multi-modal optimization. PPSN XIV also included sixteen free tutorials covering: gray box optimization in theory; theory of evolutionary computation; graph-based and cartesian genetic programming; theory of parallel evolutionary algorithms; promoting diversity in evolutionary optimization: why and how; evolutionary multi-objective optimization; intelligent systems for smart cities; advances on multi-modal optimization; evolutionary computation in cryptography; evolutionary robotics - a practical guide to experiment with real hardware; evolutionary algorithms and hyper-heuristics; a bridge between optimization over manifolds and evolutionary computation; implementing evolutionary algorithms in the cloud; the attainment function approach to performance evaluation in EMO; runtime analysis of evolutionary algorithms: basic introduction; meta-model assisted (evolutionary) optimization. The papers are organized in topical sections on adaption, self-adaption and parameter tuning; differential evolution and swarm intelligence; dynamic, uncertain and constrained environments; genetic programming; multi-objective, many-objective and multi-level optimization; parallel algorithms and hardware issues; real-world applications and modeling; theory; diversity and landscape analysis.

Relevance:

10.00%

Publisher:

Abstract:

Sensor networks are formed of a set of devices capable of individually taking measurements of a particular environment and exchanging information in order to obtain a high-level representation of the activities under way in the area of interest. Such distributed sensing, with many devices located close to the phenomena of interest, is relevant in fields such as surveillance, agriculture, environmental observation, industrial monitoring, etc. In this thesis we propose several approaches for optimizing the spatio-temporal operations of these devices, determining where to place them in the environment and how to control them over time in order to detect the mobile targets of interest. The first contribution is a realistic detection model representing the coverage of a sensor network in its environment. To this end, we propose a probabilistic 3D model of a sensor's detection capability over its surroundings. This model also integrates information about the environment through a visibility evaluation based on the field of view. Building on this detection model, spatial optimization is performed by searching for the best placement and orientation of each sensor in the network. For this purpose, we propose a new gradient-descent-based algorithm that compares favorably with generic black-box optimization methods in terms of terrain coverage, while being more computationally efficient. Once the sensors are placed in the environment, temporal optimization consists of properly covering a group of mobile targets in the environment. First, the future positions of the mobile targets detected by the sensors are predicted. Prediction is performed either using the history of other targets that crossed the same environment (long-term prediction) or using only the previous movements of the same target (short-term prediction). We propose new algorithms in each category that perform better than, or produce results comparable to, existing methods. Once the future target locations are predicted, the sensor parameters are optimized so that the targets are properly covered for some time, according to the predictions. To this end, we propose a heuristic sensor-control method based on probabilistic forecasts of target trajectories and on the probabilistic coverage of targets by the sensors. Finally, the proposed spatial and temporal optimization methods were integrated and applied successfully, demonstrating a complete and effective approach to the spatio-temporal optimization of sensor networks.
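
As a toy illustration of the spatial-optimization step, the following sketch performs gradient ascent on the coverage of sample points under a simplified detection model. The 2D setting, the Gaussian detection probability, and the numerical gradient are all simplifying assumptions; the thesis's model is 3D, directional, and visibility-aware, none of which is shown here:

```python
# Gradient-based sensor placement under an isotropic probabilistic
# detection model p(d) = exp(-d^2 / (2*sigma^2)).
import math, random

SIGMA = 2.0

def coverage(sensors, points):
    """Average probability that each sample point is seen by >= 1 sensor."""
    total = 0.0
    for px, py in points:
        p_miss = 1.0
        for sx, sy in sensors:
            d2 = (px - sx) ** 2 + (py - sy) ** 2
            p_miss *= 1.0 - math.exp(-d2 / (2 * SIGMA ** 2))
        total += 1.0 - p_miss
    return total / len(points)

def optimize(sensors, points, steps=200, lr=0.5, h=1e-4):
    sensors = [list(s) for s in sensors]
    for _ in range(steps):
        for s in sensors:
            for i in (0, 1):  # central-difference partial derivative
                s[i] += h; up = coverage(sensors, points)
                s[i] -= 2 * h; down = coverage(sensors, points)
                s[i] += h  # restore coordinate
                s[i] += lr * (up - down) / (2 * h)  # gradient ascent
    return sensors

random.seed(0)
points = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(50)]
sensors = [[1.0, 1.0], [2.0, 2.0]]
print("coverage before:", round(coverage(sensors, points), 3))
sensors = optimize(sensors, points)
print("coverage after :", round(coverage(sensors, points), 3))
```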

Relevance:

10.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance:

10.00%

Publisher:

Abstract:

We examine how using information on unconstrained demand can improve operational decisions. Specifically, we examine the widespread problem of developing course schedules in not-for-profit university settings, and investigate the potential benefit of incorporating information on students' unconstrained demand for courses into the scheduling process. Prior to this study, the status quo in our college, as in a large proportion of university settings, was to build the course schedule to avoid time conflicts between required courses and to minimize time conflicts between designated groups of courses, such as electives in a particular area. Compared to the status quo approach, we find that, based on three semesters' worth of actual data, an approach that explicitly considers students' course preferences improves a student-based metric of schedule quality by over 4% (equivalent, in our setting, to improving service for over 20% of students).
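
A student-based schedule-quality metric of the kind described could be computed along the following lines; the data layout and the conflict-free criterion are illustrative assumptions, not the paper's exact metric:

```python
# Share of students whose preferred courses can all be taken without
# time conflicts, for a given course-to-timeslot assignment.
def conflict_free_share(preferences, timeslot):
    """preferences: {student: [courses]}, timeslot: {course: slot}."""
    happy = 0
    for courses in preferences.values():
        slots = [timeslot[c] for c in courses]
        if len(slots) == len(set(slots)):  # no two courses share a slot
            happy += 1
    return happy / len(preferences)

prefs = {"s1": ["OM", "FIN"], "s2": ["OM", "MKT"], "s3": ["FIN", "MKT"]}
schedule_a = {"OM": "Mon9", "FIN": "Mon9", "MKT": "Tue9"}   # OM/FIN clash
schedule_b = {"OM": "Mon9", "FIN": "Tue9", "MKT": "Wed9"}   # no clashes
print(conflict_free_share(prefs, schedule_a))  # 0.67 under the clash
print(conflict_free_share(prefs, schedule_b))  # 1.0 with no conflicts
```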

Relevance:

10.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

10.00%

Publisher:

Abstract:

Behavioral Finance developed as anomalies were perceived in supposedly efficient markets. Its fields of study can be grouped into three major areas: heuristic biases, frame dependence, and inefficient markets. The present study focuses on the heuristics of representativeness and anchoring. It aimed to identify under-reaction and over-reaction, as well as the existence of symmetry between them, in first- and second-tier assets of the Brazilian stock market. To this end, Fuzzy Logic was used together with indicators, obtained through Discriminant Analysis, that classify the groups studied. The most discriminating indicator in the period studied was Liabilities/Equity, demonstrating its importance in separating the assets considered "winners" from the "losers." In the Mid-Large Caps (MLCX), over-reaction biases are concentrated in the period of the financial crisis, while in the remaining periods the statistically significant biases are under-reactions, the latter occurring in times of moderate uncertainty. In the Small Caps (SMLL), the behavioral responses in 2005 and 2007 are the reverse of those observed in the Mid-Large Caps: in times of crisis there was marked conservatism, while near the end of the rising market on the Bovespa, accompanied by an increase in trading, investors over-reacted. The other biases in SMLL occurred at the end of the period studied, one an under-reaction and the other an over-reaction, the second occurring in a more favorable economic-financial period than the first. As regards under/over-reactivity in the two segments, no predominance of either was detected, which would probably differ for MLCX in a context without crisis. As for the periods in which such phenomena occur with statistical significance, in most cases they fall within crisis periods for MLCX, while in SMLL biases are not only less present but also not concentrated at any particular time. Given the above, it is believed that while behavioral biases were detected at certain times, they do not appear to be tied to a specific type of heuristic; and while there were some indications of a seasonal pattern in Mid-Large Caps, the same behavior does not seem to be repeated in Small Caps. The tests thus suggest momentary failures of the Efficient Market Hypothesis in its semi-strong form, as posited by Behavioral Finance. This result supports the theory that not only rationality but also human irrationality is bounded, since investors still act rationally in many circumstances.
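
Purely as an illustration of the fuzzy-classification idea, the sketch below assigns assets graded membership in "winner" and "loser" sets from a single discriminating indicator score. The membership shape and thresholds are invented; the study's actual Fuzzy Logic model and Discriminant Analysis indicators are only summarized above:

```python
# Linear fuzzy membership in the "winner" set over a score range [lo, hi];
# "loser" membership is taken as the complement. All values are invented.
def winner_membership(score, lo=-1.0, hi=1.0):
    return max(0.0, min(1.0, (score - lo) / (hi - lo)))

assets = {"A": 0.8, "B": -0.4, "C": 0.1}  # hypothetical indicator scores
for name, score in assets.items():
    w = winner_membership(score)
    print(f"{name}: winner={w:.2f}, loser={1 - w:.2f}")
```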

Relevance:

10.00%

Publisher:

Abstract:

Reverse engineering is usually the stepping stone of a variety of attacks aiming at identifying sensitive information (keys, credentials, data, algorithms) or vulnerabilities and flaws for broader exploitation. Software applications are usually deployed as identical binary code installed on millions of computers, enabling an adversary to develop a generic reverse-engineering strategy that, if it works on one code instance, can be applied to crack all the other instances. A solution to mitigate this problem is software diversity, which aims at creating several structurally different (but functionally equivalent) binary code versions out of the same source code, so that even if a successful attack can be elaborated for one version, it should not work on a diversified version. In this paper, we address the problem of maximizing software diversity from a search-based optimization point of view. The program to protect is subjected to a catalogue of transformations to generate many candidate versions. The problem of selecting the subset of most diversified versions to be deployed is formulated as an optimization problem, which we tackle with different search heuristics. We show the applicability of this approach on some popular Android apps.
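
One plausible shape for the selection step is sketched below under stated assumptions: a Hamming distance over hypothetical binary transformation fingerprints, and a greedy max-min criterion. The paper evaluates its own set of search heuristics, not necessarily this one:

```python
# Greedy max-min diversity: pick the subset of k variants maximizing the
# minimum pairwise distance between selected variants.
def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def greedy_diverse_subset(variants, k):
    """variants: {name: feature tuple}. Start from the farthest pair, then
    repeatedly add the variant farthest from the chosen set."""
    names = list(variants)
    chosen = list(max(((a, b) for a in names for b in names if a < b),
                      key=lambda p: hamming(variants[p[0]], variants[p[1]])))
    while len(chosen) < k:
        rest = [n for n in names if n not in chosen]
        chosen.append(max(rest, key=lambda n: min(
            hamming(variants[n], variants[c]) for c in chosen)))
    return chosen

variants = {  # hypothetical binary transformation fingerprints
    "v1": (0, 0, 0, 0), "v2": (1, 1, 0, 0),
    "v3": (0, 0, 1, 1), "v4": (1, 1, 1, 1),
}
print(greedy_diverse_subset(variants, k=3))
```

Greedy max-min selection is a standard baseline for diversity maximization; any of the paper's search heuristics could take its place in this role.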

Relevance:

10.00%

Publisher:

Abstract:

In 2014, over 70% of people in Great Britain accessed the Internet every day. This resource is an optimal vector for malicious attackers to penetrate home computers, and as such compromised pages have been increasing in both number and complexity. This paper presents X-Secure, a novel browser plug-in designed to raise the awareness of inexperienced users by analysing web pages before malicious scripts are executed by the host computer. X-Secure was able to detect over 90% of the tested attacks and, using a set of heuristics, provides a danger level based on a cumulative analysis of the source code, the URL, and the remote server, thereby increasing the situational awareness of users browsing the Internet.
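
A hedged sketch of cumulative heuristic scoring in this spirit: each heuristic inspects the page and contributes a weight, and the summed score maps to a danger level. The individual checks, weights, and thresholds are invented examples; the plug-in's real rule set is not described in the abstract:

```python
# Cumulative heuristic scoring over the URL and page source.
import re

HEURISTICS = [
    (lambda url, src: "@" in url, 2),                       # credential-style URL
    (lambda url, src: bool(re.search(r"\beval\s*\(", src)), 3),
    (lambda url, src: "unescape(" in src, 2),               # common obfuscation
    (lambda url, src: len(url) > 100, 1),                   # suspiciously long URL
]

def danger_level(url: str, source: str) -> str:
    score = sum(w for check, w in HEURISTICS if check(url, source))
    if score >= 5:
        return "high"
    return "medium" if score >= 2 else "low"

page = '<script>eval(unescape("%61%6c%65%72%74"))</script>'
print(danger_level("http://example.com/login", page))  # high (eval + unescape)
```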