985 results for Heuristic


Relevance:

10.00%

Publisher:

Abstract:

A Software-as-a-Service (SaaS) can be delivered in composite form, consisting of a set of application and data components that work together to deliver higher-level functional software. Components in a composite SaaS may need to be scaled, that is, replicated or deleted, to accommodate the user's load. It may not be necessary to replicate all components of the SaaS, as some components can be shared by other instances. On the other hand, when the load is low, some instances may need to be deleted to avoid resource underutilisation. Thus, it is important to determine which components to scale such that the performance of the SaaS is maintained. Extensive research on SaaS resource management in the Cloud has not yet addressed the challenges of the scaling process for composite SaaS. Therefore, a hybrid genetic algorithm is proposed that exploits problem-specific knowledge to explore the best combination of scaling plans for the components. Experimental results demonstrate that the proposed algorithm outperforms existing heuristic-based solutions.
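A minimal sketch of a hybrid genetic algorithm of the kind described above, assuming a toy model in which each component has a hypothetical per-replica capacity and current load, and the "hybrid" step is a greedy repair that uses problem knowledge to restore feasibility. All names, loads and the fitness function are illustrative, not the paper's formulation.

```python
import random

random.seed(0)

LOADS = [90, 40, 120, 10]       # hypothetical load on each component
CAPACITY = [50, 50, 50, 50]     # hypothetical capacity of one replica
MAX_REPLICAS = 5

def fitness(plan):
    """Lower is better: penalise overload heavily, resource use lightly."""
    cost = 0.0
    for load, cap, n in zip(LOADS, CAPACITY, plan):
        overload = max(0.0, load - cap * n)
        cost += 100 * overload + n      # n = resource cost of replicas
    return cost

def repair(plan):
    """Problem-specific step (the 'hybrid' part): greedily add replicas
    to any component that is still overloaded."""
    plan = list(plan)
    for i, (load, cap) in enumerate(zip(LOADS, CAPACITY)):
        while plan[i] * cap < load and plan[i] < MAX_REPLICAS:
            plan[i] += 1
    return plan

def evolve(pop_size=30, generations=60):
    pop = [repair([random.randint(1, MAX_REPLICAS) for _ in LOADS])
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(LOADS))   # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(len(child))        # point mutation
            child[i] = random.randint(1, MAX_REPLICAS)
            children.append(repair(child))
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Because repair always restores feasibility, selection is left only to minimise the replica count, which is how such a hybrid narrows the search space relative to a plain GA.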

Relevance:

10.00%

Publisher:

Abstract:

We investigate the utility to computational Bayesian analyses of a particular family of recursive marginal likelihood estimators characterized by the (equivalent) algorithms known as "biased sampling" or "reverse logistic regression" in the statistics literature and "the density of states" in physics. Through a pair of numerical examples (including mixture modeling of the well-known galaxy dataset) we highlight the remarkable diversity of sampling schemes amenable to such recursive normalization, as well as the notable efficiency of the resulting pseudo-mixture distributions for gauging prior-sensitivity in the Bayesian model selection context. Our key theoretical contributions are to introduce a novel heuristic ("thermodynamic integration via importance sampling") for qualifying the role of the bridging sequence in this procedure, and to reveal various connections between these recursive estimators and the nested sampling technique.
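The thermodynamic-integration idea underlying such recursive normalization can be illustrated in a conjugate toy model in which every power posterior is Gaussian and can be sampled exactly; the model (x ~ N(0,1) prior, y | x ~ N(x,1) likelihood, observed y = 1) and all numbers below are our illustrative choices, not the paper's examples.

```python
import math, random

random.seed(1)
Y = 1.0  # single observation

def log_lik(x):
    return -0.5 * math.log(2 * math.pi) - 0.5 * (Y - x) ** 2

def sample_power_posterior(beta, n):
    # p_beta(x) ∝ N(x;0,1) * N(y;x,1)^beta is N(beta*y/(1+beta), 1/(1+beta))
    mean = beta * Y / (1 + beta)
    sd = math.sqrt(1.0 / (1 + beta))
    return [random.gauss(mean, sd) for _ in range(n)]

# TI identity: log Z = ∫_0^1 E_beta[log L] d(beta); trapezoid on a grid
betas = [i / 20 for i in range(21)]
means = [sum(log_lik(x) for x in sample_power_posterior(b, 5000)) / 5000
         for b in betas]
log_z = sum((means[i] + means[i + 1]) / 2 * (betas[i + 1] - betas[i])
            for i in range(20))

# Exact answer: marginally y ~ N(0, 2), so log p(y=1) = -0.5*log(4*pi) - 1/4
exact = -0.5 * math.log(4 * math.pi) - 0.25
print(log_z, exact)
```

In realistic models the power posteriors must themselves be sampled by MCMC; the exactness here is purely a property of the conjugate toy setting.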

Relevance:

10.00%

Publisher:

Abstract:

Several analytical methods for Dynamic System Optimum (DSO) assignment have been proposed, but they fall into essentially two kinds. This chapter attempts to establish the DSO by equilibrating the path dynamic marginal time (DMT). The authors analyzed the path DMT for a single path with tandem bottlenecks and showed that the path DMT is not the simple summation of the DMT associated with each bottleneck along the path. Next, the authors examined the DMT of several paths passing through a common bottleneck. It is shown that the externality at the bottleneck is shared by the paths in proportion to their demand from the current time until the queue vanishes. This sharing of the externality is caused by the departure-rate shift under first-in-first-out (FIFO) conditions, and the externality propagates to the downstream bottlenecks. However, the externalities propagated downstream cancel out if downstream bottlenecks exist. Therefore, the authors concluded that the path DMT can be evaluated without considering the propagation of the externalities, just as in the evaluation of the path DMT for a single path passing through a series of bottlenecks between the origin and destination. Based on the DMT analysis, the authors finally proposed a heuristic solution algorithm and verified it by comparing the numerical solution with the analytical one.
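The decomposition of a marginal time into a vehicle's own travel time plus the externality it imposes on vehicles behind it until the queue vanishes can be checked numerically in a toy deterministic point-queue model; the arrival pattern and service rate below are illustrative.

```python
SERVICE = 1.0  # the bottleneck serves one vehicle per time unit

def departures(arrivals):
    """FIFO departure times for a deterministic point queue."""
    out, last = [], float("-inf")
    for a in sorted(arrivals):
        last = max(a, last + SERVICE)
        out.append(last)
    return out

def total_time(arrivals):
    return sum(d - a for a, d in zip(sorted(arrivals), departures(arrivals)))

arrivals = [0.0, 0.2, 0.4, 0.6, 5.0]   # a burst, then a late arrival
extra = 0.3                            # insert one more vehicle mid-burst

# dynamic marginal time = increase in total travel time from one extra vehicle
dmt = total_time(arrivals + [extra]) - total_time(arrivals)

# decompose: the vehicle's own travel time, plus SERVICE added to every
# vehicle behind it until the queue vanishes (the externality)
all_arr = sorted(arrivals + [extra])
deps = departures(all_arr)
own = deps[all_arr.index(extra)] - extra
externality = dmt - own
print(dmt, own, externality)   # the two vehicles behind each lose SERVICE
```

The vehicle arriving at t = 5.0, after the queue has dissipated, is unaffected, which is why the externality stops at the moment the queue vanishes.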

Relevance:

10.00%

Publisher:

Abstract:

This paper gives an overview of an ongoing project endeavouring to advance theory-based production and project management, and the rationale for this approach is briefly justified. The status of the theoretical foundation of production management, project management and allied disciplines is discussed, with emphasis on the metaphysical grounding of theories as well as the nature of the heuristic solution methods commonly used in these disciplines. Then, ongoing work related to different aspects of production and project management is reviewed from both theoretical and practical orientations. Next, agile project management of information systems is explored with a view to its re-use in generic project management. In production management, the consequences and implementation of a new, wider theoretical basis are analyzed. The theoretical implications and negative symptoms of the peculiarities of the construction industry for supply chains and supply chain management in construction are observed. Theoretical paths for improving inter-organisational relationships in construction, which are fundamental to the improvement of construction supply chains, are described. To conclude, the observations made in this paper regarding production, project and supply chain management are related back to the theoretical basis of the paper, and finally directions for theory development and future research are given and discussed.

Relevance:

10.00%

Publisher:

Abstract:

We report a study of the thermal transport management of monolayer graphene allotrope nanoribbons (size ∼20 × 4 nm²) by the modulation of their structures via molecular dynamics simulations. The thermal conductivity of graphyne (GY)-like geometries is observed to decrease monotonically with an increasing number of acetylenic linkages between adjacent hexagons. Strikingly, by incorporating those GY or GY-like structures, the thermal performance of graphene can be effectively engineered. The resulting hetero-junctions possess a sharp local temperature jump at the interface, and show a much lower effective thermal conductivity due to the enhanced phonon–phonon scattering. More importantly, by controlling the percentage, type and distribution pattern of the GY or GY-like structures, the hetero-junctions are found to exhibit tunable thermal transport properties (including the effective thermal conductivity, interfacial thermal resistance and rectification). This study provides a heuristic guideline for manipulating the thermal properties of 2D carbon networks, ideal for application in thermoelectric devices with strongly suppressed thermal conductivity.

Relevance:

10.00%

Publisher:

Abstract:

The term “human error” can simply be defined as an error made by a human. In fact, human error is an explanation for malfunctions and unintended consequences of operating a system. Many factors can cause a person to commit such unintended errors. The aim of this paper is to investigate the relationship of human error, as one contributing factor, to computer-related abuses. The paper begins by relating human errors to computer abuse, followed by mechanisms that mitigate these errors from both social and technical perspectives. We present the 25 techniques of computer crime prevention as a heuristic device to assist this analysis. A final section discusses ways of improving the adoption of security measures, followed by the conclusion.

Relevance:

10.00%

Publisher:

Abstract:

The paper examines the knowledge of pedestrian movements, both in real scenarios and, in more recent years, in the virtual simulation realm. Aiming to verify whether it is possible to learn from the study of virtual environments how people will behave in real environments, it is vital to understand what is already known about behavior in real environments. Besides the walking interaction among pedestrians, the interaction between pedestrians and the built environment in which they are walking is also of great relevance. Force-based models were compared with three other major microscopic models of pedestrian simulation to demonstrate that a more realistic and capable heuristic approach is needed for the study of the dynamics of pedestrians.
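A force-based (social-force-style) pedestrian model of the kind compared above can be sketched as follows: each pedestrian feels a driving force toward a desired velocity plus an exponential repulsion from others. The parameter values are illustrative, not calibrated.

```python
import math

DT, TAU = 0.05, 0.5     # time step and relaxation time (illustrative)
A, B = 2.0, 0.3         # repulsion strength and range (illustrative)

def step(peds):
    """peds: list of dicts with pos, vel, goal (2-D tuples); one Euler step."""
    new = []
    for i, p in enumerate(peds):
        gx, gy = p["goal"][0] - p["pos"][0], p["goal"][1] - p["pos"][1]
        dist = math.hypot(gx, gy) or 1e-9
        desired = (1.3 * gx / dist, 1.3 * gy / dist)    # 1.3 m/s free speed
        fx = (desired[0] - p["vel"][0]) / TAU           # driving force
        fy = (desired[1] - p["vel"][1]) / TAU
        for j, q in enumerate(peds):                    # pairwise repulsion
            if i == j:
                continue
            dx, dy = p["pos"][0] - q["pos"][0], p["pos"][1] - q["pos"][1]
            d = math.hypot(dx, dy) or 1e-9
            push = A * math.exp(-d / B)
            fx += push * dx / d
            fy += push * dy / d
        vx, vy = p["vel"][0] + fx * DT, p["vel"][1] + fy * DT
        new.append({"pos": (p["pos"][0] + vx * DT, p["pos"][1] + vy * DT),
                    "vel": (vx, vy), "goal": p["goal"]})
    return new

# two pedestrians walking toward each other along a corridor
peds = [{"pos": (0.0, 0.0), "vel": (0.0, 0.0), "goal": (10.0, 0.0)},
        {"pos": (10.0, 0.1), "vel": (0.0, 0.0), "goal": (0.0, 0.1)}]
for _ in range(300):
    peds = step(peds)
print(peds[0]["pos"], peds[1]["pos"])
```

The slight lateral offset lets the repulsion term resolve the head-on encounter; critiques of force-based models typically target exactly such low-level collision behaviour.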

Relevance:

10.00%

Publisher:

Abstract:

Live migration of multiple Virtual Machines (VMs) has become an integral management activity in data centers for power saving, load balancing and system maintenance. While state-of-the-art live migration techniques focus on improving the migration performance of an independent single VM, little has been investigated about the case of live migration of multiple interacting VMs. Live migration is mostly influenced by network bandwidth, and arbitrarily migrating a VM that has data inter-dependencies with other VMs may increase the bandwidth consumption and adversely affect the performance of subsequent migrations. In this paper, we propose a Random Key Genetic Algorithm (RKGA) that efficiently schedules the migration of a given set of VMs, accounting for both inter-VM dependency and the data center communication network. The experimental results show that the RKGA can schedule the migration of multiple VMs with significantly shorter total migration time and total downtime compared to a heuristic algorithm.
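The random-key encoding at the heart of an RKGA can be sketched as follows: each chromosome is a vector of floats in [0,1) whose sort order decodes to a migration sequence. The cost model here (dependent VMs should migrate close together) is a hypothetical stand-in for the bandwidth-aware objective, and all parameters are illustrative.

```python
import random

random.seed(2)
N_VMS = 8
DEPS = [(0, 1), (2, 3), (4, 5), (6, 7)]   # hypothetical inter-VM dependencies

def decode(keys):
    """Sorting the random keys yields a migration order (a permutation)."""
    return sorted(range(N_VMS), key=lambda i: keys[i])

def cost(keys):
    order = decode(keys)
    pos = {vm: p for p, vm in enumerate(order)}
    # dependent pairs migrated far apart keep both endpoints busy longer
    return sum(abs(pos[a] - pos[b]) for a, b in DEPS)

def rkga(pop_size=40, generations=80, elite=8, mutant=6):
    pop = [[random.random() for _ in range(N_VMS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        nxt = pop[:elite]                          # elitism
        nxt += [[random.random() for _ in range(N_VMS)]
                for _ in range(mutant)]            # random immigrants
        while len(nxt) < pop_size:                 # biased uniform crossover
            e = random.choice(pop[:elite])
            o = random.choice(pop)
            nxt.append([e[i] if random.random() < 0.7 else o[i]
                        for i in range(N_VMS)])
        pop = nxt
    return min(pop, key=cost)

best = rkga()
print(decode(best), cost(best))
```

The appeal of random keys is that any real vector decodes to a valid permutation, so crossover and mutation never produce an infeasible migration schedule.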

Relevance:

10.00%

Publisher:

Abstract:

Determination of sequence similarity is a central issue in computational biology, a problem addressed primarily through BLAST, an alignment-based heuristic which has underpinned much of the analysis and annotation of the genomic era. Despite their success, alignment-based approaches scale poorly with increasing dataset size, and are not robust under structural sequence rearrangements. Successive waves of innovation in sequencing technologies – so-called Next Generation Sequencing (NGS) approaches – have led to an explosion in data availability, challenging existing methods and motivating novel approaches to sequence representation and similarity scoring, including the adaptation of existing methods from other domains such as information retrieval. In this work, we investigate locality-sensitive hashing of sequences through binary document signatures, applying the method to a bacterial protein classification task. Here, the goal is to predict the gene family to which a given query protein belongs. Experiments carried out on a pair of small but biologically realistic datasets (the full protein repertoires of families of Chlamydia and Staphylococcus aureus genomes, respectively) show that a measure of similarity obtained by locality-sensitive hashing gives highly accurate results while offering a number of avenues for substantial performance improvements over BLAST.
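A binary-signature scheme of the general kind described can be sketched with a SimHash-style construction over k-mer sets: similar proteins share k-mers, so their fixed-width signatures are close in Hamming distance. The protein strings below are made up, and a real system would tune k, the signature width and the hash family.

```python
import hashlib

K, BITS = 3, 64    # k-mer length and signature width (illustrative)

def kmers(seq):
    return {seq[i:i + K] for i in range(len(seq) - K + 1)}

def signature(seq):
    """SimHash: each k-mer votes +1/-1 on every bit via its hash."""
    counts = [0] * BITS
    for kmer in kmers(seq):
        h = int.from_bytes(hashlib.sha1(kmer.encode()).digest()[:8], "big")
        for b in range(BITS):
            counts[b] += 1 if (h >> b) & 1 else -1
    return sum(1 << b for b in range(BITS) if counts[b] > 0)

def hamming(a, b):
    return bin(a ^ b).count("1")

fam_a1 = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"   # made-up family member
fam_a2 = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVA"   # one substitution
fam_b  = "MSGRGKQGGKARAKAKTRSSRAGLQFPVGRVHR"   # unrelated sequence

d_close = hamming(signature(fam_a1), signature(fam_a2))
d_far = hamming(signature(fam_a1), signature(fam_b))
print(d_close, d_far)
```

Because comparison is a constant-time XOR and popcount, nearest-family lookup over a large repertoire scales far better than alignment, which is the appeal of this representation.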

Relevance:

10.00%

Publisher:

Abstract:

Network Real-Time Kinematic (NRTK) is a technology that can provide centimeter-level accuracy positioning services in real time, enabled by a network of Continuously Operating Reference Stations (CORS). The location-oriented CORS placement problem is an important problem in the design of an NRTK, as it directly affects not only the installation and operational cost of the NRTK, but also the quality of the positioning services it provides. This paper presents a Memetic Algorithm (MA) for the location-oriented CORS placement problem, which hybridizes the powerful explorative search capacity of a genetic algorithm with the efficient and effective exploitative search capacity of local optimization. Experimental results show that the MA performs better than existing approaches. In this paper we also conduct an empirical study of the scalability of the MA, the effectiveness of the hybridization technique, and the selection of the crossover operator in the MA.
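The hybridization of global genetic search with local optimization can be sketched on a deliberately tiny 1-D placement toy (six candidate sites, two stations to build); the geometry and coverage objective are illustrative and far simpler than actual CORS placement.

```python
import random, itertools

random.seed(3)
SITES = [0.0, 1.5, 3.0, 4.5, 7.0, 9.0]   # candidate station coordinates
USERS = [0.5, 1.0, 2.5, 5.0, 6.5, 8.5]   # user locations to cover
K = 2                                     # stations to build

def cost(subset):
    return sum(min(abs(u - SITES[s]) for s in subset) for u in USERS)

def local_search(subset):
    """Exploitative step: swap a chosen site for an unchosen one while
    it improves the coverage cost."""
    subset = set(subset)
    improved = True
    while improved:
        improved = False
        for s_in, s_out in itertools.product(
                list(subset), set(range(len(SITES))) - subset):
            cand = (subset - {s_in}) | {s_out}
            if cost(cand) < cost(subset):
                subset, improved = cand, True
                break
    return frozenset(subset)

def memetic(pop_size=10, generations=15):
    pop = [local_search(random.sample(range(len(SITES)), K))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        child = set(random.choice(pop[:3])) | set(random.choice(pop))
        while len(child) > K:              # repair: drop the least useful site
            child.remove(min(child, key=lambda s: cost(child - {s})))
        pop[-1] = local_search(child)      # the memetic refinement step
    return min(pop, key=cost)

best = memetic()
print(sorted(best), cost(best))
```

Running local search on every offspring is what distinguishes a memetic algorithm from a plain GA: the population only ever contains locally optimal placements.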

Relevance:

10.00%

Publisher:

Abstract:

In the past few years, there has been a steady increase in the attention, importance and focus of green initiatives related to data centers. While various energy-aware measures have been developed for data centers, the requirement of simultaneously improving the performance efficiency of application assignment has yet to be fulfilled. For instance, many energy-aware measures applied to data centers maintain a trade-off between energy consumption and Quality of Service (QoS). To address this problem, this paper presents a novel concept of profiling to facilitate offline optimization for a deterministic assignment of applications to virtual machines. A profile-based model is then established for obtaining near-optimal allocations of applications to virtual machines with consideration of three major objectives: energy cost, CPU utilization efficiency and application completion time. From this model, a profile-based and scalable matching algorithm is developed to solve the profile-based model. The assignment efficiency of our algorithm is then compared with that of the Hungarian algorithm, which yields the optimal solution but does not scale well.
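The contrast between an optimal assignment (computed here by brute force, standing in for the Hungarian algorithm) and a cheap, scalable greedy matcher can be sketched as follows; the cost matrix, which in the paper's setting would combine profiled energy, CPU and completion-time terms, is illustrative.

```python
import itertools

# COST[i][j]: profiled cost of running application i on virtual machine j
COST = [[1.0, 2.0, 9.0],
        [1.0, 9.0, 9.0],
        [9.0, 1.0, 5.0]]

def optimal(cost):
    """Exact minimum-cost assignment by enumerating permutations: O(n!),
    fine for n = 3, hopeless at data-center scale."""
    n = len(cost)
    return min((sum(cost[i][p[i]] for i in range(n)), p)
               for p in itertools.permutations(range(n)))

def greedy(cost):
    """Scalable matcher: each application grabs the cheapest free VM."""
    taken, total, assign = set(), 0.0, []
    for row in cost:
        j = min((j for j in range(len(row)) if j not in taken),
                key=lambda j: row[j])
        taken.add(j)
        total += row[j]
        assign.append(j)
    return total, tuple(assign)

best_cost, best_assign = optimal(COST)
greedy_cost, greedy_assign = greedy(COST)
print(best_cost, greedy_cost)
```

On this matrix the greedy matcher pays 15 against an optimum of 8: application 0 grabbing its cheapest VM forces application 1 onto a very expensive one, which is exactly the gap a profile-based near-optimal model aims to close without the factorial (or cubic, for Hungarian) price of exact assignment.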

Relevance:

10.00%

Publisher:

Abstract:

In this chapter, we discuss an approach for teaching pre-service teachers how to critically reflect on their experiences in a Service-learning program in an advanced subject about inclusive education. The approach was informed by critical social theory, with the expectation that students would engage in transformational learning. By explicitly teaching the students to engage in critical reflective thinking (Fishbowl discussions) and examine the depth of their critical reflection against a heuristic (the 4Rs reflection framework), the final-year Bachelor of Education students were able to gain a deeper understanding of the subject and experience transformational learning. We provide contextual information about the Service-learning program and discuss critical social theory for transformational learning, as well as how the teaching team taught critical reflection. Based on the evidence gathered from the students, we consider lessons learned by the teaching team and provide recommendations for teaching reflection in Service-learning programs.

Relevance:

10.00%

Publisher:

Abstract:

Compression is desirable for network applications as it saves bandwidth; however, when data is compressed before being encrypted, the amount of compression leaks information about the amount of redundancy in the plaintext. This side channel has led to the successful CRIME and BREACH attacks on web traffic protected by the Transport Layer Security (TLS) protocol. The general guidance in light of these attacks has been to disable compression, preserving confidentiality but sacrificing bandwidth. In this paper, we examine two techniques - heuristic separation of secrets and fixed-dictionary compression - for enabling compression while protecting high-value secrets, such as cookies, from attack. We model the security offered by these techniques and report on the amount of compressibility that they can achieve.
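The side channel itself is easy to reproduce with a general-purpose DEFLATE compressor: when attacker-controlled input is compressed together with a secret, the compressed length drops when the guess matches the secret. The cookie value and request format below are made up.

```python
import zlib

SECRET = b"secret=ba5eba11deadbeef"   # made-up high-value cookie

def observed_length(attacker_guess):
    # the attacker sees only the length of the compressed
    # (and then encrypted) message, not its contents
    return len(zlib.compress(b"GET /?q=" + attacker_guess + b" " + SECRET))

right = observed_length(b"secret=ba5eba11deadbeef")   # matching guess
wrong = observed_length(b"secret=0123456789abcdef")   # non-matching guess
print(right, wrong)
```

The matching guess is absorbed almost entirely by a back-reference to the secret, so its ciphertext is shorter; repeated over many guesses, this length oracle recovers the secret byte by byte, which is why the countermeasures above separate secrets from attacker-influenced data or fix the dictionary in advance.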

Relevance:

10.00%

Publisher:

Abstract:

Railway capacity determination and expansion are very important topics. In prior research, however, the competition between different entities, such as train services and train types on different network corridors, has been ignored, poorly modelled, or else assumed to be static. In response, a comprehensive set of multi-objective models has been formulated in this article to perform a trade-off analysis. These models determine the total absolute capacity of railway networks as the most equitable solution according to a clearly defined set of competing objectives. The models also perform a sensitivity analysis of capacity with respect to those competing objectives. The models have been extensively tested on a case study and their significant worth is shown. The models were solved using a variety of techniques; an adaptive ε-constraint method was shown to be superior. In order to identify only the best solution, a Simulated Annealing meta-heuristic was implemented and tested. A linearization technique based upon separable programming was also developed and shown to be superior in terms of solution quality, but far less so in terms of computational time.
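A Simulated Annealing meta-heuristic of the kind mentioned can be sketched on a small surrogate objective; the quadratic "capacity trade-off" function over integer train counts is illustrative, not the article's multi-objective model.

```python
import math, random

random.seed(4)

def objective(x):
    # surrogate conflict between two services sharing a corridor: each
    # wants its target frequency, and joint use is penalised
    freight, passenger = x
    return ((freight - 6) ** 2 + (passenger - 8) ** 2
            + 0.5 * freight * passenger)

def neighbour(x):
    i = random.randrange(2)
    y = list(x)
    y[i] = min(12, max(0, y[i] + random.choice((-1, 1))))
    return tuple(y)

def anneal(start=(0, 0), t0=10.0, cooling=0.995, steps=4000):
    x, fx = start, objective(start)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbour(x)
        fy = objective(y)
        # always accept improvements; accept worsenings with
        # probability exp(-delta / temperature)
        if fy < fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
        if fx < fbest:
            best, fbest = x, fx
        t *= cooling
    return best, fbest

best, fbest = anneal()
print(best, fbest)
```

The geometric cooling schedule gradually turns the random walk into a greedy descent, which is how annealing singles out one best compromise from a landscape shaped by competing objectives.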

Relevance:

10.00%

Publisher:

Abstract:

Most real-life data analysis problems are difficult to solve using exact methods, due to the size of the datasets and the nature of the underlying mechanisms of the system under investigation. As datasets grow even larger, finding the balance between the quality of the approximation and the computing time of the heuristic becomes non-trivial. One solution is to consider parallel methods, and to use the increased computational power to perform a deeper exploration of the solution space in a similar time. It is, however, difficult to estimate a priori whether parallelisation will provide the expected improvement. In this paper we consider a well-known method, genetic algorithms, and evaluate the behaviour of the classic and parallel implementations on two distinct problem types.
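The classic master-slave parallelisation of a genetic algorithm keeps the generational loop unchanged and farms fitness evaluation out to a worker pool. The sketch below uses the OneMax toy objective; with a CPU-bound Python fitness a process pool, rather than the thread pool shown, would be needed for a real speed-up.

```python
import random
from concurrent.futures import ThreadPoolExecutor

random.seed(5)
N = 32   # chromosome length

def fitness(bits):
    return sum(bits)   # OneMax: maximise the number of 1s

def ga(parallel=False, pop_size=40, generations=60):
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    pool = ThreadPoolExecutor(max_workers=4) if parallel else None
    for _ in range(generations):
        # the only difference between the variants is where fitness runs
        scores = (list(pool.map(fitness, pop)) if parallel
                  else [fitness(ind) for ind in pop])
        ranked = [ind for _, ind in
                  sorted(zip(scores, pop), key=lambda t: -t[0])]
        parents = ranked[:pop_size // 2]         # truncation selection
        pop = parents[:]
        while len(pop) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N)         # one-point crossover
            child = a[:cut] + b[cut:]
            child[random.randrange(N)] ^= 1      # single bit-flip mutation
            pop.append(child)
    if pool:
        pool.shutdown()
    return max(fitness(ind) for ind in pop)

seq_best, par_best = ga(parallel=False), ga(parallel=True)
print(seq_best, par_best)
```

Both variants explore the same search dynamics; whether the parallel one pays off depends on how expensive a single fitness evaluation is relative to the coordination overhead, which is precisely the a priori estimation difficulty noted above.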