876 results for Multi objective evolutionary algorithms


Relevance:

30.00%

Publisher:

Abstract:

The research project takes place within the technology acceptability framework, which tries to understand the use made of new technologies, and concentrates more specifically on the factors that influence the acceptance of multi-touch devices (MTD) and the intention to use them. Why be interested in MTD? Nowadays, this technology is used in all kinds of human activities, e.g. leisure, study or work activities (Rogowski and Saeed, 2012). However, handling and data entry by means of gestures on a multi-touch-sensitive screen impose a number of constraints and consequences which remain mostly unknown (Park and Han, 2013). Currently, little research in ergonomic psychology has examined the implications of these new human-computer interactions for task fulfillment. This research project aims to investigate the cognitive, sensorimotor and motivational processes at play during the use of such devices. The project will analyze the influence of the use of gestures and of the type of gesture used, simple or complex (Lao, Heng, Zhang, Ling, and Wang, 2009), as well as of the feeling of personal self-efficacy with MTD, on task engagement, attention mechanisms and perceived disorientation (Chen, Linen, Yen, and Linn, 2011) when confronted with MTD. For that purpose, the above-mentioned concepts will be measured in a usability laboratory (U-Lab) with self-reported methods (questionnaires) and objective indicators (physiological measures, eye tracking). Globally, the research aims to understand the processes at stake, as well as the advantages and drawbacks of this new technology, in order to promote better compatibility between gestures, executed tasks and MTD. The conclusions will allow recommendations for the use of MTD in specific contexts (e.g. learning contexts).

Relevance:

30.00%

Publisher:

Abstract:

The paper describes the design of an efficient and robust genetic algorithm for the nuclear fuel loading problem (i.e., refuelling: the in-core fuel management problem), a complex combinatorial, multimodal optimisation problem. Evolutionary computation as performed by FUELGEN replaces heuristic search of the kind performed by the FUELCON expert system (CAI 12/4) to solve the same problem. In contrast to the traditional genetic algorithm, which places strong requirements on the representation used and its parameter settings in order to be efficient, recent research on new, robust genetic algorithms shows that representations unsuitable for the traditional genetic algorithm can still be used to good effect with little parameter adjustment. The representation presented here is a simple symbolic one with no linkage attributes, making the genetic algorithm particularly easy to apply to fuel loading problems with differing core structures and assembly inventories. A nonlinear fitness function has been constructed to direct the search efficiently in the presence of the many local optima that result from the constraint on solutions.
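The abstract only outlines FUELGEN's design, but the ingredients it names, a simple symbolic representation with no linkage attributes and a nonlinear, penalty-shaped fitness, can be sketched generically. The following is a minimal illustration, not the paper's algorithm; the core size, assembly labels, fitness terms and GA parameters are hypothetical placeholders.

```python
import random

# Hypothetical placeholders: a real fuel-loading problem would supply the
# core geometry, the assembly inventory and a physics-based evaluator.
CORE_POSITIONS = 20                     # loading positions in the core
ASSEMBLY_TYPES = ["A", "B", "C", "D"]   # symbolic alleles, no linkage attributes

def random_individual():
    # Simple symbolic representation: one assembly label per core position.
    return [random.choice(ASSEMBLY_TYPES) for _ in range(CORE_POSITIONS)]

def fitness(ind):
    # Stand-in nonlinear fitness: reward diversity of neighbouring assemblies
    # and penalise a toy constraint violation quadratically, steering the
    # search away from the local optima the constraint creates.
    diversity = sum(a != b for a, b in zip(ind, ind[1:]))
    violation = max(0, ind.count("A") - CORE_POSITIONS // 2)
    return diversity - violation ** 2

def crossover(p1, p2):
    cut = random.randrange(1, CORE_POSITIONS)
    return p1[:cut] + p2[cut:]

def mutate(ind, rate=0.05):
    return [random.choice(ASSEMBLY_TYPES) if random.random() < rate else g
            for g in ind]

def evolve(pop_size=50, generations=100):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]               # truncation selection
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)
```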

Relevance:

30.00%

Publisher:

Abstract:

In many practical situations, batching of similar jobs to avoid setups is performed while constructing a schedule. This paper addresses the problem of non-preemptively scheduling independent jobs in a two-machine flow shop with the objective of minimizing the makespan. Jobs are grouped into batches. A sequence independent batch setup time on each machine is required before the first job is processed, and when a machine switches from processing a job in some batch to a job of another batch. Besides its practical interest, this problem is a direct generalization of the classical two-machine flow shop problem with no grouping of jobs, which can be solved optimally by Johnson's well-known algorithm. The problem under investigation is known to be NP-hard. We propose two O(n log n) time heuristic algorithms. The first heuristic, which creates a schedule with minimum total setup time by forcing all jobs in the same batch to be sequenced in adjacent positions, has a worst-case performance ratio of 3/2. By allowing each batch to be split into at most two sub-batches, a second heuristic is developed which has an improved worst-case performance ratio of 4/3. © 1998 The Mathematical Programming Society, Inc. Published by Elsevier Science B.V.
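For reference, the classical two-machine flow shop that this problem generalises is solved exactly by Johnson's rule: sequence first, in non-decreasing order of machine-1 time, the jobs that are faster on machine 1, then, in non-increasing order of machine-2 time, the remaining jobs. A minimal sketch with hypothetical job data:

```python
def johnson_order(jobs):
    """Johnson's rule for the two-machine flow shop F2 || Cmax.

    jobs: list of (p1, p2) processing times on machines 1 and 2.
    Returns job indices in a makespan-optimal order.
    """
    first = sorted((i for i, (p1, p2) in enumerate(jobs) if p1 <= p2),
                   key=lambda i: jobs[i][0])               # ascending on machine 1
    last = sorted((i for i, (p1, p2) in enumerate(jobs) if p1 > p2),
                  key=lambda i: jobs[i][1], reverse=True)  # descending on machine 2
    return first + last

def makespan(jobs, order):
    # Simulate the two machines to obtain Cmax for a given sequence.
    t1 = t2 = 0
    for i in order:
        p1, p2 = jobs[i]
        t1 += p1                   # machine 1 finishes job i
        t2 = max(t2, t1) + p2      # machine 2 starts once both are ready
    return t2

# Hypothetical example data.
jobs = [(3, 6), (5, 2), (1, 2), (6, 6), (7, 4)]
order = johnson_order(jobs)
print(order, makespan(jobs, order))
```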

Relevance:

30.00%

Publisher:

Abstract:

The issues surrounding the collision of projectiles with structures have gained a high profile since the events of 11th September 2001. In such collision problems, the projectile penetrates the structure, so that tracking the interface between one material and another becomes very complex, especially if the projectile is essentially a vessel containing a fluid, e.g. a fuel load. The subsequent combustion, heat transfer, melting and re-solidification processes in the structure render this a very challenging computational modelling problem. The conventional approaches to the analysis of collision processes involve a Lagrangian-Lagrangian contact-driven methodology. This approach suffers from a number of disadvantages in its implementation, most of which are associated with the challenges of the contact analysis component of the calculations. This paper describes a 'two fluid' approach to high-speed impact between solid structures, where the objective is to overcome the problems of penetration and re-meshing. The work has been carried out using the finite volume, unstructured mesh multi-physics code PHYSICA+, where the three-dimensional fluid flow, free surface, heat transfer, combustion, melting and re-solidification algorithms are approximated using cell-centred finite volume, unstructured mesh techniques on a collocated mesh. The basic procedure is illustrated for two cases of Newtonian and non-Newtonian flow to test several of its component capabilities in the analysis of problems of industrial interest.

Relevance:

30.00%

Publisher:

Abstract:

FEA and CFD analysis is becoming ever more complex, with an emerging demand for simulation software technologies that can address ranges of problems involving combinations of interactions amongst varying physical phenomena over a variety of time and length scales. Computational modelling of such problems requires software technologies that enable the representation of these complex suites of 'physical' interactions. This functionality requires the structuring of simulation modules for specific physical phenomena so that the coupling can be effectively represented. These 'multi-physics' and 'multi-scale' computations are very compute intensive, so the simulation software must operate effectively in parallel if it is to be used in this context. Of course, the objective of 'multi-physics' and 'multi-scale' simulation is the optimal design of engineered systems, so optimisation is an important feature of such classes of simulation. In this presentation, a multi-disciplinary approach to simulation-based optimisation is described, with some key examples of application to challenging engineering problems.

Relevance:

30.00%

Publisher:

Abstract:

The objective of this work is to present a new scheme for temperature-solute coupling in a solidification model, where the temperature and concentration fields simultaneously satisfy the macro-scale transport equations and, in the mushy region, meet the constraints imposed by the thermodynamics and the local-scale processes. A step-by-step explanation of the macrosegregation algorithm, implemented in the finite volume unstructured mesh multi-physics modelling code PHYSICA, is initially presented, and the proposed scheme is then validated against experimental results obtained by Krane for binary and ternary alloys.

Relevance:

30.00%

Publisher:

Abstract:

The graph-partitioning problem is to divide a graph into several pieces so that the number of vertices in each piece is the same within some defined tolerance and the number of cut edges is minimised. Important applications of the problem arise, for example, in parallel processing, where data sets need to be distributed across the memory of a parallel machine. Very effective heuristic algorithms have been developed for this problem which run in real time, but it is not known how good the resulting partitions are, since the problem is, in general, NP-complete. This paper reports an evolutionary search algorithm for finding benchmark partitions. A distinctive feature is the use of a multilevel heuristic algorithm to provide an effective crossover. The technique is tested on several example graphs, and it is demonstrated that our method can achieve partitions of extremely high quality, significantly better than those found by state-of-the-art graph-partitioning packages.
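The two quantities such a search trades off, the edge cut being minimised and the balance tolerance being respected, are straightforward to evaluate. A minimal sketch with a hypothetical graph (not the paper's benchmark instances):

```python
from collections import Counter

def cut_size(edges, part):
    # Number of edges whose endpoints lie in different parts.
    return sum(part[u] != part[v] for u, v in edges)

def imbalance(part, k):
    # Largest part size relative to the ideal size n/k; 1.0 is perfectly balanced.
    sizes = Counter(part.values())
    ideal = len(part) / k
    return max(sizes.values()) / ideal

# Hypothetical 6-vertex graph split into k = 2 parts.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (1, 4)]
part = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}
print(cut_size(edges, part), imbalance(part, 2))  # -> 3 1.0
```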

Relevance:

30.00%

Publisher:

Abstract:

We consider a variety of preemptive scheduling problems with controllable processing times on a single machine and on identical/uniform parallel machines, where the objective is to minimize the total compression cost. In this paper, we propose fast divide-and-conquer algorithms for these scheduling problems. Our approach is based on the observation that each scheduling problem we discuss can be formulated as a polymatroid optimization problem. We develop a novel divide-and-conquer technique for the polymatroid optimization problem and then apply it to each scheduling problem. We show that each scheduling problem can be solved in $O(T_{\mathrm{feas}}(n) \log n)$ time by using our divide-and-conquer technique, where $n$ is the number of jobs and $T_{\mathrm{feas}}(n)$ denotes the time complexity of the corresponding feasible scheduling problem with $n$ jobs. This approach yields faster algorithms for most of the scheduling problems discussed in this paper.
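As a hedged illustration of where a bound of this shape typically comes from (the paper's own analysis may differ): if the recursion splits the job set evenly and the feasibility cost $T_{\mathrm{feas}}$ is superadditive, then level $i$ of the recursion tree holds $2^i$ subproblems of size $n/2^i$, and each level's total cost is dominated by one full-size feasibility computation:

```latex
% Illustrative accounting, not the paper's proof. Assume superadditivity:
% T_feas(a) + T_feas(b) <= T_feas(a + b). Summing over the recursion tree,
\[
  T(n) \;\le\; \sum_{i=0}^{\lceil \log_2 n \rceil}
      2^{i}\, T_{\mathrm{feas}}\!\left(\frac{n}{2^{i}}\right)
  \;\le\; \sum_{i=0}^{\lceil \log_2 n \rceil} T_{\mathrm{feas}}(n)
  \;=\; O\!\left(T_{\mathrm{feas}}(n)\,\log n\right).
\]
```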

Relevance:

30.00%

Publisher:

Abstract:

Image inpainting refers to restoring a damaged image with missing information. The total variation (TV) inpainting model is one such method: it simultaneously fills in the damaged regions with available information from their surroundings and eliminates noise. The method works well with small, narrow inpainting domains. However, there remains an urgent need to develop fast iterative solvers, as the underlying problem sizes are large. In addition, one needs to tackle the imbalance between the inpainting and denoising results. When the inpainting regions are thick and large, the inpainting procedure works quite slowly, usually requires a significant number of iterations, and inevitably leads to oversmoothing outside the inpainting domain. To overcome these difficulties, we propose a solution for the TV inpainting method based on a nonlinear multigrid algorithm.
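For context, the TV inpainting model referred to here is usually written (in the Chan-Shen form; the abstract does not state the functional explicitly) as a single energy whose fidelity term acts only where data exist:

```latex
% D is the damaged (inpainting) region inside the image domain \Omega,
% f the observed image on \Omega \setminus D, \lambda the fidelity weight.
\[
  \min_{u}\; \int_{\Omega} |\nabla u|\, dx
  \;+\; \frac{\lambda}{2} \int_{\Omega \setminus D} (u - f)^{2}\, dx .
\]
% Outside D the second term denoises; inside D only the TV term acts,
% propagating level lines into the hole -- which is why thick, large D
% converge slowly and invite oversmoothing outside the hole.
```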

Relevance:

30.00%

Publisher:

Abstract:

Seasonal and inter-annual variations in phytoplankton community abundance in the Bay of Biscay are studied. Sea-viewing Wide Field-of-View Sensor (SeaWiFS), Moderate Resolution Imaging Spectroradiometer (MODIS), and Coastal Zone Color Scanner (CZCS) data, preliminarily processed by the National Aeronautics and Space Administration (NASA) to yield the normalized water-leaving radiance and the top-of-the-atmosphere solar radiance, are further supplied to our dedicated retrieval algorithms to infer the sought-for parameters. Using the National Oceanic and Atmospheric Administration's (NOAA's) Advanced Very High Resolution Radiometer (AVHRR) data, the surface reflection coefficient in the instrument's only visible-spectrum band is derived and employed for analysis. Decadal bridged time series of variations of diatom-dominated phytoplankton and the green dinoflagellate Lepidodinium chlorophorum within the shelf zone, and of the coccolithophore Emiliania huxleyi in the pelagic area of the Bay, are documented and analysed in terms of the impacts of some biogeochemical and geophysical forcing factors.

Relevance:

30.00%

Publisher:

Abstract:

Satellite-derived remote-sensing reflectance (Rrs) can be used for mapping biogeochemically relevant variables, such as the chlorophyll concentration and the Inherent Optical Properties (IOPs) of the water, at global scale for use in climate-change studies. Prior to generating such products, suitable algorithms have to be selected that are appropriate for the purpose. Algorithm selection needs to account for both qualitative and quantitative requirements. In this paper we develop an objective methodology designed to rank the quantitative performance of a suite of bio-optical models. The objective classification is applied using the NASA bio-Optical Marine Algorithm Dataset (NOMAD). Using in situ Rrs as input to the models, the performance of eleven semi-analytical models, as well as five empirical chlorophyll algorithms and an empirical diffuse-attenuation-coefficient algorithm, is ranked for spectrally-resolved IOPs, chlorophyll concentration and the diffuse attenuation coefficient at 489 nm. The sensitivity of the objective classification and the uncertainty in the ranking are tested using a Monte-Carlo approach (bootstrapping). Results indicate that the performance of the semi-analytical models varies depending on the product and wavelength of interest. For chlorophyll retrieval, empirical algorithms generally perform better than semi-analytical models. The performance of these empirical models reflects either their immunity to scale errors or instrument noise in the Rrs data, or simply that the data used for model parameterisation were not independent of NOMAD. Nonetheless, the uncertainty in the classification suggests that the performance of some semi-analytical algorithms at retrieving chlorophyll is comparable with that of the empirical algorithms. For phytoplankton absorption at 443 nm, some semi-analytical models also perform with similar accuracy to an empirical model. We discuss the potential biases, limitations and uncertainty in the approach, as well as additional qualitative considerations for algorithm selection for climate-change studies. Our classification has the potential to be routinely implemented, so that the performance of emerging algorithms can be compared with existing algorithms as they become available. In the long term, such an approach will further aid algorithm development for ocean-colour studies.
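The bootstrapped ranking can be illustrated in outline. The score, residuals and interval choices below are hypothetical stand-ins, not the paper's actual statistics over the NOMAD match-ups:

```python
import random
import statistics

def bootstrap_score(errors, n_boot=1000, seed=0):
    """Bootstrap the mean absolute error of one model's retrievals.

    errors: per-sample model-minus-in-situ differences (hypothetical units).
    Returns the median resampled score and a 90% interval.
    """
    rng = random.Random(seed)
    n = len(errors)
    scores = sorted(
        statistics.fmean(abs(rng.choice(errors)) for _ in range(n))
        for _ in range(n_boot)
    )
    return scores[n_boot // 2], (scores[int(0.05 * n_boot)],
                                 scores[int(0.95 * n_boot)])

# Hypothetical residuals for two competing chlorophyll algorithms: if the
# bootstrap intervals overlap, the ranking between them is uncertain.
model_a = [0.10, -0.20, 0.05, 0.30, -0.10, 0.15]
model_b = [0.20, -0.40, 0.10, 0.50, -0.30, 0.25]
for name, errs in (("A", model_a), ("B", model_b)):
    med, (lo, hi) = bootstrap_score(errs)
    print(f"model {name}: score {med:.3f} (90% interval {lo:.3f}-{hi:.3f})")
```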

Relevance:

30.00%

Publisher:

Abstract:

Assigning uncertainty to ocean-colour satellite products is a requirement for allowing informed use of these data. Here, uncertainty estimates are derived using the comparison, on a 12th-degree grid, of coincident daily records of the remote-sensing reflectance RRS obtained with the same processing chain from three satellite missions: MERIS, MODIS and SeaWiFS. The approach is spatially resolved and produces σ, the part of the RRS uncertainty budget associated with random effects. The global average of σ decreases with wavelength, from approximately 0.7–0.9 × 10⁻³ sr⁻¹ at 412 nm to 0.05–0.1 × 10⁻³ sr⁻¹ in the red band, with uncertainties on σ evaluated as 20–30% between 412 and 555 nm, and 30–40% at 670 nm. The distribution of σ shows restricted spatial variability and small variations with season, which makes the multi-annual global distribution of σ an estimate applicable to all retrievals of the considered missions. The comparison of σ with other uncertainty estimates derived from field data or with the support of algorithms provides a consistent picture. When translated into relative terms, and assuming a relatively low bias, the distribution of σ suggests that the objective of a 5% uncertainty is fulfilled between 412 and 490 nm for oligotrophic waters (chlorophyll-a concentration below 0.1 mg m⁻³). This study also provides comparison statistics. Spectrally, the mean absolute relative difference between RRS from different missions shows a characteristic U-shape, with both ends, at blue and red wavelengths, inversely related to the amplitude of RRS. On average, and for the considered data sets, SeaWiFS RRS tends to be slightly higher than MODIS RRS, which in turn appears higher than MERIS RRS. Biases between mission-specific RRS may exhibit a seasonal dependence, particularly in the subtropical belt.
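The abstract does not spell out the estimator, but a standard identity for recovering per-mission random uncertainty from pairwise differences of three collocated data sets (triple collocation) illustrates how such a σ can be isolated:

```latex
% Hypothetical illustration, not necessarily the paper's estimator.
% Model each mission's retrieval as x_i = t + e_i with independent,
% zero-mean random errors e_i, so that
% V_{ij} = \mathrm{Var}(x_i - x_j) = \sigma_i^2 + \sigma_j^2. Then
\[
  \sigma_1^{2} = \tfrac{1}{2}\left(V_{12} + V_{13} - V_{23}\right),
\]
% and cyclically for \sigma_2^2 and \sigma_3^2.
```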

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a multi-language framework for FPGA hardware development which aims to satisfy the dual requirements of high-level hardware design and efficient hardware implementation. The central idea of this framework is the integration of different hardware languages in a way that harnesses the best features of each. This is illustrated in this paper by the integration of two hardware languages: HIDE, a structured hardware language which provides more abstract and elegant hardware descriptions and compositions than are possible in traditional hardware description languages such as VHDL or Verilog, and Handel-C, an ANSI C-like hardware language which allows software and hardware engineers alike to target FPGAs from high-level algorithmic descriptions. On the one hand, HIDE has proven to be very successful in the description and generation of highly optimised parameterisable FPGA circuits from geometric descriptions. On the other hand, Handel-C has proven to be very successful in the rapid design and prototyping of FPGA circuits from algorithmic application descriptions. The proposed integrated framework hence harnesses HIDE for the generation of highly optimised circuits for the regular parts of algorithms, while Handel-C is used as a top-level design language from which HIDE functionality is dynamically invoked. The overall message of this paper is that there need not be an exclusive choice between different hardware design flows. Rather, an integrated framework in which different design flows can seamlessly interoperate should be adopted. Although the idea might seem simple prima facie, it could have serious implications for the design of future generations of hardware languages.

Relevance:

30.00%

Publisher:

Abstract:

A well-cited paper suggesting fuzzy coding as an alternative to the conventional binary, Gray and floating-point representations used in genetic algorithms.
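A hypothetical illustration of the idea (not the paper's exact scheme): a fuzzy-coded gene stores membership degrees over linguistic values of a parameter and is decoded to a crisp value by weighted defuzzification, rather than being read as a binary, Gray or floating-point number:

```python
# Hypothetical fuzzy coding of one parameter: the gene holds membership
# degrees over three linguistic values, decoded by a centre-of-gravity
# style weighted average (the centres below are illustrative).
CENTRES = {"low": 0.1, "medium": 0.5, "high": 0.9}

def decode(gene):
    """gene: dict mapping linguistic value -> membership degree in [0, 1]."""
    total = sum(gene.values())
    return sum(CENTRES[k] * m for k, m in gene.items()) / total

gene = {"low": 0.2, "medium": 0.7, "high": 0.1}
print(decode(gene))  # crisp parameter value, approximately 0.46
```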