885 results for Multi-objective analysis


Relevance:

80.00%

Publisher:

Abstract:

A theoretical model allows for the characterization and optimization of the intra-cavity pulse evolution in high-power fiber lasers. Multi-parameter analysis of laser performance can thus be performed at a fraction of the computational cost. © 2010 Optical Society of America.

Relevance:

80.00%

Publisher:

Abstract:

The central goal of this research is to explore the approach of the Islamic banking industry to defining and implementing religious compliance at the regulatory, institutional, and individual levels within the Islamic Banking and Finance (IBF) industry. It also examines the discrepancies, ambiguities and paradoxes exhibited in individual and institutional behaviour in relation to the infusion and enactment of religious exigencies into compliance processes in IBF. Through the combined lenses of institutional work and a sensemaking perspective, this research portrays the practice of infusing Islamic law in Islamic banks as ambiguous and drifting down to the institutional and actor levels. In instances of both well-codified and non-codified regulatory frameworks for Shariah compliance, ambiguity in institutional rules and in their interpretation and enactment was found to be prevalent. Individual IBF professionals performed retrospective and prospective actions to adjust role and rule boundaries in both a Muslim and a non-Muslim country. The sensitizing concept of religious compliance is the primary theoretical contribution of this research; it provides a tool for understanding what constitutes Shariah compliance and the dynamics of its implementation. It helps to explain the empirical consequences of the lack of a clear definition of Shariah compliance in the regulatory frameworks and standards available to the industry. It also addresses the calls for a clear reference on what constitutes Shariah compliance in IBF raised in previous studies (Hayat, Butter, & Kock, 2013; Maurer, 2003, 2012; Pitluck, 2012). The methodological and theoretical perspectives of this research are unique in their use of multi-level analysis and approaches that blend micro and macro perspectives of the research field to illuminate and provide a more complete picture of religious compliance infusion and enactment in IBF.

Relevance:

80.00%

Publisher:

Abstract:

In this paper the effects of introducing novelty search into evolutionary art are explored. Our algorithm combines fitness and novelty metrics to frame image evolution as a multi-objective optimisation problem, promoting the creation of images that are both suitable and diverse. The method is illustrated using two evolutionary art engines for the evolution of figurative objects and context-free design grammars. The results demonstrate the ability of the algorithm to obtain a larger set of fit images than traditional fitness-based evolution, regardless of the engine used.
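
A rough Python sketch of this idea (not the paper's actual engines or metrics) is given below: a placeholder fitness score is combined with a k-nearest-neighbour novelty score, and the non-dominated individuals are kept as parents. The feature vectors and the fitness function are hypothetical.

import numpy as np

def novelty(features, population_features, k=5):
    # Novelty = mean distance to the k nearest neighbours in feature space.
    dists = np.linalg.norm(population_features - features, axis=1)
    return float(np.sort(dists)[1:k + 1].mean())  # skip the zero self-distance

def pareto_front(scores):
    # Indices of non-dominated (fitness, novelty) pairs, maximising both objectives.
    front = []
    for i, s in enumerate(scores):
        dominated = any(
            t[0] >= s[0] and t[1] >= s[1] and (t[0] > s[0] or t[1] > s[1])
            for j, t in enumerate(scores) if j != i
        )
        if not dominated:
            front.append(i)
    return front

rng = np.random.default_rng(0)
population = rng.random((20, 8))                  # placeholder image feature vectors
fitness = population.mean(axis=1)                 # placeholder aesthetic fitness
scores = [(fitness[i], novelty(population[i], population)) for i in range(len(population))]
parents = pareto_front(scores)                    # non-dominated survivors
print(parents)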

Relevance:

80.00%

Publisher:

Abstract:

The article presents a new log merging tool for multi-blade telecommunication systems, based on a newly developed approach. The new tool (the Log Merger) helps engineers build a timeline of process behavior, with a flexible system of information structuring used to assess changes in the analyzed system. Based on experts' experience and analytical skills, the system generates a knowledge base that could be advantageous in the further development of a decision-making expert system. The paper discusses the design and implementation of the Log Merger, its architecture, its multi-board analysis capability, and its application areas. It also presents possible ways of further improving the tool, e.g., extending its functionality to cover additional system platforms. The possibility of adding an analysis module for further expert system development is also considered.
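
The abstract does not describe the merging mechanism itself, but a common way to build a single timeline from several per-board logs is a timestamp-keyed k-way merge. The Python sketch below assumes plain-text logs whose lines begin with an ISO-8601 timestamp; the file names and line format are hypothetical and need not match the Log Merger's actual inputs.

import heapq
from datetime import datetime

def parse(line, board):
    # Assume each line starts with an ISO-8601 timestamp, e.g. "2024-01-01T12:00:00 msg".
    ts, _, msg = line.partition(" ")
    return datetime.fromisoformat(ts), board, msg.rstrip()

def merge_logs(files):
    # Yield (timestamp, board, message) tuples from several per-board logs in time order.
    streams = [
        (parse(line, board) for line in open(path, encoding="utf-8"))
        for board, path in files.items()
    ]
    yield from heapq.merge(*streams, key=lambda record: record[0])

# Hypothetical usage: one log file per blade/board.
# for ts, board, msg in merge_logs({"board0": "board0.log", "board1": "board1.log"}):
#     print(ts.isoformat(), board, msg)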

Relevance:

80.00%

Publisher:

Abstract:

When visual sensor networks are composed of cameras that can adjust the zoom factor of their own lens, one must determine the optimal zoom levels for the cameras for a given task. This gives rise to an important trade-off between the overlap of the different cameras' fields of view, which provides redundancy, and image quality. In an object tracking task, having multiple cameras observe the same area allows for quicker recovery when a camera fails; in contrast, narrow zooms allow for a higher pixel count on regions of interest, leading to increased tracking confidence. In this paper we propose an approach for the self-organisation of redundancy in a distributed visual sensor network, based on decentralised multi-objective online learning that uses only local information to approximate the global state. We explore the impact of different zoom levels on these trade-offs when tasking omnidirectional cameras, which have a full 360-degree view, with keeping track of a varying number of moving objects. We further show how employing decentralised reinforcement learning enables zoom configurations to be achieved dynamically at runtime according to an operator's preference for maximising either the proportion of objects tracked, the confidence associated with tracking, or redundancy in expectation of camera failure. We show that explicitly taking account of the level of overlap, even based only on local knowledge, improves resilience when cameras fail. Our results illustrate the trade-off between maintaining high confidence and object coverage and maintaining redundancy in anticipation of future failure. Our approach provides a fully tunable decentralised method for the self-organisation of redundancy in a changing environment, according to an operator's preferences.
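
As a minimal sketch of the underlying idea (not the paper's exact algorithm), the Python class below uses a scalarised Q-learning update in which an operator-chosen weight w trades off tracking confidence against overlap-based redundancy, using only locally observed quantities; the state encoding and reward terms are assumptions.

import random
from collections import defaultdict

ZOOMS = ["wide", "medium", "narrow"]          # available zoom levels (actions)

class ZoomAgent:
    # One camera node: scalarised multi-objective Q-learning over its local state.

    def __init__(self, w=0.5, alpha=0.1, gamma=0.9, eps=0.1):
        self.w, self.alpha, self.gamma, self.eps = w, alpha, gamma, eps
        self.q = defaultdict(float)           # (state, action) -> estimated value

    def act(self, state):
        if random.random() < self.eps:
            return random.choice(ZOOMS)       # occasional exploration
        return max(ZOOMS, key=lambda a: self.q[(state, a)])

    def update(self, state, action, confidence, overlap, next_state):
        # Operator preference w blends tracking confidence and local overlap (redundancy).
        reward = self.w * confidence + (1.0 - self.w) * overlap
        best_next = max(self.q[(next_state, a)] for a in ZOOMS)
        td_error = reward + self.gamma * best_next - self.q[(state, action)]
        self.q[(state, action)] += self.alpha * td_error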

Relevance:

80.00%

Publisher:

Abstract:

Silviya K. Baeva, Tsvetana Hr. Nedeva - An important aspect of the work of the Ministry of Regional Development and Public Works is Operational Programme "Regional Development", with its priority axis "Sustainable and Integrated Urban Development" under the operation "Improvement of the Physical Environment and Risk Prevention". Eighty-six municipalities are included in this programme. The financial resource of this operation amounts to EUR 238,589,939, of which EUR 202,801,448 is European funding [1]. Each of these 86 municipalities must solve the problem of awarding a public procurement contract to a particular firm under this operation. In essence, this is the problem of holding a municipal tender to select a contractor firm, and the optimal choice of that firm is very important. We formulate the tender problem as a multi-criteria decision-making problem which, through a suitable construction of criteria and methods, can be transformed into a single-criterion optimization problem.
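
A minimal Python sketch of reducing such a multi-criteria tender evaluation to a single-criterion problem by weighted-sum scalarisation is shown below; the criteria, weights, and bids are hypothetical and only illustrate the transformation described above.

# Each bid is scored on criteria normalised to [0, 1] (higher is better).
bids = {
    "Firm A": {"price": 0.70, "quality": 0.90, "deadline": 0.60},
    "Firm B": {"price": 0.85, "quality": 0.75, "deadline": 0.80},
}
weights = {"price": 0.5, "quality": 0.3, "deadline": 0.2}   # chosen so they sum to 1

def score(bid):
    # Weighted sum turns the multi-criteria problem into a single criterion.
    return sum(weights[c] * bid[c] for c in weights)

winner = max(bids, key=lambda name: score(bids[name]))
print(winner, round(score(bids[winner]), 3))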

Relevance:

80.00%

Publisher:

Abstract:

This thesis chronicles the design and implementation of an Internet/Intranet and database based application for the quality control of hurricane surface wind observations. A quality control session consists of selecting the desired observation types to be viewed and determining a storm-track-based time window for viewing the data. All observations of the selected types are then plotted in a storm-relative view for the chosen time window, and geography is positioned for the storm-center time about which an objective analysis can be performed. Users then make decisions about data validity through visual nearest-neighbor comparison and inspection. The project employed an object-oriented iterative development method from beginning to end, and its implementation primarily features the Java programming language.
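
The application itself is implemented in Java, but the storm-relative transform it relies on can be sketched briefly. The Python snippet below interpolates the storm-centre position from the track to each observation time and returns flat-earth offsets; the function name and the kilometre-per-degree approximation are illustrative assumptions, not the thesis' code.

import numpy as np

def storm_relative(obs_times, obs_lats, obs_lons, track_times, track_lats, track_lons):
    # Offsets (km) of each observation from the interpolated storm centre at its time.
    lat_c = np.interp(obs_times, track_times, track_lats)    # centre latitude at obs time
    lon_c = np.interp(obs_times, track_times, track_lons)    # centre longitude at obs time
    dy = (np.asarray(obs_lats) - lat_c) * 111.0               # ~111 km per degree latitude
    dx = (np.asarray(obs_lons) - lon_c) * 111.0 * np.cos(np.radians(lat_c))
    return dx, dy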

Relevance:

80.00%

Publisher:

Abstract:

An emergency is a deviation from a planned course of events that endangers people, property, or the environment. It can be described as an unexpected event that causes economic damage, destruction, and human suffering. When a disaster happens, emergency managers are expected to have a response plan for the most likely disaster scenarios. Unlike earthquakes and terrorist attacks, a hurricane response plan can be activated ahead of time, since a hurricane is predicted at least five days before it makes landfall. This research looked into the logistics aspects of the problem, in an attempt to develop a hurricane relief distribution network model. We addressed the problem of how to efficiently and effectively deliver basic relief goods to victims of a hurricane disaster: specifically, where to preposition State Staging Areas (SSAs), which Points of Distribution (PODs) to activate, and how to allocate commodities to each POD. Previous research has addressed several of these issues, but without incorporating the random behavior of the hurricane's intensity and path. This research presents a stochastic meta-model that deals with the location of SSAs and the allocation of commodities. The novelty of the model is that it treats the strength and path of the hurricane as stochastic processes and models them as discrete Markov chains. The demand is also treated as a stochastic parameter because it depends on the stochastic behavior of the hurricane. For the meta-model, however, the demand is an input determined using Hazards United States (HAZUS), software developed by the Federal Emergency Management Agency (FEMA) that estimates losses due to hurricanes and floods. A solution heuristic was developed based on simulated annealing. Since the meta-model is a multi-objective problem, the heuristic is a multi-objective simulated annealing (MOSA), in which the initial solution and the cooling rate were determined via a design of experiments. The experiment showed that the initial temperature (T0) is irrelevant, but that the temperature reduction (δ) must be very gradual. Assessment of the meta-model indicates that the Markov chains performed as well as or better than forecasts made by the National Hurricane Center (NHC). Tests of the MOSA showed that it provides solutions in an efficient manner, and an illustrative example shows that the meta-model is practical.
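
As a hedged illustration of the heuristic named above, the Python sketch below shows a generic multi-objective simulated annealing loop with dominance-based archiving and geometric cooling (the temperature is multiplied by delta each iteration); the acceptance rule and the neighbour function are generic placeholders, not the thesis' exact MOSA.

import math
import random

def dominates(a, b):
    # True if objective vector a dominates b (all objectives are minimised).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def mosa(initial, neighbour, objectives, t0=100.0, delta=0.99, iters=5000):
    # Multi-objective simulated annealing sketch: keep an archive of non-dominated solutions.
    current, temp = initial, t0
    archive = [(objectives(initial), initial)]
    for _ in range(iters):
        candidate = neighbour(current)
        f_cur, f_cand = objectives(current), objectives(candidate)
        worsening = sum(max(0.0, c - b) for c, b in zip(f_cand, f_cur))
        if dominates(f_cand, f_cur) or random.random() < math.exp(-worsening / temp):
            current = candidate
            if not any(dominates(f, f_cand) for f, _ in archive):
                archive = [(f, s) for f, s in archive if not dominates(f_cand, f)]
                archive.append((f_cand, candidate))
        temp *= delta                          # gradual temperature reduction
    return archive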

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this thesis was to identify the optimal design parameters for a jet nozzle that obtains a local maximum shear stress while maximizing the average shear stress on the floor of a fluid-filled system. This research examined how geometric parameters of a jet nozzle, such as the nozzle's angle, height, and orifice, influence the shear stress created on the bottom surface of a tank. Simulations were run using a Computational Fluid Dynamics (CFD) software package to determine shear stress values for a parameterized geometric domain including the jet nozzle. A response surface was created based on the shear stress values obtained from 112 simulated designs. Multi-objective optimization software used the response surface to generate designs with the best combination of parameters to achieve maximum shear stress and maximum average shear stress. The optimal configuration of parameters achieved larger shear stress values than a commercially available design.
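
A minimal sketch of the response-surface step is given below: a quadratic surrogate is fitted by least squares to (angle, height, orifice) design points and then queried for new designs. The design ranges and shear-stress values are random placeholders, not the CFD results from the thesis.

import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform([10.0, 5.0, 1.0], [60.0, 30.0, 5.0], size=(112, 3))   # 112 simulated designs
y = rng.uniform(0.5, 3.0, size=112)                                    # placeholder shear stress [Pa]

def quad_features(designs):
    # [1, x1, x2, x3, x1^2, x2^2, x3^2, x1*x2, x1*x3, x2*x3] for each design point.
    x1, x2, x3 = designs.T
    return np.column_stack([np.ones(len(designs)), x1, x2, x3,
                            x1 ** 2, x2 ** 2, x3 ** 2, x1 * x2, x1 * x3, x2 * x3])

coeffs, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)          # least-squares fit

def predict(design):
    # Surrogate shear stress for a candidate (angle, height, orifice) design.
    return quad_features(np.atleast_2d(np.asarray(design, dtype=float))) @ coeffs

print(predict([30.0, 15.0, 2.5]))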

Relevance:

80.00%

Publisher:

Abstract:

Knowledge of cell electronics has led to its integration into medicine, either by physically interfacing electronic devices with biological systems or by using electronics for both detection and characterization of biological materials. In this dissertation, an electrical impedance sensor (EIS) was used to measure electrode surface impedance changes from cell samples in order to assess the human and environmental toxicity of nanoscale materials in 2D and 3D cell culture models. The impedimetric response of human lung fibroblasts and rainbow trout gill epithelial cells exposed to various nanomaterials was tested to determine their kinetic effects on the cells and to demonstrate the biosensor's ability to monitor nanotoxicity in real time. Further, the EIS allowed rapid, real-time, multi-sample analysis, creating a versatile, noninvasive tool able to provide quantitative information with respect to alterations in cellular function. We then extended the application of the unique capabilities of the EIS to real-time analysis of cancer cell responses to externally applied alternating electric fields at different intermediate frequencies and low intensity. Decreases in the growth profiles of the ovarian and breast cancer cells were observed with the application of 200 and 100 kHz fields, respectively, indicating specific inhibitory effects on dividing cells in culture, in contrast to the non-cancerous HUVECs and mammary epithelial cells. We then sought to enhance the effects of the electric field by altering the cancer cells' electronegative membrane properties with HER2-antibody-functionalized nanoparticles. An Annexin V/EthD-III assay and zeta potential measurements were performed to determine the cell death mechanism, indicating apoptosis and a decrease in zeta potential with the incorporation of the nanoparticles. With more negatively charged HER2-AuNPs attached to the cancer cell membrane, the decrease in membrane potential would leave the cells more vulnerable to the detrimental effects of the applied electric field due to the decrease in surface charge. Therefore, by altering the cell membrane potential, one could possibly control the fate of the cell. This whole-cell-based biosensor will enhance our understanding of the responsiveness of cancer cells to electric field therapy and demonstrate potential therapeutic opportunities for electric field therapy in the treatment of cancer.

Relevance:

80.00%

Publisher:

Abstract:

Mapping vegetation patterns over large extents using remote sensing methods requires field sample collection for two different purposes: (1) the establishment of plant association classification systems from samples of relative abundance estimates; and (2) training for supervised image classification and accuracy assessment of satellite-data-derived maps. One challenge for both procedures is establishing confidence in the results and performing analysis across multiple spatial scales. Continuous data sets that enable cross-scale studies are very time-consuming and expensive to acquire, and such extensive field sampling can be invasive. The use of high-resolution aerial photography (hrAP) offers an alternative to extensive, invasive field sampling and can provide large-volume, spatially continuous reference information that can meet the challenges of confidence building and multi-scale analysis.

Relevance:

80.00%

Publisher:

Abstract:

Many classical as well as modern optimization techniques exist. One such modern method, belonging to the field of swarm intelligence, is termed ant colony optimization. This relatively new concept in optimization involves the use of artificial ants and is inspired by the way real ants search for food. In this thesis, a novel ant colony optimization technique for continuous domains was developed. The goal was to provide improvements in computing time and robustness compared to other optimization algorithms. Optimization function spaces can have extreme topologies and are therefore difficult to optimize. The proposed method effectively searched the domain and solved difficult single-objective optimization problems. The developed algorithm was run on numerous classic test cases for both single- and multi-objective problems. The results demonstrate that the method is robust and stable, and that the number of objective function evaluations is comparable to other optimization algorithms.
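
The abstract does not specify the algorithm's internals, so the Python sketch below shows a generic continuous ant colony optimisation scheme (in the spirit of ACO_R): each ant samples a new solution from a Gaussian centred on a rank-weighted archive member. Parameter names and values are illustrative assumptions, not the thesis' settings.

import numpy as np

def aco_continuous(f, bounds, n_ants=20, archive_size=10, q=0.5, xi=0.85, iters=200, seed=0):
    # Sample new solutions from Gaussians centred on archived solutions (rank-weighted).
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = len(lo)
    archive = rng.uniform(lo, hi, size=(archive_size, dim))
    values = np.array([f(x) for x in archive])
    ranks = np.arange(archive_size)
    weights = np.exp(-ranks ** 2 / (2.0 * (q * archive_size) ** 2))
    weights /= weights.sum()
    for _ in range(iters):
        order = np.argsort(values)                         # best solutions first
        archive, values = archive[order], values[order]
        ants = np.empty((n_ants, dim))
        for a in range(n_ants):
            k = rng.choice(archive_size, p=weights)        # pick a guiding archive solution
            sigma = xi * np.mean(np.abs(archive - archive[k]), axis=0)
            ants[a] = np.clip(rng.normal(archive[k], sigma + 1e-12), lo, hi)
        ant_values = np.array([f(x) for x in ants])
        pool = np.vstack([archive, ants])
        pool_values = np.concatenate([values, ant_values])
        keep = np.argsort(pool_values)[:archive_size]      # retain the best solutions
        archive, values = pool[keep], pool_values[keep]
    return archive[0], values[0]

# Example: minimise the two-dimensional sphere function.
best_x, best_f = aco_continuous(lambda x: float(np.sum(x ** 2)), [(-5.0, 5.0), (-5.0, 5.0)])
print(best_x, best_f)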

Relevance:

80.00%

Publisher:

Abstract:

This thesis chronicles the design and implementation of an Internet/Intranet and database based application for the quality control of hurricane surface wind observations. A quality control session consists of selecting the desired observation types to be viewed and determining a storm-track-based time window for viewing the data. All observations of the selected types are then plotted in a storm-relative view for the chosen time window, and geography is positioned for the storm-center time about which an objective analysis can be performed. Users then make decisions about data validity through visual nearest-neighbor comparison and inspection. The project employed an object-oriented iterative development method from beginning to end, and its implementation primarily features the Java programming language.

Relevance:

80.00%

Publisher:

Abstract:

Thirty-seven deep-sea sediment cores from the Arabian Sea were studied geochemically (49 major and trace elements) for four time slices during the Holocene and the last glacial, and in one high-sedimentation-rate core (century-scale resolution), to detect tracers of past variations in the intensity of the atmospheric monsoon circulation and its hydrographic expression in the ocean surface. This geochemical multi-tracer approach, coupled with additional information on the grain size composition of the clastic fraction, the bulk carbonate and the biogenic opal contents, makes it possible to characterize the sedimentological regime in detail. Sediments characterized by a specific elemental composition (enrichment) originated from the following sources: river suspensions from the Tapti and Narbada, draining the Indian Deccan Traps (Ti, Sr); Indus sediments and dust from Rajasthan and Pakistan (Rb, Cs); dust from Iran and the Persian Gulf (Al, Cr); dust from central Arabia (Mg); and dust from East Africa and the Red Sea (Zr/Hf, Ti/Al). Corg, Cd, Zn, Ba, Pb, U, and the HREE are associated with the intensity of upwelling in the western Arabian Sea, but only those patterns that are consistently reproduced by all of these elements can be directly linked with the intensity of the southwest monsoon. Relying on information from a single element can be misleading, as each element is affected by processes other than upwelling intensity and the nutrient content of surface water alone. The application of the geochemical multi-tracer approach indicates that the intensity of the southwest monsoon was low during the LGM, declined to a minimum from 15,000 to 13,000 14C years BP, intensified slightly at the end of this interval, was almost stable during the Bölling, Alleröd and Younger Dryas, and then intensified in two abrupt successions at the end of the Younger Dryas (9,900 14C years BP) and especially in a second event during the early Holocene (8,800 14C years BP). Dust discharge by northwesterly winds from Arabia exhibited a similar evolution, but followed an opposite course: it was high during the LGM, with two primary sources, the central Arabian desert and the dry Persian Gulf region. Dust discharge from both regions reached a pronounced maximum at 15,000 to 13,000 14C years BP. At the end of this interval, however, the dust plumes from the Persian Gulf area ceased dramatically, whereas dust discharge from central Arabia decreased only slightly. Dust discharge from East Africa and the Red Sea increased synchronously with the two major events of southwest monsoon intensification recorded in the nutrient content of surface waters. In addition to the tracers of past dust flux and surface water nutrient content, the geochemical multi-tracer approach provides information on the history of deep-sea ventilation (Mo, S), which was much lower during the last glacial maximum than during the Holocene. The multi-tracer approach, i.e. a few sedimentological parameters plus a set of geochemical tracers widely available from various multi-element analysis techniques, is highly applicable for studying the complex sedimentation patterns of an ocean basin and, specifically in the case of the Arabian Sea, can even reveal the seasonal structure of climate change.

Relevance:

80.00%

Publisher:

Abstract:

A uniform chronology for foraminifera-based sea surface temperature records has been established in more than 120 sediment cores obtained from the equatorial and eastern Atlantic up to the Arctic Ocean. The chronostratigraphy of the last 30,000 years is based mainly on published δ18O records and 14C ages from accelerator mass spectrometry, converted into calendar-year ages. The high-precision age control provides the database necessary for a uniform reconstruction of the climate interval of the Last Glacial Maximum within the GLAMAP-2000 project.