892 results for Multi-Criteria Optimization


Relevance:

30.00%

Publisher:

Abstract:

Purpose: To determine the activity and tolerability of SAM486A, an inhibitor of S-adenosylmethionine decarboxylase (SAMDC), in patients with metastatic melanoma who had not received prior chemotherapy. Selected patients were offered participation in two sub-studies examining early changes in tumor metabolism with FDG-PET and changes in tumor polyamine content. Patients and methods: Fifteen patients with measurable metastatic melanoma, normal cardiac function, and no known CNS metastases were eligible and received SAM486A by 1-hour IV infusion daily for 5 days every 3 weeks. Response was assessed by SWOG criteria. Results: No patient had a confirmed partial response. Fatigue/lethargy, myalgia, and neutropenia were the main toxicities, but no febrile neutropenia or grade 4 non-hematological toxicity occurred. Five patients had PET scans pre-treatment and on days 8-12 of cycle 1. No patient had a reduction in tumor metabolism. Serial biopsy in one patient showed alterations in polyamines consistent with SAMDC inhibition. Conclusions: At the present dose and schedule of administration, SAM486A does not have significant therapeutic potential in patients with metastatic melanoma.

Relevance:

30.00%

Publisher:

Abstract:

The use of gene guns to ballistically deliver DNA-vaccine-coated gold micro-particles to skin can potentially damage targeted cells, thereby influencing transfection efficiencies. In this paper, we assess cell death in the viable epidermis by non-invasive near-infrared two-photon microscopy following micro-particle bombardment of murine skin. We show that the ballistic delivery of micro-particles to the viable epidermis can result in localised cell death. Furthermore, experimental results show that the degree of cell death is dependent on the number of micro-particles delivered per unit of tissue surface area. Micro-particle densities of 0.16 ± 0.27 (mean ± S.D.), 1.35 ± 0.285, and 2.72 ± 0.47 per 1000 µm² resulted in cell death percentages of 3.96 ± 5.22, 45.91 ± 10.89, and 90.52 ± 12.28, respectively. These results suggest that optimizing transfection by genes administered with gene guns is, among other effects, a compromise between micro-particle payload and cell death.

Relevance:

30.00%

Publisher:

Abstract:

Multiresolution (or multi-scale) techniques make it possible for Web-based GIS applications to access large datasets. The performance of such systems relies on data transmission over the network and on multiresolution query processing. The latter has so far received little research attention in the literature, and the existing methods are not capable of processing large datasets. In this paper, we aim to improve multiresolution query processing in an online environment. A cost model for such queries is proposed first, followed by three strategies for its optimization. Significant theoretical improvement can be observed when comparing against available methods. Application of these strategies is also discussed, and similar performance enhancement can be expected if they are implemented in online GIS applications.
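
The abstract does not state the cost model itself; purely as a rough sketch of what a transmission-plus-processing cost model for a multiresolution query could look like, every name and parameter below is an assumption, not taken from the paper:

```python
# Hypothetical cost model for a multiresolution GIS query: total cost is
# server-side selection cost plus network transmission cost, both of which
# shrink as the requested level of detail gets coarser. All parameters are
# illustrative assumptions, not from the paper.

def query_cost(n_objects: int, level: int, max_level: int,
               select_cost: float = 0.01, bytes_per_vertex: int = 16,
               bandwidth: float = 1e6) -> float:
    """Estimated cost (seconds) of fetching n_objects at a given resolution level."""
    # Assume each coarser level halves the vertex count per object.
    detail = 0.5 ** (max_level - level)
    vertices = n_objects * 100 * detail      # assume 100 vertices/object at full detail
    processing = n_objects * select_cost     # index lookup + filtering per object
    transmission = vertices * bytes_per_vertex / bandwidth
    return processing + transmission

# A coarse overview query is far cheaper than a full-detail one:
print(query_cost(10_000, level=2, max_level=8))
print(query_cost(10_000, level=8, max_level=8))
```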

Relevance:

30.00%

Publisher:

Abstract:

There is currently considerable interest in developing general non-linear density models based on latent, or hidden, variables. Such models have the ability to discover the presence of a relatively small number of underlying 'causes' which, acting in combination, give rise to the apparent complexity of the observed data set. Unfortunately, training such models generally requires large computational effort. In this paper we introduce a novel latent variable algorithm which retains the general non-linear capabilities of previous models but which uses a training procedure based on the EM algorithm. We demonstrate the performance of the model on a toy problem and on data from flow diagnostics for a multi-phase oil pipeline.
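
The abstract names EM but gives no update equations; the following minimal sketch shows the E-step/M-step structure on the simplest latent variable density model, a Gaussian mixture, which is an illustrative stand-in rather than the paper's non-linear model:

```python
import numpy as np

# Minimal EM for a 1-D Gaussian mixture, illustrating the E-step/M-step
# structure of training a latent variable density model.

def em_gmm(x, k=2, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, k)                     # component means
    var = np.full(k, x.var())                 # component variances
    pi = np.full(k, 1.0 / k)                  # mixing proportions
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        d = (x[:, None] - mu) ** 2
        r = pi * np.exp(-0.5 * d / var) / np.sqrt(2 * np.pi * var)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / len(x)
    return mu, var, pi

x = np.concatenate([np.random.normal(-2, 1, 500), np.random.normal(3, 0.5, 500)])
print(em_gmm(x))
```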

Relevance:

30.00%

Publisher:

Abstract:

Physical distribution plays an important role in contemporary logistics management. Both customer satisfaction and company competitiveness can be enhanced if the distribution problem is solved optimally. The multi-depot vehicle routing problem (MDVRP) is a practical logistics distribution problem consisting of three critical issues: customer assignment, customer routing, and vehicle sequencing. According to the literature, existing solution approaches for the MDVRP are not satisfactory because unrealistic assumptions were made about the first sub-problem of the MDVRP, i.e., the customer assignment problem. To refine the approaches, the focus of this paper is confined to this sub-problem only. The paper formulates the customer assignment problem as a minimax-type integer linear programming model with the objective of minimizing the cycle time of the depots, where setup times are explicitly considered. Since the model is proven to be NP-complete, a genetic algorithm is developed for solving the problem. The efficiency and effectiveness of the genetic algorithm are illustrated by a numerical example.
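
As a rough sketch of the kind of genetic algorithm described, under assumed data and operator choices (the encoding, selection, crossover, and mutation below are generic, not the paper's):

```python
import random

# Hypothetical GA for minimax customer assignment: each gene assigns a
# customer to a depot; fitness is the maximum depot cycle time (setup plus
# service times), which the GA minimizes. All data are illustrative.

random.seed(1)
N_CUSTOMERS, N_DEPOTS = 20, 3
service = [random.uniform(1, 5) for _ in range(N_CUSTOMERS)]  # per-customer service time
setup = [random.uniform(0.5, 2) for _ in range(N_DEPOTS)]     # per-depot setup time

def cycle_time(chrom):
    """Minimax objective: the longest cycle time over all depots."""
    load = [setup[d] for d in range(N_DEPOTS)]
    for cust, depot in enumerate(chrom):
        load[depot] += service[cust]
    return max(load)

def evolve(pop_size=40, gens=200, pmut=0.1):
    pop = [[random.randrange(N_DEPOTS) for _ in range(N_CUSTOMERS)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cycle_time)
        parents = pop[:pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_CUSTOMERS)  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < pmut:              # reassignment mutation
                child[random.randrange(N_CUSTOMERS)] = random.randrange(N_DEPOTS)
            children.append(child)
        pop = parents + children
    return min(pop, key=cycle_time)

best = evolve()
print(cycle_time(best))
```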

Relevance:

30.00%

Publisher:

Abstract:

This paper develops and applies an integrated multiple criteria decision making approach to optimize the facility location-allocation problem in the contemporary customer-driven supply chain. Unlike traditional optimization techniques, the proposed approach, combining the analytic hierarchy process (AHP) and the goal programming (GP) model, considers both quantitative and qualitative factors, and aims at maximizing the benefits to both the deliverer and the customers. In the integrated approach, the AHP is used first to determine the relative importance weightings, or priorities, of alternative locations with respect to both deliverer-oriented and customer-oriented criteria. Then the GP model, incorporating the constraints of system, resource, and AHP priority, is formulated to select the best locations for setting up the warehouses without exceeding the limited available resources. A real case study is used to demonstrate how the integrated approach can be applied to the facility location-allocation problem, and it is shown that the integrated approach outperforms the traditional cost-based approach.
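
As an illustration of the AHP half of the approach, a minimal sketch computing priority weights from a pairwise comparison matrix by the principal-eigenvector method (the matrix is invented for illustration; such priorities would then enter the GP model as goal-constraint coefficients):

```python
import numpy as np

# Illustrative AHP priority computation: derive location weights from a
# pairwise comparison matrix via its principal eigenvector, then check
# consistency. The 3x3 matrix below is a made-up example, not case data.

A = np.array([[1.0, 3.0, 5.0],   # location 1 compared with locations 1, 2, 3
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # AHP priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)  # consistency index; small CI = consistent judgments
print("priorities:", w, "CI:", ci)
```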

Relevance:

30.00%

Publisher:

Abstract:

Computer-Based Learning systems of one sort or another have been in existence for almost 20 years, but they have yet to achieve real credibility within Commerce, Industry or Education. A variety of reasons could be postulated for this, typically: cost, complexity, inefficiency, inflexibility, and tedium. Obviously different systems deserve different levels and types of criticism, but it still remains true that Computer-Based Learning (CBL) is falling significantly short of its potential. Experience of a small but highly successful CBL system within a large, geographically distributed industry (the National Coal Board) prompted an investigation into currently available packages, the original intention being to purchase the most suitable software and run it on existing computer hardware, alongside existing software systems. It became apparent that none of the available CBL packages were suitable, and a decision was taken to develop an in-house Computer-Assisted Instruction system according to the following criteria: cheap to run; easy to author course material; easy to use; requiring no computing knowledge to use (as either an author or student); efficient in the use of computer resources; and offering a comprehensive range of facilities at all levels. This thesis describes the initial investigation, the resultant observations, and the design, development and implementation of the SCHOOL system. One of the principal characteristics of SCHOOL is that it uses a hierarchical database structure for the storage of course material, thereby inherently providing a great deal of the power, flexibility and efficiency originally required. Trials using the SCHOOL system on IBM 303X series equipment are also detailed, along with proposed and current development work on what is essentially an operational CBL system within a large-scale industrial environment.
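
The abstract mentions the hierarchical database but no schema; purely as a hypothetical illustration of hierarchical course-material storage (none of these names or the course/lesson/frame layering come from SCHOOL itself):

```python
from dataclasses import dataclass, field

# Hypothetical hierarchical course-material store: courses contain lessons,
# lessons contain frames (pages of material). SCHOOL's actual record
# layout is not described in the abstract.

@dataclass
class Node:
    name: str
    content: str = ""
    children: list["Node"] = field(default_factory=list)

    def find(self, path: list[str]) -> "Node | None":
        """Walk the hierarchy by name, one level per path element."""
        if not path:
            return self
        for child in self.children:
            if child.name == path[0]:
                return child.find(path[1:])
        return None

course = Node("Mining Safety", children=[
    Node("Lesson 1", children=[Node("Frame 1", "Introduction text...")]),
])
print(course.find(["Lesson 1", "Frame 1"]).content)
```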

Relevance:

30.00%

Publisher:

Abstract:

This research involves a study of the questions "what is considered safe", how safety levels are defined or decided, and according to whom. Tolerable or acceptable risk questions raise various issues: about the values and assumptions inherent in such levels; about decision-making frameworks at the highest level of policy making as well as on the individual level; and about the suitability and competency of decision-makers to decide and to communicate their decisions. The wide-ranging topics, covering the philosophical and practical concerns examined in the literature review, reveal the multi-disciplinary scope of this research. To support this theoretical study, empirical research was undertaken at the European Space Research and Technology Centre (ESTEC) of the European Space Agency (ESA). ESTEC is a large, multi-nationality, high-technology organisation which presented an ideal case study for exploring how decisions about safety are made, from a personal as well as an organisational perspective. A qualitative methodology was employed to gather, analyse and report the findings of this research. Significant findings reveal how experts perceive risks, and the prevalence of informal decision-making processes, partly due to the inadequacy of formal methods for deciding risk tolerability. In the field of occupational health and safety, this research has highlighted the importance of, and need for, criteria to decide whether a risk is great enough to warrant attention in setting standards and priorities for risk control and resources. From a wider perspective, and with the recognition that risk is an inherent part of life, the establishment of risk tolerability levels can be viewed as cornerstones indicating our progress, expectations and values, of life and work, in an increasingly litigious, knowledgeable and global society.

Relevance:

30.00%

Publisher:

Abstract:

This paper explores the use of the optimization procedures in SAS/OR software with application to the ordered weighted averaging (OWA) operators of decision-making units (DMUs). OWA, originally introduced by Yager (IEEE Trans Syst Man Cybern 18(1):183-190, 1988), has gained much interest among researchers, and many applications have been proposed in areas such as decision making, expert systems, data mining, approximate reasoning, fuzzy systems and control. SAS, on the other hand, is powerful software capable of running various optimization tools, such as linear and non-linear programming with all types of constraints. To facilitate the use of the OWA operator by SAS users, a code was implemented. The SAS macro developed in this paper selects the criteria and alternatives from a SAS dataset and calculates a set of OWA weights. An example is given to illustrate the features of SAS/OWA software.
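
The SAS macro itself is not reproduced in the abstract; as an illustration of what an OWA aggregation computes, a small Python sketch using Yager's quantifier-based weights (the choice of quantifier is an assumption, not necessarily the paper's scheme):

```python
import numpy as np

# Illustrative OWA aggregation (not the paper's SAS macro): weights come
# from a regular increasing monotone quantifier Q(r) = r**alpha and are
# applied to the criteria values sorted in descending order.

def owa_weights(n: int, alpha: float = 2.0) -> np.ndarray:
    """Yager's quantifier-based weights: w_i = Q(i/n) - Q((i-1)/n)."""
    i = np.arange(1, n + 1)
    return (i / n) ** alpha - ((i - 1) / n) ** alpha

def owa(values, alpha: float = 2.0) -> float:
    v = np.sort(np.asarray(values, dtype=float))[::-1]  # descending order
    return float(v @ owa_weights(len(v), alpha))

scores = [0.7, 0.4, 0.9, 0.5]   # one alternative rated on four criteria
print(owa(scores, alpha=2.0))    # alpha > 1 gives more weight to the lower scores
```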

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses the problem of obtaining detailed 3D reconstructions of human faces in real time and with inexpensive hardware. We present an algorithm based on a monocular multi-spectral photometric-stereo setup. Such a system is known to capture highly detailed deforming 3D surfaces at high frame rates without requiring expensive hardware or a synchronized light stage. However, the main challenge of such a setup is the calibration stage, which depends on the light setup and on how the lights interact with the specific material being captured, in this case human faces. For this purpose we develop a self-calibration technique in which the person being captured is asked to perform a rigid motion in front of the camera while maintaining a neutral expression. Rigidity constraints are then used to compute the head's motion with a structure-from-motion algorithm. Once the motion is obtained, a multi-view stereo algorithm reconstructs a coarse 3D model of the face. This coarse model is then used to estimate the lighting parameters with a stratified approach: in the first step we use a RANSAC search to identify purely diffuse points on the face and to simultaneously estimate the diffuse reflectance model; in the second step we apply non-linear optimization to fit a non-Lambertian reflectance model to the outliers of the previous step. The calibration procedure is validated with synthetic and real data.
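
As an illustration of the first, RANSAC-based step, a sketch fitting a Lambertian model and separating diffuse inliers from non-Lambertian outliers (the threshold, sample size, and parameterisation are generic assumptions, not the paper's exact formulation):

```python
import numpy as np

# Illustrative RANSAC fit of a Lambertian (diffuse) model, intensity = n . l:
# sample minimal sets of surface points, solve for the light vector, and
# keep the model with the most inliers.

rng = np.random.default_rng(0)

def fit_light(normals, intensities):
    """Least-squares light vector l with model intensity = normals @ l."""
    l, *_ = np.linalg.lstsq(normals, intensities, rcond=None)
    return l

def ransac_diffuse(normals, intensities, iters=500, thresh=0.05):
    best_inliers = np.zeros(len(normals), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(normals), 3, replace=False)  # minimal sample
        l = fit_light(normals[idx], intensities[idx])
        inliers = np.abs(normals @ l - intensities) < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on all inliers: these are the 'purely diffuse' points; the
    # outliers would go on to the non-Lambertian fitting step.
    return fit_light(normals[best_inliers], intensities[best_inliers]), best_inliers

n = rng.normal(size=(200, 3)); n /= np.linalg.norm(n, axis=1, keepdims=True)
I = n @ np.array([0.2, 0.3, 0.9]) + rng.normal(0, 0.01, 200)
I[:40] += 0.5 * rng.random(40)   # specular-like outliers
l_est, inl = ransac_diffuse(n, I)
print(l_est, inl.sum())
```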

Relevance:

30.00%

Publisher:

Abstract:

The non-linear programming algorithms for the minimum weight design of structural frames are presented in this thesis. The first, which is applied to rigidly jointed and pin jointed plane frames subject to deflexion constraints, consists of a search in a feasible design space. Successive trial designs are developed so that the feasibility and the optimality of the designs are improved simultaneously. It is found that this method is restricted to the design of structures with few unknown variables. The second non-linear programming algorithm is presented in a general form. This consists of two types of search, one improving feasibility and the other optimality. The method speeds up the 'feasible direction' approach by obtaining a constant weight direction vector that is influenced by dominating constraints. For pin jointed plane and space frames this method is used to obtain a 'minimum weight' design which satisfies restrictions on stresses and deflexions. The matrix force method enables the design requirements to be expressed in a general form, and the design problem is automatically formulated within the computer. Examples are given to explain the method, and the design criteria are extended to include member buckling. Fundamental theorems are proposed and proved to confirm that structures are inter-related. These theorems are applicable to linear elastic structures and facilitate the prediction of the behaviour of one structure from the results of analysing another, more general, or related structure. It becomes possible to evaluate the significance of each member in the behaviour of a structure, and the problem of minimum weight design is extended to include shape. A method is proposed to design structures of optimum shape with stress and deflexion limitations. Finally a detailed investigation is carried out into the design of structures to study the factors that influence their shape.
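
In generic notation (an assumption, since the thesis's own notation is not given in the abstract), the minimum weight design problem has the form:

```latex
\min_{a}\; W(a) = \sum_{i=1}^{m} \rho_i \, \ell_i \, a_i
\quad \text{s.t.} \quad
|\sigma_i(a)| \le \sigma_{\text{allow}},\ i = 1,\dots,m,
\qquad
|\delta_j(a)| \le \delta_{\text{allow}},\ j = 1,\dots,p,
```

where the a_i are member cross-sectional areas, rho_i and l_i member densities and lengths, sigma_i member stresses, and delta_j nodal deflexions; since the stresses and deflexions depend non-linearly on the areas, the problem is a non-linear program.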

Relevance:

30.00%

Publisher:

Abstract:

It is shown that any multicriteria problem can be represented by a hierarchical system. Separate properties of the object are evaluated at the lower level of the system using a criteria vector, and a composition mechanism is used to evaluate the object as a whole at the upper level. The paper proposes a method for solving complex multicriteria problems of evaluation and optimization. It is based on nested scalar convolutions of vector-valued criteria and allows simple structural and parametrical synthesis of multicriteria hierarchical systems.
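
As a minimal sketch of nested scalar convolutions, using weighted sums as the convolution at both levels (the two-level structure and all weights are invented for illustration):

```python
# Illustrative nested scalar convolution: lower-level criteria are folded
# into group scores, and the group scores are folded into a single
# upper-level evaluation of the object as a whole.

def convolve(values, weights):
    """Scalar convolution of a criteria vector (here: a weighted sum)."""
    return sum(v * w for v, w in zip(values, weights))

# Lower level: two groups of criteria, each with its own weights.
cost_criteria = [0.8, 0.6]           # e.g. purchase price, running cost (normalised)
quality_criteria = [0.9, 0.7, 0.5]   # e.g. reliability, accuracy, comfort

cost_score = convolve(cost_criteria, [0.6, 0.4])
quality_score = convolve(quality_criteria, [0.5, 0.3, 0.2])

# Upper level: convolve the group scores into one overall evaluation.
overall = convolve([cost_score, quality_score], [0.45, 0.55])
print(cost_score, quality_score, overall)
```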

Relevance:

30.00%

Publisher:

Abstract:

The paper considers a vector discrete optimization problem with linear-fractional criterion functions on a feasible set that has the combinatorial properties of combinations. Structural properties of the feasible solution domain and of the Pareto-optimal (efficient), weakly efficient and strictly efficient solution sets are examined. A relation between vector optimization problems on a combinatorial set of combinations and on a continuous feasible set is established. One possible approach is proposed for solving a multicriteria combinatorial problem with linear-fractional criterion functions on a set of combinations.
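
In generic notation (the abstract gives no formulas, so the exact form is an assumption), such a problem can be written as:

```latex
\max_{x \in X} \; f_k(x) = \frac{c_k^{\top} x + c_{k0}}{d_k^{\top} x + d_{k0}},
\qquad k = 1,\dots,K,
```

where X is the set of incidence vectors of combinations (subsets of fixed size) and each criterion f_k is a ratio of affine functions, with the denominator assumed non-zero on X.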

Relevance:

30.00%

Publisher:

Abstract:

The problems and methods of adaptive control and multi-agent processing of information in global telecommunication and computer networks (TCN) are discussed. Criteria for the controllability and communication ability (routing ability) of dataflows are described. A multi-agent model for the exchange of divided information resources in global TCN is suggested. Peculiarities of adaptive and intelligent control of dataflows under uncertain conditions and network collisions are analyzed.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a novel approach to the computation of primitive geometrical structures, where no prior knowledge about the visual scene is available and a high level of noise is expected. We base our work on the grouping principles of proximity and similarity, applied to points and to preliminary models. The former is realized using Minimum Spanning Trees (MST), to which we apply stable alignment and goodness-of-fit criteria. For the latter, we use spectral clustering of preliminary models. The algorithm generalizes to various model fitting settings without tuning of run parameters. Experiments demonstrate a significant improvement in the localization accuracy of models in plane, homography and motion segmentation examples. Unlike most other algorithms in the field, the efficiency of the algorithm does not depend on fine tuning of run parameters.
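
As an illustration of MST-based proximity grouping, a sketch that builds the MST of a point set and cuts atypically long edges (this simple cut rule is a stand-in for the paper's stable-alignment and goodness-of-fit criteria, which the abstract does not specify):

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
from scipy.spatial.distance import pdist, squareform

# Illustrative proximity grouping with a Minimum Spanning Tree: compute the
# MST of the point set, cut edges much longer than the typical edge, and
# take the connected components of the pruned tree as groups.

def mst_groups(points, factor=3.0):
    d = squareform(pdist(points))
    mst = minimum_spanning_tree(d).toarray()
    edges = mst[mst > 0]
    mst[mst > edges.mean() * factor] = 0    # cut unusually long edges
    n_comp, labels = connected_components(mst + mst.T, directed=False)
    return n_comp, labels

# Two well-separated blobs come out as two groups:
pts = np.vstack([np.random.randn(30, 2), np.random.randn(30, 2) + 8])
print(mst_groups(pts))
```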