993 results for Pareto Analysis


Relevance:

60.00%

Abstract:

The implementation of pavement management tends to ignore road safety, focusing mainly on infrastructure condition. Safety management as part of pavement management should consider various means of reducing the frequency of vehicle crashes by allocating corrective measures that mitigate accident exposure as well as reduce accident severity and likelihood. However, a lack of accident records and of data on crash-contributing factors commonly impedes incorporating safety into pavement management. This paper presents a case study of the initial development of a pavement management system under such data limitations for 3000 km of Tanzania’s national roads. A performance-based optimization uses indices for safety and surface condition to allocate corrective measures. A modified Pareto analysis, capable of accounting for annual performance and of balancing resources to achieve both good surface condition and high levels of safety, was applied. Tradeoff analysis for the case study indicated that 30% relevance should be assigned to condition and 70% to road safety. Safety and condition deficiencies were corrected within 5 years, with the majority of improvements dedicated to surface treatments and some to geometric corrections. Large investments in correcting geometric issues appeared in years two and three when more funding was made available.
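
As a rough illustration of the trade-off described above, the sketch below ranks road sections by a deficiency score that weights condition at 30% and safety at 70%. The section data, index names and 0-100 scale are assumptions for illustration, not the paper's data or method as implemented.

```python
# A minimal sketch (not the authors' code) of the weighted trade-off:
# 30% relevance on surface condition, 70% on safety.
sections = [
    # (section_id, condition_index, safety_index); higher = better, 0-100 scale
    ("A1", 45.0, 30.0),
    ("A2", 70.0, 55.0),
    ("B7", 60.0, 20.0),
]

W_CONDITION, W_SAFETY = 0.3, 0.7  # relevance weights from the trade-off analysis

def deficiency_score(condition, safety):
    """Combined deficiency: larger means the section needs treatment sooner."""
    return W_CONDITION * (100 - condition) + W_SAFETY * (100 - safety)

# Rank sections so the annual budget is allocated to the worst first.
ranked = sorted(sections, key=lambda s: deficiency_score(s[1], s[2]), reverse=True)
for sec_id, cond, safe in ranked:
    print(f"{sec_id}: deficiency = {deficiency_score(cond, safe):.1f}")
```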

Relevance:

60.00%

Abstract:

The influence of process variables (pea starch, guar gum and glycerol) on the viscosity (V), solubility (SOL), moisture content (MC), transparency (TR), Hunter parameters (L, a, and b), total color difference (ΔE), yellowness index (YI), and whiteness index (WI) of pea starch based edible films was studied using a three-factor, three-level Box–Behnken response surface design. The individual linear effects of pea starch, guar gum and glycerol were significant (p < 0.05) for all responses. However, the Hunter a value was significantly (p < 0.05) affected only by pea starch and guar gum, through positive and negative linear terms, respectively. The starch × glycerol interaction also had a significant (p < 0.05) effect on the TR of the edible films, and the starch × guar gum interaction had a significant impact on the b and YI values. The quadratic regression coefficient of pea starch showed a significant effect (p < 0.05) on V, MC, L, b, ΔE, YI, and WI; that of glycerol on ΔE and WI; and that of guar gum on ΔE and SOL. The results were analyzed by Pareto analysis of variance (ANOVA), and second-order polynomial models developed from the experimental design showed a reliable and satisfactory fit to the corresponding experimental data, with high coefficient of determination (R2) values (>0.93). Three-dimensional response surface plots were generated to investigate the relationship between the process variables and the responses. The optimized conditions, with the goal of maximizing TR and minimizing SOL, YI and MC, were 2.5 g pea starch, 25% glycerol and 0.3 g guar gum. The results revealed that pea starch/guar gum edible films with appropriate physical and optical characteristics can be effectively produced and successfully applied in the food packaging industry.
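
For readers unfamiliar with the approach, here is a minimal sketch of fitting a second-order polynomial model to a three-factor Box–Behnken design by ordinary least squares. The 15-run coded design is the standard one, but the response values are synthetic stand-ins, not the film measurements reported above.

```python
# A minimal sketch, not the study's code: second-order response-surface fit.
import numpy as np

# Coded levels (-1, 0, +1) for starch (x1), guar gum (x2), glycerol (x3):
# the 12 Box-Behnken edge midpoints plus 3 center runs.
X = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
], dtype=float)

rng = np.random.default_rng(0)
true_response = lambda x: 5 + 2*x[0] - x[1] + 0.5*x[2] + 1.5*x[0]*x[2] - 0.8*x[0]**2
y = np.array([true_response(x) for x in X]) + rng.normal(0, 0.1, len(X))

def design_matrix(X):
    """Intercept, linear, two-way interaction, and quadratic terms."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
yhat = design_matrix(X) @ beta
r2 = 1 - np.sum((y - yhat)**2) / np.sum((y - y.mean())**2)
print("coefficients:", np.round(beta, 3), " R^2 =", round(r2, 3))
```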

Relevance:

60.00%

Abstract:

This study presents methodologies, adopted as strategies, based on the Lean philosophy, which seek to reduce waste together with continuous improvement. The methodology was taken as a management model for the company Casa de Banquetes Álvaro O. Castañeda, specifically for its production process. Through an initial value-stream analysis and a Lean opportunity-assessment analysis, a value-stream plan was established, with phases covering objectives, targets, execution times, responsible parties, the process affected by the continuous improvement, and a follow-up period, with the ultimate goal of increasing the added value of customer service.

Relevance:

60.00%

Abstract:

Developing efficient methodologies for oil recovery is extremely important. Within the range of enhanced oil recovery (EOR) methods, the injection of polymer solutions is effective in controlling the mobility of the displacing fluid. The method consists of adding polymers to the injection water to increase its viscosity, so that more of the injected water diffuses into the porous medium, increasing the sweep efficiency in the reservoir. This work studies, by numerical simulation, the injection of polymer solution into a homogeneous, semisynthetic reservoir with characteristics similar to reservoirs of the Brazilian Northeast; the numerical simulations were performed using the STARS thermal simulator from CMG (Computer Modelling Group). The study aimed to analyze the influence of several parameters on reservoir oil production, with cumulative production as the response. Simulations were performed to analyze the influence of water injection, polymer-solution injection, and alternating injection of water and polymer-solution banks, comparing the results for each simulated condition. The parameters analyzed were oil viscosity, percentage of injected polymer, polymer viscosity, and water injection flow rate. The evaluation of their influence consisted of a complete factorial experimental design followed by a Pareto analysis, for the purpose of identifying which variables were most influential on the response, the cumulative oil production. It was found that all variables significantly influenced oil recovery, and that continuous injection of polymer solution is more efficient for cumulative production than oil recovery by continuous water injection. Primary recovery showed low levels of oil production; water injection significantly improves oil production in the reservoir, but the injection of polymer solution emerges as a methodology that further increases oil production, extending the life of the well and potentially reducing the amount of water produced.
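
The design-and-Pareto step can be sketched as follows: run a two-level full factorial over the four studied variables and rank the main effects on cumulative production by absolute size. The factor names follow the abstract, but the response function is a placeholder standing in for the STARS reservoir simulations.

```python
# A minimal sketch, not the thesis code: full factorial + Pareto ranking of effects.
from itertools import product

factors = ["oil_viscosity", "polymer_pct", "polymer_viscosity", "water_rate"]

def simulate(levels):
    # Placeholder for one STARS run; returns a fake cumulative production.
    ov, pp, pv, wr = levels
    return 100 - 20*ov + 15*pp + 10*pv + 5*wr + 4*pp*pv

runs = [(lv, simulate(lv)) for lv in product((-1, 1), repeat=4)]

# Main effect of factor i: mean response at +1 minus mean response at -1.
effects = {}
for i, name in enumerate(factors):
    hi = [y for lv, y in runs if lv[i] == 1]
    lo = [y for lv, y in runs if lv[i] == -1]
    effects[name] = sum(hi) / len(hi) - sum(lo) / len(lo)

# Pareto ordering: largest absolute effect first.
for name, eff in sorted(effects.items(), key=lambda kv: abs(kv[1]), reverse=True):
    print(f"{name}: effect = {eff:+.1f}")
```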

Relevance:

60.00%

Abstract:

In today’s software industry, IT companies often have a hard time estimating changed requirements, which negatively affects client confidence and is one of the main reasons this needs to improve. The goal of this study was to identify the most common problems regarding this issue in IT companies that work with agile software development. By analysing one IT company through a SWOT analysis and a Pareto analysis, the most common problems were ascertained. The SWOT analysis was built from interviews with selected employees to gain a better understanding of the problems the company faces. The Pareto analysis was based on a survey sent out to many different employees in order to prioritise the problems; distributing it widely provided more objective input. The study showed that many different problems needed attention. The most important were that communication with the client regarding requirements needed to improve, better communication between internal departments needed to be established, a method for quickly adapting to and estimating changed requirements needed to be implemented, and a method was needed for deciding which key employees should attend the planning of the program backlog. These problems were then studied through interviews with other IT companies and through a literature study. The conclusions drawn were that the client needs to be involved and kept updated throughout the whole project. Changed requirements need to be constantly monitored, communicated and mediated. High standards need to be set early with the client in order to obtain as clear a picture of the requirements as possible. Many different parties need to attend the planning of the program backlog before the project starts. The client needs to be aware that changed requirements will arise and that the first estimate therefore may not be final. As long as the client is kept up to date and participates throughout the whole project, and problems are detected and mediated early, changed requirements should not be a huge problem. That is, after all, the purpose of being agile.
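
The Pareto analysis used to prioritise the survey results boils down to tallying how often each problem was flagged and computing cumulative percentages to find the "vital few". A minimal sketch follows; the problem labels and counts are illustrative, not the study's survey data.

```python
# A minimal sketch of a Pareto analysis over survey tallies (illustrative data).
counts = {
    "client requirement communication": 34,
    "internal cross-department communication": 27,
    "estimating changed requirements": 21,
    "backlog planning attendance": 12,
    "other": 6,
}

total = sum(counts.values())
cumulative = 0.0
# Sort descending and accumulate: the first few rows that cross ~80% are
# the problems worth tackling first.
for problem, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += 100.0 * n / total
    print(f"{problem:45s} {n:3d}  cum = {cumulative:5.1f}%")
```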

Relevance:

60.00%

Abstract:

The individual and interactive impacts of guar gum and glycerol on the characteristics of pea starch-based edible films were examined using a three-factor, three-level Box–Behnken response surface design. The results showed that density and elongation at break were significantly (p < 0.05) affected only by pea starch and guar gum, in a positive linear fashion. The quadratic regression coefficient of pea starch showed a significant effect (p < 0.05) on thickness, density, puncture force, water vapour permeability, and tensile strength, while tensile strength and Young's modulus were affected by the quadratic regression coefficients of glycerol and guar gum, respectively. The results were analysed using Pareto analysis of variance (ANOVA), and the predictive equations developed for each response variable presented a reliable and satisfactory fit, with high coefficient of determination (R2) values (≥ 0.96). The optimized conditions, with the goal of maximizing mechanical properties and minimizing water vapour permeability, were 2.5 g pea starch, 0.3 g guar gum and 25 % (w/w) glycerol (based on the dry film matter) in 100 ml of distilled water. Generally, changes in the concentrations of pea starch, guar gum and glycerol resulted in changes in the functional properties of the film.
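
Once the quadratic model is fitted, optimized conditions like those quoted above can be read off by searching the coded design space. A minimal sketch, with hypothetical coefficients rather than the paper's fitted model:

```python
def tensile_strength(x1, x2, x3):
    # Hypothetical fitted quadratic in coded starch (x1), guar gum (x2),
    # glycerol (x3) levels; the real coefficients come from the regression.
    return 8 + 1.2*x1 + 0.6*x2 - 0.9*x3 - 0.7*x1**2 - 0.3*x3**2

grid = [round(-1 + i * 0.05, 2) for i in range(41)]  # coded levels -1 .. +1
best = max(((tensile_strength(a, b, c), (a, b, c))
            for a in grid for b in grid for c in grid), key=lambda t: t[0])
print(f"max predicted TS = {best[0]:.2f} at coded levels {best[1]}")
```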

Relevance:

40.00%

Abstract:

Over the past ten years, a variety of microRNA target prediction methods has been developed, and many of the methods are constantly improved and adapted to recent insights into miRNA-mRNA interactions. In a typical scenario, different methods return different rankings of putative targets, even if the ranking is reduced to selected mRNAs that are related to a specific disease or cell type. For experimental validation it is then difficult to decide in which order to process the predicted miRNA-mRNA bindings, since each validation is a laborious task and therefore only a limited number of mRNAs can be analysed. We propose a new ranking scheme that combines ranked predictions from several methods and, unlike standard thresholding methods, utilises the concept of Pareto fronts as defined in multi-objective optimisation. In the present study, we attempt a proof of concept by applying the new ranking scheme to hsa-miR-21, hsa-miR-125b, and hsa-miR-373 with prediction scores supplied by PITA and RNAhybrid. The scores are interpreted as a two-objective optimisation problem, and the elements of the Pareto front are ranked by the STarMir score, with a subsequent re-calculation of the Pareto front after removal of the top-ranked mRNA from the basic set of prediction scores. The method is evaluated on validated targets of the three miRNAs, and the ranking is compared to scores from DIANA-microT and TargetScan. We observed that the new ranking method performs well and consistently, and the first validated targets are elements of Pareto fronts at a relatively early stage of the recurrent procedure, which encourages further research towards a higher-dimensional analysis of Pareto fronts. (C) 2010 Elsevier Ltd. All rights reserved.
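
The core of the scheme, extract the Pareto front of a two-objective score set, rank the front by a third score, remove the top element and recompute, can be sketched in a few lines. The scores below are dummies; in the paper they come from PITA, RNAhybrid and STarMir, and the assumption here that lower scores are better is illustrative.

```python
# A minimal sketch of the proposed ranking scheme (dummy data, not the paper's).
targets = {  # mRNA -> (score_a, score_b, tiebreak); lower is better for a and b
    "m1": (-20.1, -15.3, 0.9), "m2": (-18.4, -19.0, 0.7),
    "m3": (-25.0, -10.2, 0.4), "m4": (-12.6, -11.8, 0.8),
    "m5": (-22.3, -18.1, 0.6),
}

def pareto_front(items):
    """Non-dominated points: no other point is <= in both objectives and < in one."""
    front = []
    for name, (a, b, _) in items.items():
        dominated = any(a2 <= a and b2 <= b and (a2 < a or b2 < b)
                        for n2, (a2, b2, _) in items.items() if n2 != name)
        if not dominated:
            front.append(name)
    return front

remaining, ranking = dict(targets), []
while remaining:
    front = pareto_front(remaining)
    # Rank front members by the third score (standing in for STarMir), best first.
    front.sort(key=lambda n: remaining[n][2], reverse=True)
    top = front[0]
    ranking.append(top)
    del remaining[top]   # recompute the front without the top-ranked mRNA
print("ranking:", ranking)
```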

Relevance:

40.00%

Abstract:

This paper presents a technique for performing analog design synthesis at circuit level, providing feedback to the designer through exploration of the Pareto frontier. A modified simulated annealing algorithm, able to perform crossover with past anchor points when a local minimum is found, is used as the optimization algorithm in the initial synthesis procedure. After all specifications are met, the algorithm searches for the extreme points of the Pareto frontier in order to obtain a non-exhaustive exploration of the Pareto front. Finally, multi-objective particle swarm optimization is used to spread the results and to find a more accurate frontier. Piecewise linear functions are used as single-objective cost functions to produce a smooth and equal convergence of all measurements to the desired specifications during the composition of the aggregate objective function. To verify the presented technique, two circuits were designed: a Miller amplifier with 96 dB voltage gain, 15.48 MHz unity-gain frequency, a slew rate of 19.2 V/µs and a current supply of 385.15 µA; and a complementary folded cascode with 104.25 dB voltage gain, 18.15 MHz unity-gain frequency and a slew rate of 13.370 MV/µs. These circuits were synthesized using a 0.35 µm technology. The results show that the method provides a fast approach to good solutions using the modified SA, and further good Pareto front exploration through its connection to the particle swarm optimization algorithm.
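
The piecewise-linear cost idea can be sketched as follows: each specification contributes zero cost once met and a linear, normalised penalty while unmet, so all measurements converge at comparable rates when summed into the aggregate objective. The specs and measured values below are illustrative, not the paper's circuits.

```python
# A minimal sketch of piecewise-linear single-objective costs (illustrative specs).
def pw_linear_cost(value, spec, maximize=True, scale=1.0):
    """Zero when the spec is met; linear in the normalised violation otherwise."""
    gap = (spec - value) if maximize else (value - spec)
    return max(0.0, gap / scale)

specs = [  # (measurement, target, maximize?, normalisation scale)
    ("gain_db",  96.0,    True,  96.0),
    ("ugf_hz",   15.0e6,  True,  15.0e6),
    ("supply_a", 400e-6,  False, 400e-6),
]
measured = {"gain_db": 91.0, "ugf_hz": 16.2e6, "supply_a": 385.15e-6}

aggregate = sum(pw_linear_cost(measured[m], t, mx, s) for m, t, mx, s in specs)
print(f"aggregate cost = {aggregate:.4f}")  # 0 only when every spec is met
```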

Relevance:

30.00%

Abstract:

Index-flood related regional frequency analysis (RFA) procedures are used by hydrologists to estimate design quantiles of hydrological extreme events at data-sparse/ungauged locations in river basins. There is a dearth of attempts to establish which among those procedures is better for RFA in the L-moment framework. This paper evaluates the performance of the conventional index flood (CIF), the logarithmic index flood (LIF), and two variants of the population index flood (PIF) procedures in estimating flood quantiles for ungauged locations, by Monte Carlo simulation experiments and a case study on watersheds in Indiana in the U.S. To evaluate the PIF procedure, L-moment formulations are developed for implementing the procedure in situations where the regional frequency distribution (RFD) is the generalized logistic (GLO), generalized Pareto (GPA), generalized normal (GNO) or Pearson type III (PE3), as those formulations were previously unavailable. Results indicate that one of the variants of the PIF procedure, which utilizes the regional information on the first two L-moments, is more effective than the CIF and LIF procedures. The improvement in quantile estimation using this variant of the PIF procedure, as compared with the CIF procedure, is significant when the RFD is a generalized extreme value, GLO, GNO, or PE3 distribution, and marginal when it is GPA. (C) 2015 American Society of Civil Engineers.
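
The building blocks of such L-moment procedures are the sample L-moments, computed from probability-weighted moments of the ordered data. A minimal sketch, not the paper's implementation, with a made-up annual-maximum series:

```python
# Sample L-moments via probability-weighted moments (standard formulas).
import numpy as np

def sample_lmoments(data):
    x = np.sort(np.asarray(data, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0                        # L-location (the index flood at a gauged site)
    l2 = 2*b1 - b0                 # L-scale
    t  = l2 / l1                   # L-CV
    t3 = (6*b2 - 6*b1 + b0) / l2   # L-skewness
    return l1, l2, t, t3

flows = [412, 530, 298, 644, 377, 489, 701, 355, 560, 430]  # made-up AMS (m^3/s)
l1, l2, t, t3 = sample_lmoments(flows)
print(f"l1={l1:.1f}  l2={l2:.1f}  L-CV={t:.3f}  L-skew={t3:.3f}")
```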

Relevance:

30.00%

Abstract:

In decision making problems where we need to choose a particular decision or alternative from a set of possible choices, we often have preferences which determine whether we prefer one decision over another. When these preferences give us a complete ordering on the decisions, it is easy to choose the best decision, or one of the best. However, it often occurs that the preference relation is only a partial order, and there is no single best decision. In this thesis, we look at what happens when we have such a partial order over a set of decisions, in particular when we have multiple orderings on a set of decisions, and we present a framework for qualitative decision making. We look at the different natural notions of optimal decision that occur in this framework, which give us different optimality classes, and we examine the relationships between these classes. We then look in particular at a qualitative preference relation called Sorted-Pareto Dominance, an extension of Pareto Dominance, and give a semantics for this relation as one that is compatible with any order-preserving mapping of an ordinal preference scale to a numerical one. We apply Sorted-Pareto dominance in a soft constraints setting, solving problems in which the soft constraints associate qualitative preferences with decisions in a decision problem. We also examine the Sorted-Pareto dominance relation in the context of our qualitative decision making framework, looking at the relevant optimality classes for the Sorted-Pareto case, which gives us classes of decisions that are necessarily optimal, and classes of decisions that are optimal for some choice of mapping from an ordinal scale to a quantitative one. We provide some empirical analysis of Sorted-Pareto constraint problems and examine the optimality classes that result.
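
A minimal sketch of the dominance test, under the assumption (an encoding choice, not necessarily the thesis's) that smaller ordinal degrees are better: sort each decision's vector of preference degrees, then test componentwise dominance. Pareto dominance is the same test without the sort.

```python
def sorted_pareto_dominates(a, b):
    """True if a Sorted-Pareto dominates b (smaller ordinal degree = better)."""
    sa, sb = sorted(a), sorted(b)
    return (all(x <= y for x, y in zip(sa, sb))
            and any(x < y for x, y in zip(sa, sb)))

# Degrees assigned by three soft constraints to two decisions.
d1 = [2, 1, 2]
d2 = [3, 2, 1]
# d1 does not Pareto-dominate d2 (2 > 1 in the third component), but after
# sorting, (1, 2, 2) dominates (1, 2, 3), so d1 Sorted-Pareto dominates d2.
print(sorted_pareto_dominates(d1, d2))  # True
```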

Relevance:

30.00%

Abstract:

Forest fires can cause extensive damage to natural resources and properties. They can also destroy wildlife habitat, affect the forest ecosystem and threaten human lives. In this paper, extreme wildland fires are analysed using a point process model for extremes. A model based on a generalised Pareto distribution is used to model data on acres of wildland burnt by extreme fires in the US since 1825. A semi-parametric smoothing approach is adopted, with the maximum likelihood method used to estimate model parameters.

Relevance:

30.00%

Abstract:

Forest fires can cause extensive damage to natural resources and properties. They can also destroy wildlife habitat, affect the forest ecosystem and threaten human lives. In this paper, incidences of extreme wildland fires are modelled by a point process model which incorporates a time trend. A model based on a generalised Pareto distribution is used to model data on acres of wildland burnt by extreme fires in the US since 1825. A semi-parametric smoothing approach, which is very useful in exploratory analysis of changes in extremes, is illustrated, with the maximum likelihood method used to estimate model parameters.
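
For context, the basic ingredient shared by both papers above is a generalised Pareto fit to threshold excesses. The sketch below shows a plain peaks-over-threshold fit with scipy; it omits the point-process and time-trend structure, and the burnt-area series is synthetic stand-in data.

```python
# A minimal POT/GPD sketch (synthetic data, not the papers' fire records).
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
burnt_acres = rng.lognormal(mean=8.0, sigma=1.5, size=2000)  # fake fire sizes

u = np.quantile(burnt_acres, 0.95)           # high threshold
excesses = burnt_acres[burnt_acres > u] - u  # peaks-over-threshold exceedances

shape, loc, scale = genpareto.fit(excesses, floc=0)  # MLE, location fixed at 0
print(f"threshold={u:.0f}  shape(xi)={shape:.3f}  scale={scale:.1f}")

# A return level: the size exceeded on average once per 100 exceedances.
ret = u + genpareto.ppf(1 - 1/100, shape, loc=0, scale=scale)
print(f"1-in-100-exceedance level = {ret:.0f} acres")
```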

Relevance:

30.00%

Abstract:

Recently there has been increasing interest in the development of new methods that use Pareto optimality to deal with multi-objective criteria (for example, accuracy and architectural complexity). Once one has learned a model based on such a method, the problem is then how to compare it with the state of the art. In machine learning, algorithms are typically evaluated by comparing their performance on different data sets by means of statistical tests. Unfortunately, the standard tests used for this purpose are not able to jointly consider multiple performance measures. The aim of this paper is to resolve this issue by developing statistical procedures that are able to account for multiple competing measures at the same time. In particular, we develop two tests: a frequentist procedure based on the generalized likelihood-ratio test and a Bayesian procedure based on a multinomial-Dirichlet conjugate model. We further extend them by discovering conditional independences among measures in order to reduce the number of parameters of such models, since the number of studied cases in such comparisons is usually small. Real data from a comparison among general-purpose classifiers is used to show a practical application of our tests.
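
To make the Bayesian idea concrete, here is a deliberately simplified sketch, not the authors' actual test: each data set is binned into one of three joint outcomes when two algorithms are compared on two measures at once, the posterior over category probabilities is Dirichlet(prior + counts) by conjugacy, and a dominance probability is estimated by Monte Carlo. The counts and uniform prior are illustrative.

```python
# A minimal multinomial-Dirichlet sketch (illustrative counts, uniform prior).
import numpy as np

# Category counts over data sets: [A better on both, B better on both, mixed].
counts = np.array([11, 4, 7])
prior = np.ones(3)

rng = np.random.default_rng(0)
theta = rng.dirichlet(prior + counts, size=100_000)  # posterior samples
p_a_dominates = np.mean(theta[:, 0] > theta[:, 1])
print(f"P(theta_A > theta_B | data) = {p_a_dominates:.3f}")
```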

Relevance:

30.00%

Abstract:

This work is an assessment of the frequency of extreme values (EVs) of daily rainfall in the city of Sao Paulo, Brazil, over the period 1933-2005, based on the peaks-over-threshold (POT) and generalized Pareto distribution (GPD) approach. Usually, a GPD model is fitted to a sample of POT values selected with a constant threshold. However, in this work we use time-dependent thresholds, composed of relatively large p quantiles (for example, p of 0.97) of daily rainfall amounts computed from all available data. Samples of POT values were extracted with several values of p. Four different GPD models (GPD-1, GPD-2, GPD-3, and GPD-4) were fitted to each of these samples by the maximum likelihood (ML) method. The shape parameter was assumed constant for the four models, but time-varying covariates were incorporated into the scale parameter of GPD-2, GPD-3, and GPD-4, describing an annual cycle in GPD-2, a linear trend in GPD-3, and both an annual cycle and a linear trend in GPD-4. GPD-1, with constant scale and shape parameters, is the simplest model. For identification of the best model among the four we used the rescaled Akaike Information Criterion (AIC) with second-order bias correction. This criterion isolates GPD-3 as the best model, i.e. the one with a positive linear trend in the scale parameter. The slope of this trend is significant compared to the null hypothesis of no trend at about the 98% confidence level. The non-parametric Mann-Kendall test also showed the presence of a positive trend in the annual frequency of excesses over high thresholds, with a p-value of virtually zero. Therefore, there is strong evidence that high quantiles of daily rainfall in the city of Sao Paulo have been increasing in magnitude and frequency over time. For example, the 0.99 quantile of daily rainfall amount has increased by about 40 mm between 1933 and 2005. Copyright (C) 2008 Royal Meteorological Society
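
The GPD-3 idea, a GPD likelihood whose scale parameter carries a linear trend in time, can be sketched directly as a negative log-likelihood minimisation. The parameterisation sigma_t = exp(a + b*t) is one common choice that keeps the scale positive (an assumption here, not necessarily the paper's exact form), and the excess data are synthetic.

```python
# A minimal sketch of a GPD fit with a log-linear time trend in the scale.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 300)               # scaled time of each excess
sigma_true = np.exp(1.0 + 0.5 * t)
y = rng.pareto(5.0, size=300) * sigma_true   # fake threshold excesses

def nll(params):
    a, b, xi = params
    if abs(xi) < 1e-6:                       # skip the xi -> 0 limit for simplicity
        return np.inf
    sigma = np.exp(a + b * t)
    z = 1.0 + xi * y / sigma
    if np.any(z <= 0):                       # outside the GPD support
        return np.inf
    return np.sum(np.log(sigma)) + (1.0 + 1.0 / xi) * np.sum(np.log(z))

res = minimize(nll, x0=[0.5, 0.0, 0.1], method="Nelder-Mead")
a_hat, b_hat, xi_hat = res.x
print(f"trend in log-scale b = {b_hat:.3f}, shape xi = {xi_hat:.3f}")
```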

Relevance:

30.00%

Abstract:

In this paper, we present an algorithm for cluster analysis that integrates aspects of cluster ensembles and multi-objective clustering. The algorithm is based on a Pareto-based multi-objective genetic algorithm, with a special crossover operator, which uses clustering validation measures as objective functions. The proposed algorithm can deal with data sets presenting different types of clusters, without the need for expertise in cluster analysis. Its result is a concise set of partitions representing alternative trade-offs among the objective functions. We compare the results obtained with our algorithm, in the context of gene expression data sets, to those achieved with multi-objective clustering with automatic k-determination (MOCK), the algorithm most closely related to ours. (C) 2009 Elsevier B.V. All rights reserved.
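
The Pareto-based selection step in such a genetic algorithm typically splits the population into successive non-dominated fronts under the validation measures. A minimal sketch follows; the two objectives and their values are illustrative stand-ins for clustering validation indices, both to be minimised.

```python
# A minimal sketch of non-dominated front sorting (illustrative objective values).
def dominates(p, q):
    """p dominates q if p is no worse in every objective and better in one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def non_dominated_fronts(population):
    fronts, remaining = [], dict(population)
    while remaining:
        front = [k for k, v in remaining.items()
                 if not any(dominates(v2, v)
                            for k2, v2 in remaining.items() if k2 != k)]
        fronts.append(front)
        for k in front:
            del remaining[k]
    return fronts

# Candidate partitions -> (compactness, connectivity); lower is better.
population = {"P1": (0.30, 0.80), "P2": (0.25, 0.90), "P3": (0.40, 0.60),
              "P4": (0.35, 0.95), "P5": (0.28, 0.70)}
print(non_dominated_fronts(population))  # first front holds the trade-off set
```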