64 results for Linear program model

in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance:

100.00%

Publisher:

Abstract:

The choice network revenue management model incorporates customer purchase behavior as a function of the offered products, and is the appropriate model for airline and hotel network revenue management, dynamic sales of bundles, and dynamic assortment optimization. The optimization problem is a stochastic dynamic program and is intractable. A certainty-equivalence relaxation of the dynamic program, called the choice deterministic linear program (CDLP), is usually used to generate dynamic controls. Recently, a compact linear programming formulation of this linear program was given for the multi-segment multinomial-logit (MNL) model of customer choice with non-overlapping consideration sets. Our objective is to obtain a tighter bound than this formulation while retaining the appealing properties of a compact linear programming representation. To this end, it is natural to consider the affine relaxation of the dynamic program. We first show that the affine relaxation is NP-complete even for a single-segment MNL model. Nevertheless, by analyzing the affine relaxation we derive a new compact linear program that approximates the dynamic programming value function better than CDLP, provably between the CDLP value and the affine relaxation, and often coming close to the latter in our numerical experiments. When the segment consideration sets overlap, we show that some strong equalities called product cuts developed for the CDLP remain valid for our new formulation. Finally, we perform extensive numerical comparisons on the various bounds to evaluate their performance.
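As a concrete sketch of the objects named in this abstract, the toy below assembles a CDLP for a tiny network under a single-segment MNL choice model; with only three products every offer set can be enumerated, so no compact reformulation is needed. All numbers (preference weights v, no-purchase weight v0, revenues r, leg incidence A, capacities, horizon T) are invented for illustration, and this is not the paper's formulation.

```python
# Toy CDLP: choose how long to offer each set S to maximize expected revenue,
# subject to expected capacity consumption and the horizon length.
from itertools import chain, combinations

import numpy as np
from scipy.optimize import linprog

v = np.array([1.0, 0.8, 0.5])      # MNL preference weights (assumed)
v0 = 1.0                           # no-purchase weight (assumed)
r = np.array([100.0, 80.0, 60.0])  # product revenues (assumed)
A = np.array([[1, 0, 1],           # leg-by-product incidence matrix (assumed)
              [0, 1, 1]])
cap = np.array([5.0, 5.0])         # leg capacities
T = 10.0                           # length of the booking horizon

n = len(v)
offer_sets = list(chain.from_iterable(combinations(range(n), k)
                                      for k in range(n + 1)))

def mnl_probs(S):
    """P(sell j | offer S) = v_j / (v0 + sum_{k in S} v_k)."""
    p = np.zeros(n)
    denom = v0 + v[list(S)].sum()
    for j in S:
        p[j] = v[j] / denom
    return p

P = np.array([mnl_probs(S) for S in offer_sets])  # |sets| x n sale rates
c_obj = -(P @ r)                                  # linprog minimizes
A_ub = np.vstack([A @ P.T, np.ones(len(offer_sets))])
b_ub = np.concatenate([cap, [T]])
res = linprog(c_obj, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
print("CDLP upper bound:", -res.fun)
```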

Relevance:

100.00%

Publisher:

Abstract:

The choice network revenue management (RM) model incorporates customer purchase behavior as customers purchasing products with certain probabilities that are a function of the offered assortment of products, and is the appropriate model for airline and hotel network revenue management, dynamic sales of bundles, and dynamic assortment optimization. The underlying stochastic dynamic program is intractable and even its certainty-equivalence approximation, in the form of a linear program called the Choice Deterministic Linear Program (CDLP), is difficult to solve in most cases. The separation problem for CDLP is NP-complete for MNL with just two segments when their consideration sets overlap; the affine approximation of the dynamic program is NP-complete for even a single-segment MNL. This is in contrast to the independent-class (perfect-segmentation) case, where even the piecewise-linear approximation has been shown to be tractable. In this paper we investigate the piecewise-linear approximation for network RM under a general discrete-choice model of demand. We show that the gap between the CDLP and the piecewise-linear bounds is within a factor of at most 2. We then show that the piecewise-linear approximation is solvable in polynomial time for a fixed consideration set size, bringing it into the realm of tractability for small consideration sets; small consideration sets are a reasonable modeling tradeoff in many practical applications. Our solution relies on showing that for any discrete-choice model the separation problem for the linear program of the piecewise-linear approximation can be solved exactly by a Lagrangian relaxation. We give modeling extensions and show by numerical experiments the improvements from using piecewise-linear approximation functions.
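For orientation, the display below sketches the relation between the bounds discussed here; the notation (value function V_t(x), affine and piecewise-linear forms) is ours and standard, not quoted from the paper.

```latex
% Separable approximations of the DP value function and the resulting bounds:
%   affine:            V_t(x) \approx \theta_t + \textstyle\sum_i v_{t,i}\,x_i
%   piecewise-linear:  V_t(x) \approx \theta_t + \textstyle\sum_i V_{t,i}(x_i)
\[
  V^{*} \;\le\; V^{\mathrm{PL}} \;\le\; V^{\mathrm{AF}} \;\le\; V^{\mathrm{CDLP}}
  \;\le\; 2\,V^{\mathrm{PL}},
\]
% the last inequality being the factor-of-2 gap established in the paper.
```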

Relevance:

100.00%

Publisher:

Abstract:

Purpose: Encouraging office workers to ‘sit less and move more’ encompasses two public health priorities. However, there is little evidence on the effectiveness of workplace interventions for reducing sitting, even less about the longer-term effects of such interventions, and still less on dual-focused interventions. This study assessed the short- and mid-term impacts of a workplace web-based intervention (Walk@WorkSpain, W@WS; 2010-11) on self-reported sitting time, step counts and physical risk factors (waist circumference, BMI, blood pressure) for chronic disease. Methods: Employees at six Spanish university campuses (n=264; 42±10 years; 171 female) were randomly assigned by worksite and campus to an Intervention (used W@WS; n=129; 87 female) or a Comparison group (maintained normal behavior; n=135; 84 female). This phased, 19-week program aimed to decrease occupational sitting time through increased incidental movement and short walks. A linear mixed model assessed changes in outcome measures between the baseline, ramping (8 weeks), maintenance (11 weeks) and follow-up (two months) phases for Intervention versus Comparison groups. Results: A significant 2 (group) × 2 (program phases) interaction was found for self-reported occupational sitting (F[3]=7.97, p=0.046), daily step counts (F[3]=15.68, p=0.0013) and waist circumference (F[3]=11.67, p=0.0086). The Intervention group decreased minutes of daily occupational sitting while also increasing step counts from baseline (446±126; 8,862±2,475) through ramping (425±120; 9,345±2,435), maintenance (422±123; 9,638±3,131) and follow-up (414±129; 9,786±3,205). In the Comparison group, compared to baseline (404±106), sitting time remained unchanged through ramping and maintenance, but decreased at follow-up (388±120), while step counts diminished across all phases. The Intervention group significantly reduced waist circumference by 2.1 cm from baseline to follow-up, while the Comparison group reduced waist circumference by 1.3 cm over the same period. Conclusions: W@WS is a feasible and effective evidence-based intervention that can be successfully deployed with sedentary employees to elicit sustained changes on “sitting less and moving more”.
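For readers who want to see the shape of the analysis, here is a minimal sketch of a linear mixed model with a group-by-phase interaction in the spirit of the one described; the synthetic long-format data and all variable names are our assumptions, not the study's data.

```python
# Linear mixed model with a group x phase interaction and a random intercept
# per participant (statsmodels; synthetic long-format data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n, phases = 40, ["baseline", "ramping", "maintenance", "followup"]
df = pd.DataFrame([
    {"pid": i,
     "group": "intervention" if i < n // 2 else "comparison",
     "phase": ph,
     # intervention participants' sitting drifts down across phases
     "sitting_min": 440 - (25 * k if i < n // 2 else 0) + rng.normal(0, 20)}
    for i in range(n) for k, ph in enumerate(phases)
])
fit = smf.mixedlm("sitting_min ~ group * phase", df, groups="pid").fit()
print(fit.summary())
```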

Relevance:

90.00%

Publisher:

Abstract:

This paper introduces the approach of using TURF analysis to design a product line through a binary linear programming model. This improves the efficiency of the search for a solution to the problem compared with the algorithms that have been used to date. Furthermore, the proposed technique enables the model to be improved in order to overcome the main drawbacks presented by TURF analysis in practice.

Relevance:

90.00%

Publisher:

Abstract:

Most research on single machine scheduling has assumed the linearity of job holding costs, which is arguably not appropriate in some applications. This motivates our study of a model for scheduling $n$ classes of stochastic jobs on a single machine, with the objective of minimizing the total expected holding cost (discounted or undiscounted). We allow general holding cost rates that are separable, nondecreasing and convex on the number of jobs in each class. We formulate the problem as a linear program over a certain greedoid polytope, and establish that it is solved optimally by a dynamic (priority) index rule, which extends the classical Smith's rule (1956) for the linear case. Unlike Smith's indices, defined for each class, our new indices are defined for each extended class, consisting of a class and a number of jobs in that class, and yield an optimal dynamic index rule: work at each time on a job whose current extended class has larger index. We further show that the indices possess a decomposition property, as they are computed separately for each class, and interpret them in economic terms as marginal expected cost rate reductions per unit of expected processing time. We establish the results by deploying a methodology recently introduced by us [J. Niño-Mora (1999), "Restless bandits, partial conservation laws, and indexability," forthcoming in Advances in Applied Probability, Vol. 33, No. 1, 2001], based on the satisfaction by performance measures of partial conservation laws (PCL), which extend the generalized conservation laws of Bertsimas and Niño-Mora (1996): PCL provide a polyhedral framework for establishing the optimality of index policies with special structure in scheduling problems under admissible objectives, which we apply to the model of concern.
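Since the paper presents its dynamic indices as an extension of Smith's rule, a two-line sketch of that linear-cost baseline may help: jobs are sequenced in nonincreasing order of the ratio of holding-cost rate to expected processing time (function and variable names here are ours).

```python
# Smith's rule (1956), the linear-cost special case the paper generalizes:
# serve jobs in nonincreasing order of cost rate per unit processing time.
def smith_order(jobs):
    """jobs: iterable of (cost_rate, mean_processing_time) pairs."""
    return sorted(jobs, key=lambda job: job[0] / job[1], reverse=True)

print(smith_order([(3.0, 2.0), (1.0, 1.0), (4.0, 5.0)]))
# [(3.0, 2.0), (1.0, 1.0), (4.0, 5.0)] -> ratios 1.5, 1.0, 0.8
```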

Relevance:

90.00%

Publisher:

Abstract:

This paper introduces the approach of using Total Unduplicated Reach and Frequency (TURF) analysis to design a product line through a binary linear programming model. This improves the efficiency of the search for a solution to the problem compared with the algorithms that have been used to date. The results obtained through our exact algorithm are presented, and the method proves to be extremely efficient both in obtaining optimal solutions and in computing time for very large instances of the problem at hand. Furthermore, the proposed technique enables the model to be improved in order to overcome the main drawbacks presented by TURF analysis in practice.
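To make the binary linear program concrete, here is a hedged sketch of one standard TURF formulation (not necessarily the exact model of the paper): choose k products so that as many respondents as possible are "reached" by at least one chosen product. The respondent-by-product matrix and k are invented.

```python
# TURF as a binary program: maximize reach subject to a product-line size k.
import numpy as np
from scipy.optimize import Bounds, LinearConstraint, milp

rng = np.random.default_rng(0)
A = rng.integers(0, 2, size=(50, 8))  # A[i, j] = 1 if respondent i likes product j
k = 3                                 # number of products in the line
n_resp, n_prod = A.shape

# Variables: x_j (product j chosen), y_i (respondent i reached); milp minimizes.
c = np.concatenate([np.zeros(n_prod), -np.ones(n_resp)])
# y_i <= sum_j A[i, j] x_j : a respondent counts only if a chosen product appeals.
reach = LinearConstraint(np.hstack([-A, np.eye(n_resp)]), -np.inf, 0)
# Exactly k products are selected.
size = LinearConstraint(np.concatenate([np.ones(n_prod), np.zeros(n_resp)]), k, k)

res = milp(c, constraints=[reach, size],
           integrality=np.ones(n_prod + n_resp), bounds=Bounds(0, 1))
print("maximum unduplicated reach:", int(round(-res.fun)))
```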

Relevance:

90.00%

Publisher:

Abstract:

A simple holographic model is presented and analyzed that describes chiral symmetry breaking and the physics of the meson sector in QCD. This is a bottom-up model that incorporates string theory ingredients like tachyon condensation, which is expected to be the main manifestation of chiral symmetry breaking in the holographic context. As a model for glue, the Kuperstein-Sonnenschein background is used. The structure of the flavor vacuum is analyzed in the quenched approximation. Chiral symmetry breaking is shown at zero temperature. Above the deconfinement transition, chiral symmetry is restored. A complete holographic renormalization is performed and the chiral condensate is calculated for different quark masses, both at zero and non-zero temperatures. The 0++, 0-+, 1++, 1-- meson trajectories are analyzed and their masses and decay constants are computed. The asymptotic trajectories are linear. The model has one phenomenological parameter beyond those of QCD that affects the 1++, 0-+ sectors. Fitting this parameter, we obtain very good agreement with data. The model improves in several ways the popular hard-wall and soft-wall bottom-up models.

Relevance:

90.00%

Publisher:

Abstract:

Background: The MLPA method is a potentially useful semi-quantitative method to detect copy number alterations in targeted regions. In this paper, we propose a method for the normalization procedure based on a non-linear mixed model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed model. This method establishes a threshold by using different tolerance intervals that accommodate the specific random error variability observed in each test sample. Results: Through simulation studies we have shown that our proposed method outperforms two existing methods that are based on simple threshold rules or iterative regression. We have illustrated the method using a controlled MLPA assay in which targeted regions are variable in copy number in individuals suffering from different disorders such as Prader-Willi, DiGeorge or Autism, showing the best performance. Conclusion: Using the proposed mixed model, we are able to determine thresholds to decide whether a region is altered. These thresholds are specific for each individual, incorporating experimental variability, resulting in improved sensitivity and specificity, as the examples with real data have revealed.
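The abstract's key device is a per-sample tolerance interval. As a rough illustration of that idea (not the authors' mixed-model machinery), the sketch below flags probes whose normalized ratio falls outside a two-sided normal tolerance interval built from the test sample's own variability, using Howe's standard approximation for the tolerance factor; function and variable names are ours.

```python
# Flag probes outside a two-sided normal tolerance interval centered on the
# sample mean; k is Howe's approximation of the tolerance factor.
import numpy as np
from scipy import stats

def altered_probes(ratios, coverage=0.99, confidence=0.95):
    """ratios: normalized probe ratios of one test sample (about 1 = normal)."""
    x = np.asarray(ratios, dtype=float)
    n = len(x)
    z = stats.norm.ppf((1 + coverage) / 2)
    chi2 = stats.chi2.ppf(1 - confidence, n - 1)
    k = np.sqrt((n - 1) * (1 + 1 / n) * z**2 / chi2)
    m, s = x.mean(), x.std(ddof=1)
    return (x < m - k * s) | (x > m + k * s)

ratios = np.concatenate([np.random.default_rng(0).normal(1.0, 0.05, 40), [1.6]])
print(np.where(altered_probes(ratios))[0])  # the spiked probe is flagged
```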

Relevance:

80.00%

Publisher:

Abstract:

A multiple-partners assignment game with heterogeneous sales and multiunit demands consists of a set of sellers that own a given number of indivisible units of (potentially many different) goods and a set of buyers who value those units and want to buy at most an exogenously fixed number of units. We define a competitive equilibrium for this generalized assignment game and prove its existence by using only linear programming. In particular, we show how to compute equilibrium price vectors from the solutions of the dual linear program associated with the primal linear program defined to find optimal assignments. Using only linear programming tools, we also show (i) that the set of competitive equilibria (pairs of price vectors and assignments) has a Cartesian product structure: each equilibrium price vector is part of a competitive equilibrium with all optimal assignments, and vice versa; (ii) that the set of (restricted) equilibrium price vectors has a natural lattice structure; and (iii) how this structure is translated into the set of agents' utilities that are attainable at equilibrium.
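The computational content of the duality argument above can be shown in a few lines: solve the optimal-assignment primal LP and read equilibrium prices off the duals of the sellers' supply constraints. This toy has one unit per seller and unit demands (the paper's setting is more general); the values are invented.

```python
# Optimal assignment LP; the duals (marginals) of the seller constraints are
# competitive equilibrium prices for the sellers' units.
import numpy as np
from scipy.optimize import linprog

V = np.array([[5.0, 8.0, 2.0],   # V[i, j]: buyer i's value for seller j's unit
              [7.0, 9.0, 6.0],
              [1.0, 3.0, 4.0]])
nb, ns = V.shape
c = -V.ravel()                   # maximize total value (linprog minimizes)
A_ub = np.zeros((nb + ns, nb * ns))
for i in range(nb):
    A_ub[i, i * ns:(i + 1) * ns] = 1   # buyer i takes at most one unit
for j in range(ns):
    A_ub[nb + j, j::ns] = 1            # seller j sells at most one unit

res = linprog(c, A_ub=A_ub, b_ub=np.ones(nb + ns),
              bounds=(0, None), method="highs")
prices = -res.ineqlin.marginals[nb:]   # duals of the seller constraints
print("total value:", -res.fun, "equilibrium prices:", prices)
```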

Relevance:

80.00%

Publisher:

Abstract:

In an earlier investigation (Burger et al., 2000) five sediment cores near the Rodrigues Triple Junction in the Indian Ocean were studied applying classical statistical methods (fuzzy c-means clustering, linear mixing model, principal component analysis) for the extraction of endmembers and evaluating the spatial and temporal variation of geochemical signals. Three main factors of sedimentation were expected by the marine geologists: a volcano-genetic, a hydro-hydrothermal and an ultra-basic factor. The display of fuzzy membership values and/or factor scores versus depth provided consistent results for two factors only; the ultra-basic component could not be identified. The reason for this may be that only traditional statistical methods were applied, i.e. the untransformed components were used and the cosine-theta coefficient as similarity measure. During the last decade considerable progress in compositional data analysis was made and many case studies were published using new tools for exploratory analysis of these data. Therefore it makes sense to check if the application of suitable data transformations, reduction of the D-part simplex to two or three factors and visual interpretation of the factor scores would lead to a revision of earlier results and to answers to open questions. In this paper we follow the lines of a paper of R. Tolosana-Delgado et al. (2005), starting with a problem-oriented interpretation of the biplot scattergram, extracting compositional factors, ilr-transformation of the components and visualization of the factor scores in a spatial context: the compositional factors will be plotted versus depth (time) of the core samples in order to facilitate the identification of the expected sources of the sedimentary process.
Keywords: compositional data analysis, biplot, deep sea sediments
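Because the analysis hinges on the ilr transformation of the compositional parts, a minimal self-contained implementation may be useful (our own construction, with a Helmert-style orthonormal basis of the clr hyperplane; any orthonormal basis gives valid ilr coordinates):

```python
# clr and ilr transforms for compositional data (rows are compositions).
import numpy as np

def clr(x):
    """Centered log-ratio: log parts minus their row-wise mean."""
    lx = np.log(x)
    return lx - lx.mean(axis=-1, keepdims=True)

def helmert_basis(d):
    """(d-1) x d orthonormal rows spanning the zero-sum (clr) hyperplane."""
    H = np.zeros((d - 1, d))
    for i in range(1, d):
        H[i - 1, :i] = 1.0 / i
        H[i - 1, i] = -1.0
        H[i - 1] /= np.linalg.norm(H[i - 1])
    return H

def ilr(x):
    """Isometric log-ratio coordinates of a D-part composition."""
    return clr(x) @ helmert_basis(np.shape(x)[-1]).T

print(ilr(np.array([[0.2, 0.3, 0.5]])))  # one 3-part composition -> 2 coords
```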

Relevance:

80.00%

Publisher:

Abstract:

Revenue management practices often include overbooking capacity to account for customers who make reservations but do not show up. In this paper, we consider the network revenue management problem with no-shows and overbooking, where the show-up probabilities are specific to each product. No-show rates differ significantly by product (for instance, each itinerary and fare combination for an airline) as sale restrictions and the demand characteristics vary by product. However, models that consider no-show rates for each individual product are difficult to handle, as the state space in dynamic programming formulations (or the variable space in approximations) increases significantly. In this paper, we propose a randomized linear program to jointly make the capacity control and overbooking decisions with product-specific no-shows. We establish that our formulation gives an upper bound on the optimal expected total profit, and our upper bound is tighter than a deterministic linear programming upper bound that appears in the existing literature. Furthermore, we show that our upper bound is asymptotically tight in a regime where the leg capacities and the expected demand are scaled linearly at the same rate. We also describe how the randomized linear program can be used to obtain a bid price control policy. Computational experiments indicate that our approach is quite fast, able to scale to industrial problems and can provide significant improvements over standard benchmarks.
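The randomized-LP idea can be caricatured in a few lines: instead of plugging expected demand into the deterministic LP, sample demand, solve the LP per sample, and average the optimal values. The sketch below does this for a one-leg toy without the paper's no-show and overbooking features, which is where its actual contribution lies; all data are invented.

```python
# Monte Carlo version of a randomized LP bound for network capacity control.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
r = np.array([100.0, 60.0])   # fares for two products (assumed)
A = np.array([[1.0, 1.0]])    # both products use the single leg
cap = np.array([80.0])        # physical leg capacity
lam = np.array([50.0, 70.0])  # mean demands

values = []
for _ in range(200):
    d = rng.poisson(lam)      # one demand realization
    res = linprog(-r, A_ub=A, b_ub=cap,
                  bounds=[(0, di) for di in d], method="highs")
    values.append(-res.fun)
print("randomized LP estimate:", np.mean(values))
```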

Relevance:

80.00%

Publisher:

Abstract:

Models incorporating more realistic models of customer behavior, as customers choosing from an offer set, have recently become popular in assortment optimization and revenue management. The dynamic program for these models is intractable and approximated by a deterministic linear program called the CDLP, which has an exponential number of columns. When there are products that are being considered for purchase by more than one customer segment, CDLP is difficult to solve, since column generation is known to be NP-hard. However, recent research indicates that a formulation based on segments with cuts imposing consistency (SDCP+) is tractable and approximates the CDLP value very closely. In this paper we investigate the structure of the consideration sets that make the two formulations exactly equal. We show that if the segment consideration sets follow a tree structure, CDLP = SDCP+. We give a counterexample to show that cycles can induce a gap between the CDLP and the SDCP+ relaxation. We derive two classes of valid inequalities, called flow and synchronization inequalities, to further improve SDCP+, based on cycles in the consideration set structure. We give a numerical study showing the performance of these cycle-based cuts.
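One plausible way to read the tree condition is through the overlap graph of the segments' consideration sets: segments are nodes, and an edge joins two segments whose sets intersect. The snippet below (our illustration, not the paper's definition) builds this graph and checks that it is acyclic:

```python
# Overlap graph of consideration sets; a forest means no overlap cycles.
import networkx as nx

consideration = {"seg1": {1, 2}, "seg2": {2, 3}, "seg3": {3, 4}}
G = nx.Graph()
G.add_nodes_from(consideration)
for a in consideration:
    for b in consideration:
        if a < b and consideration[a] & consideration[b]:
            G.add_edge(a, b)
print(nx.is_forest(G))  # True: seg1-seg2-seg3 form a chain, no cycle
```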

Relevance:

80.00%

Publisher:

Abstract:

This paper offers a practical illustration of three tools that allow the actuary to define rating groups and estimate risk premiums in the class rating process for non-life insurance. The first is segmentation analysis (CHAID and XAID), first used in 1997 by UNESPA on its common automobile portfolio. The second is a stepwise selection process based on the distance-based regression model. The third is a process based on the well-known generalized linear regression model, which represents the most modern technique in the actuarial literature. With the latter, by combining different link functions and error distributions, we can obtain the classical additive and multiplicative models.
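The closing remark, that different link functions and error distributions yield the classical additive and multiplicative models, is easy to demonstrate with a GLM library; the rating factors and data below are invented for the example.

```python
# Identity link + Gaussian error -> additive model;
# log link + Poisson error -> multiplicative rating model.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "claims": [2, 0, 1, 3, 1, 4],
    "age":    ["young", "old", "young", "young", "old", "young"],
    "zone":   ["urban", "rural", "rural", "urban", "urban", "urban"],
})
additive = smf.glm("claims ~ age + zone", df,
                   family=sm.families.Gaussian()).fit()
multiplicative = smf.glm("claims ~ age + zone", df,
                         family=sm.families.Poisson()).fit()
print(multiplicative.params)  # exponentiate to read multiplicative relativities
```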

Relevance:

80.00%

Publisher:

Abstract:

A common way to model multiclass classification problems is by means of Error-Correcting Output Codes (ECOCs). Given a multiclass problem, the ECOC technique designs a code word for each class, where each position of the code identifies the membership of the class for a given binary problem. A classification decision is obtained by assigning the label of the class with the closest code. One of the main requirements of the ECOC design is that the base classifier is capable of splitting each subgroup of classes from each binary problem. However, we cannot guarantee that a linear classifier can model convex regions. Furthermore, nonlinear classifiers also fail to manage some types of surfaces. In this paper, we present a novel strategy to model multiclass classification problems using subclass information in the ECOC framework. Complex problems are solved by splitting the original set of classes into subclasses and embedding the binary problems in a problem-dependent ECOC design. Experimental results show that the proposed splitting procedure yields a better performance when the class overlap or the distribution of the training objects conceal the decision boundaries for the base classifier. The results are even more significant when one has a sufficiently large training size.
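Decoding in the ECOC framework, as described above, amounts to a nearest-codeword rule. A minimal sketch follows; the code matrix here is a toy design of ours, not the paper's problem-dependent subclass design.

```python
# Assign the class whose code word is closest in Hamming distance to the
# vector of binary base-classifier outputs.
import numpy as np

codes = np.array([[ 1,  1, -1],   # class 0's code word
                  [ 1, -1,  1],   # class 1's code word
                  [-1,  1,  1]])  # class 2's code word

def decode(binary_preds):
    """binary_preds: +/-1 outputs of the binary base classifiers."""
    dists = (codes != np.asarray(binary_preds)).sum(axis=1)
    return int(np.argmin(dists))

print(decode([1, -1, 1]))  # exact match with class 1's code -> 1
```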