51 results for Mixed Binary Linear Programming


Relevance:

100.00%

Publisher:

Abstract:

We propose a new technique for unsupervised data classification (clustering) based on a density-induced metric and non-smooth optimization. Our goal is to automatically recognize multidimensional clusters of non-convex shape. We present a modification of the fuzzy c-means algorithm that uses the data-induced metric, defined with the help of a Delaunay triangulation. We detail the computation of distances in this metric using graph algorithms. To find optimal positions of the cluster prototypes we employ the discrete gradient method of non-smooth optimization. The new clustering method is capable of identifying non-convex, overlapping d-dimensional clusters.
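As an illustration of the graph-based distance computation described above, here is a minimal sketch (not the authors' implementation): build the Delaunay triangulation, weight its edges with a user-supplied (e.g. density-dependent) length, and take shortest-path distances. The edge_weight callable and the Euclidean example weight are placeholder assumptions.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import dijkstra

def delaunay_graph_distances(points, edge_weight):
    """All-pairs distances in the graph metric induced by the Delaunay triangulation.

    points      : (n, d) array of data points
    edge_weight : callable (p, q) -> float giving the (e.g. density-dependent)
                  length of the Delaunay edge between p and q
    """
    tri = Delaunay(points)
    n = len(points)
    w = lil_matrix((n, n))
    # Every pair of vertices that shares a simplex is a Delaunay edge.
    for simplex in tri.simplices:
        for i in simplex:
            for j in simplex:
                if i < j:
                    w[i, j] = w[j, i] = edge_weight(points[i], points[j])
    # Shortest paths in the weighted Delaunay graph approximate the induced metric.
    return dijkstra(w.tocsr(), directed=False)

# Placeholder weight: plain Euclidean length (a density-induced weight would go here).
pts = np.random.rand(50, 2)
D = delaunay_graph_distances(pts, lambda p, q: float(np.linalg.norm(p - q)))
```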


Relevance:

100.00%

Publisher:

Abstract:

This paper estimates productivity growth in Malaysian manufacturing over the period 1983–1999. Malmquist Productivity Indices (MPIs) have been computed using non-parametric Data Envelopment Analysis (DEA)-type linear programming, which shows productivity growth sourced from efficiency change and growth in technology. Unlike previous studies, this study identifies the sources of productivity growth in Malaysian manufacturing industries at the five-digit level of the Malaysian Standard Industrial Classification (MSIC), thereby revealing more industry-specific efficiency and technical growth patterns. Results indicated that a large majority of the industries operated with low levels of technical efficiency and little or no improvement over time. Growth estimates revealed that two-thirds of the industries (76 out of 114 categories) experienced average annual productivity improvements ranging from 0.1% to 7.8%. Average annual technical progress was recorded by 95 industry categories, while technical efficiency improvement was achieved by 53 industries. Overall yearly averages indicated relatively low productivity growth from the mid-1990s onwards, caused by either efficiency decline or technical regress. Summary results for individual industries showed that some of the highest rates of productivity growth were recorded in glass and glass products (7.3%), petroleum and coal (7.2%), and industrial chemicals (4.9%), with contributions from both efficiency improvement and technical progress ranging from 0.8% to 5.4% and from 1.7% to 4.1%, respectively. These results are expected to have implications for ongoing and future strategic policy reform in Malaysian manufacturing, generating more sustainable growth for specific industry categories.
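For background (the standard definition, not a result of the paper), the output-oriented Malmquist productivity index between periods t and t+1, with distance functions obtained from DEA linear programs, decomposes into the efficiency-change and technical-change components mentioned above as

\[
M_o \;=\; \underbrace{\frac{D_o^{t+1}(x^{t+1},y^{t+1})}{D_o^{t}(x^{t},y^{t})}}_{\text{efficiency change}}
\times
\underbrace{\left[\frac{D_o^{t}(x^{t+1},y^{t+1})}{D_o^{t+1}(x^{t+1},y^{t+1})}\cdot
\frac{D_o^{t}(x^{t},y^{t})}{D_o^{t+1}(x^{t},y^{t})}\right]^{1/2}}_{\text{technical change}},
\]

with values greater than one indicating productivity growth.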

Relevance:

100.00%

Publisher:

Abstract:

This paper describes a new approach to multivariate scattered data smoothing. It is assumed that the data are generated by a Lipschitz continuous function f and include random noise to be filtered out. The proposed approach uses a known, or estimated, value of the Lipschitz constant of f and forces the data to be consistent with the Lipschitz properties of f. Depending on the assumptions about the distribution of the random noise, smoothing reduces to a standard quadratic or linear programming problem. We discuss an efficient algorithm that eliminates redundant inequality constraints. Numerical experiments illustrate the applicability and efficiency of the method. This approach provides an efficient new tool for multivariate scattered data approximation.
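A plausible form of the reduction (a sketch under the stated assumptions, not necessarily the paper's exact formulation): given noisy values y_i observed at points x_i and a Lipschitz constant L, the smoothed values z_i solve

\[
\min_{z\in\mathbb{R}^n}\ \sum_{i=1}^{n} |y_i - z_i|
\quad\text{subject to}\quad
|z_i - z_j| \le L\,\|x_i - x_j\|,\qquad 1 \le i < j \le n,
\]

which becomes a linear program once the absolute values are split into pairs of linear constraints; replacing the \(\ell_1\) objective with \(\sum_i (y_i - z_i)^2\) gives the quadratic programming variant suited to Gaussian noise.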

Relevance:

100.00%

Publisher:

Abstract:

Data Envelopment Analysis (DEA), a linear programming technique, provides a more consistent measure of efficiency than the commonly cited partial measures of farm efficiency. It yields a relative measure of efficiency and identifies inputs or outputs that are underutilized.

In this paper, DEA is used to assess the technical efficiency of a sample of dairy farms across all dairy regions in Australia. The regions vary in size and scale of operation; they are examined to determine the relationship between farm size and technical efficiency, and to see whether, given their factor mix, there is justification for moving towards larger dairy production units than currently exist.
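A minimal sketch of the input-oriented, constant-returns-to-scale DEA linear program (the CCR envelopment form) solved with scipy.optimize.linprog. The function name and the toy farm data are illustrative assumptions, not the paper's model or data.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input_efficiency(X, Y):
    """Input-oriented CCR efficiency score for every DMU.

    X : (n_dmu, n_inputs)  input quantities
    Y : (n_dmu, n_outputs) output quantities
    Returns an array of efficiency scores theta in (0, 1].
    """
    n, m = X.shape
    _, s = Y.shape
    scores = np.empty(n)
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]                      # minimise theta
        # Inputs:  sum_j lambda_j * x_jk <= theta * x_ok
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # Outputs: sum_j lambda_j * y_jm >= y_om
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (n + 1))
        scores[o] = res.x[0]
    return scores

# Toy example: 5 farms, 2 inputs (cows, labour), 1 output (milk) -- illustrative only.
X = np.array([[100, 2.0], [150, 2.5], [120, 3.0], [200, 4.0], [90, 1.5]])
Y = np.array([[500], [800], [550], [900], [480]])
print(dea_ccr_input_efficiency(X, Y))
```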

Relevance:

100.00%

Publisher:

Abstract:

We examine the numerical performance of various methods for calculating the Conditional Value-at-Risk (CVaR) and for portfolio optimization with respect to this risk measure. We concentrate on the method proposed by Rockafellar and Uryasev (Rockafellar, R.T. and Uryasev, S., 2000, Optimization of conditional value-at-risk. Journal of Risk, 2, 21-41), which converts this problem to one of convex optimization. We compare the use of linear programming techniques against the discrete gradient method of non-smooth optimization, and establish the superiority of the latter. We show that non-smooth optimization can be used efficiently for large portfolio optimization, and we also examine parallel execution of this method on computer clusters.
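The Rockafellar–Uryasev reformulation referred to above replaces CVaR minimisation over N return scenarios r_k at confidence level β by the linear program

\[
\min_{w,\,\alpha,\,u}\ \alpha + \frac{1}{(1-\beta)N}\sum_{k=1}^{N} u_k
\quad\text{s.t.}\quad
u_k \ge -r_k^{\top} w - \alpha,\quad u_k \ge 0,\quad k=1,\dots,N,\qquad
\sum_i w_i = 1,\quad w \ge 0,
\]

where w are the portfolio weights and α equals the Value-at-Risk at the optimum. (The long-only and budget constraints shown here are the usual choices, stated as an assumption rather than taken from the paper.)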

Relevance:

100.00%

Publisher:

Abstract:

This paper estimates productivity growth in Malaysian manufacturing over the period 1983–1999. Malmquist Productivity Indices (MPIs) have been computed using non-parametric Data Envelopment Analysis (DEA)-type linear programming, which shows productivity growth sourced from efficiency change and growth in technology. Unlike previous studies, this study identifies the sources of productivity growth in Malaysian manufacturing industries at the five-digit level of the Malaysian Standard Industrial Classification (MSIC), thereby revealing more industry-specific efficiency and technical growth patterns. Results indicate that two-thirds of the industries (76 out of 114 categories) experienced average annual productivity improvements ranging from 0.1% to 7.8% over the sampled period. Average annual technical progress was recorded by 95 industry categories, while technical efficiency improvement was achieved by 53 industries. Overall yearly averages indicate relatively low productivity growth from the mid-1990s onwards, caused by either efficiency decline or technical regress. Summary results for individual industries reveal that some of the highest rates of productivity growth were recorded in glass and glass products (7.3%), petroleum and coal (7.2%), and industrial chemicals (4.9%), with contributions from both efficiency improvement and technical progress ranging from 0.8% to 5.4% and from 1.7% to 4.1%, respectively. These results are expected to have implications for ongoing and future strategic policy reform in Malaysian manufacturing, generating more sustainable growth for specific industry categories.

Relevance:

100.00%

Publisher:

Abstract:

Efficiency measurement is at the heart of most management accounting functions. Data envelopment analysis (DEA) is a linear programming technique used to measure the relative efficiency of organisational units, referred to in the DEA literature as decision making units (DMUs). Universities are complex organisations involving multiple inputs and outputs (Abbott & Doucouliagos, 2008). There is no agreement on how to identify and measure the inputs and outputs of higher education institutes (Avkiran, 2001). Hence, accurate efficiency measurement in such complex institutions requires rigorous research.

Prior DEA studies have applied the technique at the university (Avkiran, 2001; Abbott & Doucouliagos, 2003; Abbott & Doucouliagos, 2008) or department/school (Beasley, 1990; Sinuany-Stern, Mehrez & Barboy, 1994) level. The organisational unit that has control over, and hence responsibility for, inputs and outputs is the most appropriate decision making unit (DMU) for DEA to provide useful managerial information. In the current study, DEA has been applied at the faculty level for two reasons. First, in the case university, as in most other universities, inputs and outputs are more accurately identified with faculties than with departments/schools. Second, efficiency results at the university level are highly aggregated and do not provide detailed managerial information.

Prior DEA time series studies have used input and output cost and income data without adjusting for changes in the time value of money. This study examines the effects of adjusting financial data for changes in dollar values that occur without proportional changes in the quantity of the inputs and outputs. The study is carried out mainly from a management accounting perspective and focuses on the use of DEA efficiency information for managerial decision purposes. It is not intended to contribute to the theoretical development of the linear programming model; it takes the view that one does not need to be a mechanic to be a good car driver.
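A minimal illustration of the kind of adjustment the study examines: deflating nominal dollar inputs and outputs to a constant base year with a price index before running the DEA time series analysis. The index values and dollar figures below are made up for illustration.

```python
import numpy as np

def to_constant_dollars(nominal, deflator, base_year_index):
    """Convert nominal dollar series to constant (base-year) dollars.

    nominal  : (n_years, n_items) nominal input/output values
    deflator : (n_years,) price index (e.g. CPI), any base
    """
    scale = deflator[base_year_index] / deflator
    return nominal * scale[:, None]

# Illustrative numbers only: three years of two cost items ($m) and a price index.
nominal = np.array([[10.0, 4.0], [11.0, 4.4], [12.5, 5.0]])
cpi = np.array([92.0, 96.0, 100.0])
print(to_constant_dollars(nominal, cpi, base_year_index=2))
```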

The results suggest that adjusting financial input and output data in time series analysis changes efficiency values, rankings, reference sets and projection amounts. The findings also suggest that the case university could have saved close to $10 million per year if all faculties had operated efficiently. However, it is also recognised that quantitative performance measures have their own limitations and should be used cautiously.

Relevance:

100.00%

Publisher:

Abstract:

This paper examines children’s multiplatform commissioning at the Australian Broadcasting Corporation (ABC) in the context of the digitalisation of Australian television. A pursuit of audience share and reach to legitimise its recurrent funding engenders a strategy that prioritises the entertainment values of the ABC’s children’s offerings. Nevertheless, these multiplatform texts (comprising complementary ‘on-air’ and ‘online’ textualities) evidence a continuing commitment to a youth-focussed, public service remit, and reflect the ABC’s Charter obligations to foster innovation, creativity, participation, citizenship, and the values of social inclusiveness. The analysis focuses on two recent ‘marquee’ drama projects, Dance Academy (a contemporary teen series) and My Place (a historical series for a middle childhood audience). The research draws on a series of research interviews, analysis of policy documents and textual analysis of the television and multiplatform content. The authors argue that a mixed diet of programming, together with an educative or social developmental agenda, features in the design of both program and online participation for the public broadcaster.

Relevance:

100.00%

Publisher:

Abstract:

Hybrid electric vehicles are powered by an electric system and an internal combustion engine. The components of a hybrid electric vehicle need to be coordinated in an optimal manner to deliver the desired performance. This paper presents an approach based on a direct method for optimal power management in hybrid electric vehicles with inequality constraints. The approach reduces the optimal control problem to a set of algebraic equations by approximating the state variable, the energy of the electric storage, and the control variable, the power of fuel consumption. This approximation uses orthogonal functions with unknown coefficients. In addition, the inequality constraints are converted to equality constraints. The advantage of the developed method is that its computational complexity is lower than that of dynamic and non-linear programming approaches. Moreover, to use dynamic or non-linear programming the problem must be discretized, resulting in a loss of optimization accuracy. The proposed method, on the other hand, does not require discretization of the problem and therefore produces more accurate results. An example is solved to demonstrate the accuracy of the proposed approach. The results obtained with Haar wavelets, and with Chebyshev and Legendre polynomials, are presented and discussed. © 2011 The Korean Society of Automotive Engineers and Springer-Verlag Berlin Heidelberg.
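In outline (a generic sketch of the direct method, with symbols chosen here rather than taken from the paper), the stored energy x(t) and the fuel power u(t) are expanded in a truncated orthogonal basis \(\{\phi_i\}\) such as Haar wavelets or Chebyshev or Legendre polynomials:

\[
x(t) \approx \sum_{i=0}^{N} a_i\,\phi_i(t), \qquad
u(t) \approx \sum_{i=0}^{N} b_i\,\phi_i(t).
\]

Substituting these expansions into the vehicle dynamics and the cost functional, and enforcing them at collocation points, turns the optimal control problem into a set of algebraic equations in the unknown coefficients a_i and b_i; an inequality constraint \(g(x,u) \le 0\) can be converted to an equality by adding a slack variable, \(g(x,u) + s^2 = 0\).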

Relevance:

100.00%

Publisher:

Abstract:

The practice of relying solely on the human resources department to select external training providers has cast doubt and mistrust across other departments as to how trainers are sourced. Human resource personnel use no measurable criteria, since most decisions are based on intuitive experience and subjective market knowledge. The present problem concerns the outsourcing of partly government-funded private training programs, which has been facing accountability challenges. Because no scientific decision-making approach is available in this context, a 12-step algorithm is proposed and tested in a Japanese multinational company. The model allows the decision makers to revise their criteria expectations and, in turn, observe the change in the training providers' quota distribution. Finally, this multi-objective sensitivity analysis provides a forward-looking approach to training needs planning and aids decision makers in their sourcing strategy.

Relevance:

100.00%

Publisher:

Abstract:

Multicast is an important mechanism in modern wireless networks and has attracted significant effort to improve its performance with respect to different metrics, including throughput, delay and energy efficiency. Traditionally, an ideal loss-free channel model is widely used to facilitate routing protocol design. However, the quality of wireless links can be affected or even jeopardized by many factors, such as collisions, fading or environmental noise. In this paper, we propose a reliable multicast protocol, called CodePipe, with advanced performance in terms of energy efficiency, throughput and fairness in lossy wireless networks. Built upon opportunistic routing and random linear network coding, CodePipe not only simplifies transmission coordination between nodes, but also improves multicast throughput significantly by exploiting both intra-batch and inter-batch coding opportunities. In particular, four key techniques, namely an LP-based opportunistic routing structure, opportunistic feeding, fast batch moving and inter-batch coding, are proposed to offer substantial improvements in throughput, energy efficiency and fairness. We evaluate CodePipe in the ns-2 simulator by comparing it with two other state-of-the-art multicast protocols, MORE and Pacifier. Simulation results show that CodePipe significantly outperforms both of them.
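For background, a minimal sketch of the intra-batch random linear network coding that such protocols build on, over GF(2) for simplicity (practical systems typically work over GF(2^8)); this is illustrative only and not CodePipe's implementation.

```python
import numpy as np

def encode_batch(packets, n_coded, rng=None):
    """Produce coded packets as random linear combinations of a batch over GF(2).

    packets : (k, L) array of k source packets, each L bits (0/1 values)
    Returns (coding_vectors, coded_packets).
    """
    rng = np.random.default_rng(rng)
    k = packets.shape[0]
    coeffs = rng.integers(0, 2, size=(n_coded, k))   # random coding vectors
    coded = coeffs @ packets % 2                     # mix packets bitwise mod 2
    return coeffs, coded

def decode_batch(coeffs, coded):
    """Recover the source packets by Gaussian elimination over GF(2)."""
    k = coeffs.shape[1]
    aug = np.hstack([coeffs, coded]) % 2
    row = 0
    for col in range(k):
        pivot = next((r for r in range(row, len(aug)) if aug[r, col]), None)
        if pivot is None:
            raise ValueError("coding vectors are rank deficient; collect more packets")
        aug[[row, pivot]] = aug[[pivot, row]]        # move pivot row into place
        for r in range(len(aug)):
            if r != row and aug[r, col]:
                aug[r] = (aug[r] + aug[row]) % 2     # eliminate the column elsewhere
        row += 1
    return aug[:k, k:]

# Toy round trip: with 12 random combinations of 4 packets, the coding vectors
# span GF(2)^4 with overwhelming probability, so decoding recovers the batch.
src = np.random.randint(0, 2, size=(4, 32))
c, x = encode_batch(src, n_coded=12)
print(np.array_equal(decode_batch(c, x), src))
```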

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a projection pursuit (PP) based method for the blind separation of nonnegative sources. First, the available observation matrix is mapped to construct a new mixing model in which the inaccessible source matrix is normalized to be column-sum-to-one. Then, the PP method is proposed to solve this new model: the mixing matrix is estimated column by column by tracing the projections of the mapped observations in specified directions, which leads to the recovery of the sources. The proposed method is much faster than Chan's method, which makes similar assumptions to ours, owing to its use of optimal projections. It is also more effective at separating cross-correlated sources than independence- and uncorrelation-based methods, as it does not rely on any statistical information about the sources. Furthermore, the new method does not require the mixing matrix to be nonnegative. Simulation results demonstrate the superior performance of our method.

Relevance:

100.00%

Publisher:

Abstract:

Blind source separation (BSS) has been widely discussed in many real applications. Recently, under the assumption that both the sources and the mixing matrix are nonnegative, Wang developed a BSS method using volume maximization. However, that algorithm can guarantee only the nonnegativity of the sources; it does not necessarily yield a nonnegative mixing matrix. In this letter, by introducing additional constraints, a method for fully nonnegative constrained iterative volume maximization (FNCIVM) is proposed. The result is more interpretable, while the algorithm amounts to solving a single linear programming problem. Numerical experiments with synthetic signals and real-world images show the effectiveness of the proposed method.

Relevance:

100.00%

Publisher:

Abstract:

The ever-growing cellular traffic demand has placed a heavy burden on cellular networks. The recent rapid development of vehicle-to-vehicle communication techniques makes the vehicular delay-tolerant network (VDTN) an attractive candidate for offloading traffic from cellular networks. In this paper, we study a bulk traffic offloading problem with the goal of minimizing the cellular communication cost under the constraint that every subscriber receives the entire desired content before it expires. This requires determining the initial offloading points and the dissemination scheme for the offloaded traffic in a VDTN. By describing the content delivery process with a novel contact-based flow model, we formulate the problem in linear programming (LP) form, based on which an online offloading scheme is proposed to deal with network dynamics (e.g., vehicle arrivals and departures). Furthermore, an offline LP-based analysis is derived to obtain the optimal solution. The high efficiency of our online algorithm is extensively validated by simulation results.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we develop a data-driven weight learning method for weighted quasi-arithmetic means, where the observed data may vary in dimension.
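For context, the standard definition (not specific to this paper): a weighted quasi-arithmetic mean with a continuous, strictly monotone generator g and weight vector w is

\[
M_{g,w}(x_1,\dots,x_n) = g^{-1}\!\left(\sum_{i=1}^{n} w_i\, g(x_i)\right),
\qquad w_i \ge 0,\quad \sum_{i=1}^{n} w_i = 1,
\]

so that \(g(x)=x\) recovers the weighted arithmetic mean and \(g(x)=\log x\) the weighted geometric mean; the learning task is then to fit the weights to observed input–output data whose dimension n may vary between observations.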